AI Writing Aids

Test Case AI

Learn about the Test Case AI for Requirement Entities, powered by LLM technology

Test Case AI, introduced in v4.11, is a powerful automated tool designed to support systems engineers in the rapid generation, validation, and management of test cases. By leveraging AI to transform requirements into structured test cases, this feature significantly reduces the time and effort required to create consistent, accurate, and traceable test cases.

Test engineers often face challenges in bridging the gap between abstract requirements and actionable test cases. Test Case AI simplifies this process by integrating seamlessly into the Test Suite Document, Documents View, and Compilation View (Requirement class only), automatically generating test case content aligned with the defined requirements.

The Test Case AI is particularly valuable in its ability to:

  • Accelerate test case creation while maintaining consistency with Innoslate attributes.
  • Create automatic traceability by establishing direct relationships between requirements and test cases.
  • Allow engineers to focus on refining and validating test content rather than creating it from scratch.

In the following sections, we’ll explore the functionality of Test Case AI, focusing on how it transforms requirements into actionable test cases, validated and embedded directly within the Test Suite Document.

Accessing the “Generate Test Case” Feature

To begin using Test Case AI, navigate to the Test Suite Document. Locate the “Generate Test Case” button under the More dropdown menu, as shown in Figure 1. This is the starting point for initiating the automated test case generation process.


Figure 1: Generate Test Case Button

Target Requirement Entity and Provide Context


In the Generate phase:

  1. When the user selects the target entity in the drop-down, the modal retrieves the chosen Requirement Entity along with its Parent Requirement Entity (if applicable). This step ensures that the test case is grounded in the appropriate contextual requirements.
  2. The context of the selected requirement is displayed for reference. Figure 2 shows details about the parent system (e.g., Lunar Rover) and specific child requirements (e.g., Lunar Environmental Conditions). Users may optionally supply additional context or prompt engineering to tailor the AI response.
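Conceptually, the modal combines the parent and target requirement text with any user-supplied context before the generation step. The sketch below is purely illustrative of that idea; the function name and prompt wording are assumptions, not Innoslate's actual implementation:

```python
def build_prompt(parent_req: str, target_req: str, user_context: str = "") -> str:
    """Assemble requirement context into a single generation prompt (illustrative only)."""
    parts = [
        f"Parent requirement: {parent_req}",
        f"Target requirement: {target_req}",
    ]
    if user_context:
        # Extra context supplied by the engineer is appended to steer the response.
        parts.append(f"Additional context: {user_context}")
    parts.append("Generate a structured test case for the target requirement.")
    return "\n".join(parts)

prompt = build_prompt(
    "Lunar Rover",
    "Lunar Environmental Conditions",
    "Emphasize thermal cycling during the lunar night.",
)
```

Supplying more specific context in this way generally narrows the AI's output toward the intended verification scope.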


Figure 2: Generate Modal Phase

This phase is critical for aligning the generated test case with the system’s requirements, ensuring traceability.

Review and Validate Generated Test Case

Once the AI generates the test case, the process moves into the Verify phase. The modal outputs a complete draft of the test case, with attributes such as Number, Name, Test Description, Expected Result, and Verification Method (Analysis, Inspection, Modeling & Simulation, Test) filled in based on the provided context and the AI model's training.
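The draft's attributes can be pictured as a simple record. The following sketch mirrors the attribute names listed above for illustration only; it is not Innoslate's internal data model:

```python
from dataclasses import dataclass

# The four verification methods named in the article.
VERIFICATION_METHODS = {"Analysis", "Inspection", "Modeling & Simulation", "Test"}

@dataclass
class TestCaseDraft:
    """Hypothetical shape of a generated test case draft (illustrative only)."""
    number: str
    name: str
    test_description: str
    expected_result: str
    verification_method: str

    def __post_init__(self):
        # Reject anything outside the recognized verification methods.
        if self.verification_method not in VERIFICATION_METHODS:
            raise ValueError(f"Unknown verification method: {self.verification_method}")

draft = TestCaseDraft(
    number="TC-001",
    name="Lunar Environmental Conditions Test",
    test_description="Verify the rover operates within the specified lunar temperature range.",
    expected_result="All subsystems remain operational across the full temperature range.",
    verification_method="Test",
)
```

Thinking of the draft this way clarifies what the Verify phase asks the engineer to check: each field is a claim the AI has filled in and a human must confirm.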

Engineers are expected to review and validate the generated test case. This human validation step ensures that the test case aligns with project requirements and test plans. A notification reminds users that the generated output may require adjustments for accuracy and completeness.


Figure 3: Verify Modal Phase

Engineers can edit any attributes at this stage, including refining the test steps, expected results, or verification methods.

Embedding the Test Case in the Test Suite

Once validated, the generated test case is embedded directly into the Test Suite Document. This provides an immediate and centralized view of the new test case within the context of other test entities.


Figure 4: Generated Test Case Embedded into the Document

Test Case Entity View

The finalized test case entity can be accessed in its Entity View. Notable details include:

  1. All generated attributes are visible for further edits or updates.
  2. The “Verifies/Verified By” relationship is automatically established, linking the test case back to the requirement targeted during the Generate phase. This ensures a clear, automatic trace between requirements and their corresponding tests.
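The automatic “Verifies/Verified By” link can be thought of as a pair of inverse relations maintained together. This is a hypothetical sketch of that idea; the identifiers are illustrative, not Innoslate entities or API calls:

```python
# Two inverse relation maps, kept in sync by a single linking step (illustrative only).
trace = {"verifies": {}, "verified_by": {}}

def link(test_case: str, requirement: str) -> None:
    """Record that test_case verifies requirement, and the inverse relation."""
    trace["verifies"].setdefault(test_case, set()).add(requirement)
    trace["verified_by"].setdefault(requirement, set()).add(test_case)

link("TC-001", "REQ-3.2 Lunar Environmental Conditions")
```

Maintaining both directions at once is what makes the trace navigable from either side: from a requirement to its tests, or from a test back to what it verifies.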


Figure 5: Generated Test Case Entity View

Note: The Test Case AI feature is also accessible through the Documents View, providing an alternative way to generate test cases directly from requirement documentation. In this workflow, requirements can be selected and referenced directly within the document, and a test case will be automatically generated.

The key difference in this functional flow is that the generated test case will not be immediately embedded in the Test Suite Document. Instead, it will be stored directly in the project database, where it can be accessed, reviewed, and linked to the appropriate suite later. This approach offers flexibility for users who may prefer to manage and organize test cases outside of the Test Suite Document during initial creation.

Tutorial Video

 

To continue learning about the AI Writing Aids, click here.

(Next Article: Risk AI)