SpiraTest's artificial intelligence functionality empowers you to automate the creation of essential project artifacts from your requirements, whether those are user stories, features, epics, or business/system requirements. It allows you to quickly generate a set of standard test cases and BDD Gherkin scenarios that can then be refined and improved as needed. We offer different versions that support different LLMs, including models from OpenAI, Anthropic, and Meta (Llama), running on platforms such as AWS Bedrock and Azure.
The AI SpiraApps run inside SpiraTest and give you a choice of LLMs and AI platforms. We currently offer OpenAI GPT models, running either directly on OpenAI or via Azure, as well as a comprehensive set of Anthropic and Llama models running on AWS Bedrock.
The Generative AI functionality in SpiraTest lets you generate test cases, development tasks, BDD Gherkin scenarios, and risks from a requirement, and also reverse-engineer requirements from existing test cases.
For example, imagine that you have just created a new requirement or user story that consists of a single line, such as “As a user I want to book a train between two European cities” or “As a user I want to book a flight between two cities”.
Normally, you would now need to manually write the various test cases that cover this requirement, including positive tests (can make a booking successfully) and negative tests (the booking fails for various reasons). In addition, you would need to decompose this requirement into a set of lower-level development tasks for the developers to create the user interface, database, and other items that need to be in place for a working booking feature.
If you are using a BDD approach, you might also want to create a set of BDD Gherkin scenarios that describe each use case for the booking feature more specifically. Finally, you would want to identify and document all the potential risks associated with this new feature.
Using the power of Generative AI, you can radically streamline this process using the new AI dropdown menus available inside SpiraTest:
When you click on the option to generate test cases, the plugin will use the configured LLM and create a set of test cases for the requirement in question. For our sample requirement, you can see that it has generated seven test cases, one for the positive case and six additional negative cases:
Each of the test cases consists of multiple detailed test steps that have a description of the action, the expected result if successful, and any sample data if relevant:
Note that the sample data will most likely be notional, since the LLM does not know the valid and invalid data values for your application. Still, the overall structure is correct and will save a lot of the manual effort of writing and documenting the test cases.
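To make the shape of a generated test case concrete, the sketch below models the structure described above (steps with a description, an expected result, and optional sample data) as plain Python. The field and class names are illustrative only and do not reflect SpiraTest's actual data model or API.

```python
# Illustrative sketch of the structure of an AI-generated test case.
# All names below are hypothetical, not SpiraTest's actual schema.
from dataclasses import dataclass, field

@dataclass
class TestStep:
    description: str       # the action the tester performs
    expected_result: str   # what should happen if the step succeeds
    sample_data: str = ""  # notional data; replace with real values

@dataclass
class TestCase:
    name: str
    steps: list = field(default_factory=list)

# A positive test case for the sample booking requirement
booking_test = TestCase(
    name="Book a flight between two cities (happy path)",
    steps=[
        TestStep(
            description="Enter the origin and destination cities",
            expected_result="Both cities are accepted and flights are listed",
            sample_data="London -> Paris",
        ),
        TestStep(
            description="Select a flight and confirm the booking",
            expected_result="A booking confirmation is displayed",
        ),
    ],
)
print(len(booking_test.steps))
```

In practice you would review each generated step and swap the notional sample data for values that are actually valid or invalid in your application.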
In some cases, you may have manually created test cases that are missing their corresponding requirements. For example, the team imported some test cases from a spreadsheet or Microsoft Word document and now needs to reverse-engineer the requirements. The good news is that the AI assistant can help with this process as well:
When you click on the option to generate requirements, the system will create multiple requirements for the test case in question:
When using the Behavior Driven Development (BDD) methodology, it is useful to decompose the requirement or user story into different BDD scenarios, both positive and negative. Using the AI Assistant, you can generate a set of BDD scenarios for this requirement, written in the Gherkin syntax. The AI Assistant will automatically convert them into an easy-to-read format with bold for the Gherkin keywords and bullets separating out the components:
In this example, it has created four scenarios, one positive and three negative. Each is written in the Gherkin Given... When... Then format and is ready to use.
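For reference, a positive scenario in plain Gherkin syntax for the sample booking user story might look like the following. The wording here is illustrative, not SpiraTest's actual output:

```gherkin
Feature: Flight booking
  Scenario: Successfully book a flight between two cities
    Given I am a registered user
    And flights are available between the origin and destination cities
    When I select an origin city, a destination city, and a travel date
    And I confirm the booking
    Then the system displays a booking confirmation
```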
SpiraTest includes a dedicated AWS Bedrock plugin that lets you connect to your own instance of AWS Bedrock running in the region of your choice.
All you need to do is specify the URL, region, and connection information, and SpiraTest will connect to your LLM instances. We also give you the flexibility to choose your foundation model, such as Claude or Llama.
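Under the hood, a connection like this corresponds to calling the Bedrock runtime API. As a rough sketch (this is not SpiraTest's plugin code, and the model ID and prompt below are placeholders), a request body for an Anthropic model on Bedrock can be assembled like this:

```python
import json

def build_bedrock_request(model_id: str, prompt: str, max_tokens: int = 1024):
    """Assemble the (modelId, body) pair for a Bedrock InvokeModel call.

    Sketch only: the Anthropic-on-Bedrock message format shown here is
    the publicly documented one, but the model ID and prompt are
    placeholders for illustration.
    """
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })
    return model_id, body

model_id, body = build_bedrock_request(
    "anthropic.claude-3-sonnet-20240229-v1:0",
    "Generate test cases for: As a user I want to book a flight "
    "between two cities",
)

# With AWS credentials configured, you would send this with boto3, e.g.:
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.invoke_model(modelId=model_id, body=body)
print(json.loads(body)["messages"][0]["role"])
```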
SpiraTest includes a dedicated Azure OpenAI plugin that lets you connect to your privately deployed LLMs hosted inside Microsoft Azure, accessible via the same REST API as the public ChatGPT (OpenAI) version.
Since each instance of the OpenAI GPT models is hosted privately inside a customer's Azure environment, you can use the private URL for your own instance of Azure OpenAI. This ensures that you keep control over your sensitive data, while at the same time benefiting from the productivity improvements of Generative AI.
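In practice, pointing at a private Azure OpenAI deployment means using your own resource's endpoint in the standard chat completions URL. A minimal sketch follows; the resource name, deployment name, and API version below are all hypothetical placeholders for your own values:

```python
import json
from urllib import request  # imported for illustration; no call is made here

# All three values below are placeholders for your own Azure resources.
RESOURCE = "my-company-openai"   # hypothetical Azure OpenAI resource name
DEPLOYMENT = "gpt-4-test-gen"    # hypothetical model deployment name
API_VERSION = "2024-02-01"       # assumed API version; check your portal

# The chat completions endpoint follows the standard Azure OpenAI URL shape:
url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

payload = json.dumps({
    "messages": [
        {"role": "user",
         "content": "Generate test cases for: As a user I want to "
                    "book a flight between two cities"}
    ]
}).encode("utf-8")

# To actually send the request you would add your key and POST, e.g.:
#   req = request.Request(url, data=payload,
#                         headers={"api-key": "<your key>",
#                                  "Content-Type": "application/json"})
#   with request.urlopen(req) as resp:
#       result = json.load(resp)
print(url)
```

Because the URL is private to your Azure environment, the prompt text and any generated artifacts never pass through the public OpenAI endpoint.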
And if you have any questions, please email us or call us at +1 (202) 558-6885.
To ensure your satisfaction, we provide free product support with every subscription purchase, which guarantees you unlimited access to our knowledge base, customer forums, and helpdesk. Review our support policy.