<rss version="2.0" xmlns:a10="http://www.w3.org/2005/Atom"><channel><title>Inflectra Customer Forums: SpiraTest Best Practices</title><description>Discussions of best practices for using SpiraTest in specific situations / environments for requirements and test management. Please do not post general issues in this forum.</description><language>en-US</language><copyright>(C) Copyright 2006-2026 Inflectra Corporation.</copyright><managingEditor>support@inflectra.com</managingEditor><category domain="http://www.dmoz.org">/Computers/Software/Project_Management/</category><category domain="http://www.dmoz.org">/Computers/Software/Quality_Assurance/</category><generator>KronoDesk</generator><a10:contributor><a10:email>support@inflectra.com</a10:email></a10:contributor><a10:id>http://www.inflectra.com/kronodesk/forums</a10:id><ttl>120</ttl><link>/Support/Forum/spiratest/best-practices/List.aspx</link><item><guid isPermaLink="false">threadId=1</guid><author>Steve M (mike.morrey+support@inflectra.com)</author><title>Multiple Operating System test cases - Reporting</title><description> When testing across several operating systems, we had some challenges deciding how to get the data out of Spira to report on it and see its status. Through a few revisions and different ideas, we still felt there was something more we could have done. What is the best way to set up these tests so that the results can be reported on by operating system and by the actual test case at the same time? For example: I want to see how many times test X was run against Windows 7. I also want to be able to see all the tests run against Windows 7. Thoughts?
Thanks </description><pubDate>Mon, 07 Feb 2011 15:00:35 -0500</pubDate><a10:updated>2011-02-07T18:47:00-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1.aspx</link></item><item><guid isPermaLink="false">threadId=11</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Different ways to manage code review activities in SpiraTest?</title><description> Hello, I manage the development and validation activities of a set of statistical programs in a SpiraTest project, and I have to perform a quality review for each of the programs. What I actually do is create a requirement for each program in my SpiraTest project, all with the same description (the list of points to verify during the code review), and then a test case from each requirement, with the same test steps. Do you think this is the best way to perform code review activities using SpiraTest? Would you have any advice for avoiding creating a requirement for each of the programs? </description><pubDate>Tue, 08 Feb 2011 07:45:47 -0500</pubDate><a10:updated>2011-02-08T18:24:06-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/11.aspx</link></item><item><guid isPermaLink="false">threadId=13</guid><author>Thomas Hartmann (thomas.hartmann@ubidyne.com)</author><title>Multiple Projects Vs. Single Project for Related Systems</title><description> We have an array of related products that share many of the same requirements and test cases. Since SpiraTest treats projects as independent entities, we cannot easily share requirements and test cases between them. Should we consider using a single project?
</description><pubDate>Tue, 08 Feb 2011 19:54:53 -0500</pubDate><a10:updated>2011-03-09T15:07:45-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/13.aspx</link></item><item><guid isPermaLink="false">threadId=254</guid><author>Robert C Mehler (robert.mehler@faro.com)</author><title>Best Practices for Testing Multiple Releases Using SpiraTest</title><description>
1. Create an entry in SpiraTest for release v1.0
2. Write test cases and assign them to release v1.0
3. Create test sets, ensuring they are assigned to release v1.0, and give them to the testers
4. Perform testing
5. When testing has passed, deploy the software
6. Create an entry in SpiraTest for release v2.0
7. Select the test cases from v1.0 that should be carried forward to v2.0 and assign them to release v2.0
8. Deactivate (do not delete) the release v1.0 entry
9. Optional - perform a smoke test
10. Write new test cases to exercise v2.0 functionality, and assign them to release v2.0
11. Augment existing test sets, or create additional test sets, ensuring they are all assigned to release v2.0, and give them to the testers
12. Perform testing
13. When testing has passed, deploy the software
14. Repeat steps 6-13 for future releases, incrementing the release version number each time
</description><pubDate>Mon, 12 Mar 2012 19:09:41 -0400</pubDate><a10:updated>2014-02-24T15:57:27-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/254.aspx</link></item><item><guid isPermaLink="false">threadId=17</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Multiple machine configurations</title><description> Hi, I'm thinking about how to handle multiple machine configurations in SpiraTest. This is the situation: Test case A must be run on all machine configs: 1, 2 and 3. Test case B must be run on config 1; it cannot be run on 2 and 3. Test case C must be run on configs 1 and 2; it cannot be run on 3. What's the best practice here?
Is it to create a folder structure for each machine configuration and clone the test cases that apply to more than one configuration into those folders? Or is there a way of working with custom fields and lists to make this work? I must know, before I run the tests, which configuration is going to be tested in that run, i.e. I want to test one configuration at a time, not mix them. </description><pubDate>Thu, 17 Feb 2011 14:28:19 -0500</pubDate><a10:updated>2025-01-05T17:40:03-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/17.aspx</link></item><item><guid isPermaLink="false">threadId=70</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Including comments on an incident email</title><description> Is there any way to include the comments (or other custom properties) in an incident email?</description><pubDate>Wed, 11 May 2011 11:37:39 -0400</pubDate><a10:updated>2011-05-13T13:52:35-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/70.aspx</link></item><item><guid isPermaLink="false">threadId=73</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>How to add the Automation tab in SpiraTest?</title><description> Hi, I am unable to add the "Automation" tab when creating a new test case. I tried to log in with the admin user and go to the Administration page and Integration, but I did not find Integration. Please suggest how I can add it. </description><pubDate>Tue, 17 May 2011 12:03:05 -0400</pubDate><a10:updated>2011-05-17T13:54:43-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/73.aspx</link></item><item><guid isPermaLink="false">threadId=74</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>How to execute TestComplete scripts from SpiraTest?</title><description> Hi, how do I execute TestComplete scripts from SpiraTest?
Regards </description><pubDate>Tue, 17 May 2011 12:04:05 -0400</pubDate><a10:updated>2018-12-04T02:44:29-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/74.aspx</link></item><item><guid isPermaLink="false">threadId=105</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Configuring Test Type items</title><description> Where can the Administrator create and edit Test Types? I am trialling SpiraTeam using the Online Demo version at present. If we take the product, I would want to add new Test Types in addition to the Functional Test and Regression Test options currently available. Perhaps I'm looking in the wrong places, but I can't see where the Administrator is able to do this. Is this just a limitation of the Demo version? </description><pubDate>Thu, 30 Jun 2011 13:56:22 -0400</pubDate><a10:updated>2011-06-30T16:24:38-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/105.aspx</link></item><item><guid isPermaLink="false">threadId=703</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>test execution with many variables: how to do this efficiently</title><description>&#xD;
Hi, we currently have about 1000 test cases in our project. 20 of them can be executed with 4 different variables (each with its own values): variable A with values YES/NO; variable B with values 1, 2, 3; variable C with values 4, 5, 6; variable D with values YES/NO/Planned (resulting in a total of 20 x 54 = 1080 combinations). The options that I have: create one test case for each combination, but that is not maintainable; work with a custom field for each variable, but only 20 of the 1000 tests use these fields, so that creates too much clutter; or use parameters in the test, but the problem there is that I am not prompted for the parameter until actual execution. Ideally I would be prompted about the parameter the moment I add the test to a test set; at that moment I remember that I should add multiple instances to guarantee that at least some of the combinations are executed. Has anybody had a similar case and found an elegant solution? Ideally, I would like to set a parameter on the test set, so that any test in the test set with a parameter of the same name defined in the test case takes the value from the test set upon execution. That way I could create test sets with specific values for variables A to D, and different combinations would be executed in a controlled way. Thanks for any help! Filip &#xD;
&#xD;
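For reference, the combination count above works out as follows (a plain Python sketch of the enumeration using the values listed in the question; this is not a SpiraTest API, just arithmetic):

```python
from itertools import product

# The four variables from the post above (values copied from the question)
variables = {
    "A": ["YES", "NO"],
    "B": [1, 2, 3],
    "C": [4, 5, 6],
    "D": ["YES", "NO", "Planned"],
}

# Cartesian product: every value combination a single test case could be run with
combinations = list(product(*variables.values()))
print(len(combinations))       # 2 * 3 * 3 * 3 = 54 combinations per test case
print(len(combinations) * 20)  # 54 combinations * 20 test cases = 1080 executions
```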
</description><pubDate>Wed, 04 Sep 2013 08:03:02 -0400</pubDate><a10:updated>2013-09-05T16:26:53-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/703.aspx</link></item><item><guid isPermaLink="false">threadId=710</guid><author>Erwin Husmann (erwin.husmann@casema.nl)</author><title>requirements and multiple products</title><description>&#xD;
Hi, part of our organization works with multiple products (each with its own releases) that share some requirements. How do I manage requirement coverage for this? As requirements can only be linked to one release, I cannot link them to each product (release) they apply to. E.g. requirement A is applicable to product 1 release 2.0 and product 2 release 1.5. How should I organize releases to cope with this? Will Spira 4.1 introduce products in requirements? And if so, how would they relate to releases? Thanks for your reply! Filip &#xD;
&#xD;
</description><pubDate>Thu, 12 Sep 2013 13:42:59 -0400</pubDate><a10:updated>2016-06-28T22:43:07-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/710.aspx</link></item><item><guid isPermaLink="false">threadId=246</guid><author>Gorka Garcia (gorka.garcia@emarsys.com)</author><title>Best way to manage test data with SpiraTest and inside it</title><description> What is the best way to handle test data with SpiraTest? We have a lot of testers, and test data management was quite difficult before Spira. As a new user of the tool, I want to find out how it can help us be more productive and efficient in that area. Testing is mostly manual, and the same data can be used between different testing iterations (rollback to a baseline is possible). But one tester can ruin another's tests by using the same test data. I actually have two ideas, which might not be new in this forum. 1. Create a new incident type called Test Data and give it the states "Open", "Reserved" and "Blocked", with some attributes of its own. Can I then link my test cases to those new Test Data elements? 2. Use test case elements themselves as test data elements by grouping those "special test cases" into a Test Data directory, and inside it into different kinds of test data directories such as Persons, Companies, Estates, Accounts etc. Use the default fields, e.g. if Owner is filled in, that is the indication that the test data is reserved. Then link the real test case elements with the data-type test case elements. Is that possible? And at execution time, can you see the linked element as well? 3. Or is there some better way?
Olli  </description><pubDate>Fri, 24 Feb 2012 22:05:55 -0500</pubDate><a10:updated>2013-09-23T13:35:11-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/246.aspx</link></item><item><guid isPermaLink="false">threadId=350</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>How to identify test cases, that are not assigned to a release</title><description>&#xD;
Hi there, I am looking for an easy way to identify the test cases that are not assigned to a release, for the purpose of mass-assigning them with "Tools". Unfortunately, there is no column in test cases that I could filter on; there is just the drop-down to restrict the selection to a particular release or all releases, but not to unassigned ones. I have tried the "Test Case Detailed Report" with details on "Release", but it has the same selection restrictions as mentioned above. Any ideas? Thanks, Florian &#xD;
&#xD;
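One scripted workaround, assuming the test case list can be exported somewhere with its release assignment (the field names below are purely illustrative, not actual SpiraTest export columns): filter for rows whose release field is empty, then mass-assign just those IDs.

```python
# Hypothetical exported rows: "release" is None when the test case is unassigned.
# These field names are illustrative only, not SpiraTest's actual column names.
test_cases = [
    {"id": "TC1", "name": "Login works",     "release": "1.0.0"},
    {"id": "TC2", "name": "Logout works",    "release": None},
    {"id": "TC3", "name": "Password reset",  "release": "1.1.0"},
    {"id": "TC4", "name": "Session timeout", "release": None},
]

# The candidates for mass-assignment via the Tools menu
unassigned = [tc["id"] for tc in test_cases if tc["release"] is None]
print(unassigned)
```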
</description><pubDate>Tue, 21 Aug 2012 12:00:35 -0400</pubDate><a10:updated>2012-08-22T20:48:07-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/350.aspx</link></item><item><guid isPermaLink="false">threadId=420</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Do SpiraTest provide callback API that can be invoked by auto test tools?</title><description>&#xD;
Does SpiraTest provide a callback API that can be invoked by automated test tools such as QTP?&#xD;
&#xD;
</description><pubDate>Wed, 07 Nov 2012 06:09:35 -0500</pubDate><a10:updated>2012-11-07T20:04:33-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/420.aspx</link></item><item><guid isPermaLink="false">threadId=820</guid><author>Roland Böni (roland.boni@fara.no)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">assignments</category><title>Multiple testers for a test case</title><description>&#xD;
Hello,&#xD;
&#xD;
In our test organization, one tester may pass some test steps and another may want to finish the last step. Nowadays we use two test cases, one per tester, but we have difficulty automatically passing information from one test case to the other. Does anyone have the same problem? How did you solve it? Thanks for helping us! Julien Delvoye. </description><pubDate>Mon, 10 Feb 2014 10:50:49 -0500</pubDate><a10:updated>2014-02-21T14:06:48-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/820.aspx</link></item><item><guid isPermaLink="false">threadId=500</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Setting up Test Phases</title><description>Hi, I'm new to SpiraTest and was wondering how you set up separate testing phases. How do you ensure that the next phase doesn't start until the previous phase is complete? </description><pubDate>Mon, 11 Feb 2013 13:11:28 -0500</pubDate><a10:updated>2013-02-13T20:24:37-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/500.aspx</link></item><item><guid isPermaLink="false">threadId=618</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Adding Tests based on the Requirements to a release/iteration</title><description>&#xD;
Hello dear reader ;) I added some requirements to the requirements list and added a test for each requirement. With this I can see the coverage and test progress for each requirement, which is good. Now I add requirements to a release or iteration and would expect the test information to be linked as well. But when I create a test set from that release or iteration, it is just empty. Also, in the release or iteration list, no test coverage is shown. Am I doing something wrong, or does the tool not support this way of using the requirement-test linking information? It feels redundant and risky to me to have to add the tests again and not be able to rely on the requirement-test linking in the first place. Thanks for any kind of feedback. Cheers, Werner &#xD;
&#xD;
</description><pubDate>Mon, 17 Jun 2013 15:25:46 -0400</pubDate><a10:updated>2013-06-20T09:51:48-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/618.aspx</link></item><item><guid isPermaLink="false">threadId=689</guid><author>Charles R Heathcote (charles.heathcote@bigpond.com)</author><title>Needed: backward referenz from the linked testcase to all testcases using it - and how to reset execution status</title><description>&#xD;
Dear all, I have to implement a test case "lifecycle"... which for me is: enter (1) an instruction, modify (2), verify (3), and so on up to processed (9) or deleted (10). Full details are not important here. The point is that I have lifecycle test cases for different instruction types. The first step, how to enter (1), is different... but all the other steps (2) up to (10) are exactly the same. Copying the test case many times is a bad option in my opinion, as maintenance, e.g. changes to the description or reordering of steps, would take huge effort. -&gt; So I decided to create a "reusable test case component" (it's just a normal test case) which includes all steps (2) up to (10), and then I created one test case for each instruction type with a specific step (1), and as step (2) I linked to the reusable component. This gets me all the other steps needed. Up to that point, fine for me. :o) Now I have 2 issues (and hope somebody can help): - For changes to the "reusable component" it would be useful to know in which test cases the component is linked (as the change will affect all those test cases). I am able to see this from the test case that links the component, but not the other way round. - After executing a test case that includes the linked test case component, both the test case and the component are marked as e.g. passed. Fine. But after deleting the test run, the test case itself is reset for retest, BUT the "component" is not. Any ideas? Thanks for your help. Best regards, Oliver &#xD;
&#xD;
</description><pubDate>Fri, 23 Aug 2013 11:40:39 -0400</pubDate><a10:updated>2016-07-21T22:38:00-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/689.aspx</link></item><item><guid isPermaLink="false">threadId=690</guid><author>Jon Freed (jfreed@edmap.com)</author><title>Workflow transition rules are not valid in list view</title><description>&#xD;
Dear all, for my project I implemented my own (reduced) workflow, which was very straightforward and easy to do... and it works fine in the full detail views. In the list view for all incidents I can edit the data as well, but there I am able to enter ANY status, and Spira does not enforce the defined transitions. Any ideas? Best regards, Oliver &#xD;
&#xD;
</description><pubDate>Fri, 23 Aug 2013 11:47:57 -0400</pubDate><a10:updated>2015-02-09T15:17:14-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/690.aspx</link></item><item><guid isPermaLink="false">threadId=691</guid><author>David J (adam.sandman+support@inflectra.com)</author><title>Select all incidents if custom property "Tool ID" (Integer) is filled</title><description>&#xD;
Dear all, my last question... I defined my own custom property called "Tool ID". For some incidents we might have duplicates in an "other world", so we would like to enter a reference number there. Now I would like to get an overview of all incidents which have the "Tool ID" field filled with any value, not just one specific value. A report would be fine; any other ideas are welcome as well. Thanks, Oliver &#xD;
&#xD;
</description><pubDate>Fri, 23 Aug 2013 11:57:21 -0400</pubDate><a10:updated>2013-08-26T03:28:11-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/691.aspx</link></item><item><guid isPermaLink="false">threadId=1699</guid><author>David J (adam.sandman+support@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">invisible open fields</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> </category><title>It is difficult to see the open fields in the detail pages</title><description> Hi, our users complain that it is difficult to see which fields are open - they would like a border around the open fields, or for the open fields to have a different colour from the background. Will you develop a feature that lets the admin configure this? </description><pubDate>Tue, 13 Jun 2017 13:00:59 -0400</pubDate><a10:updated>2017-06-15T14:49:58-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1699.aspx</link></item><item><guid isPermaLink="false">threadId=746</guid><author>Piotr W Janicki (peterjanicki@gmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">screenshot</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> parameter</category><title>Screenshot as Parameter</title><description> Image as parameter
I would like to use a screenshot as a parameter for expected results.

For the same test case in two test sets I will have two different expected results, i.e. different dialogs for the user. Is there a way to add an image as a parameter? I can add a URL link to an attachment as a parameter, but then you see just the link, not the image. </description><pubDate>Sat, 02 Nov 2013 08:37:57 -0400</pubDate><a10:updated>2013-11-04T00:44:32-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/746.aspx</link></item><item><guid isPermaLink="false">threadId=763</guid><author>Inflectra Sarah (donotreply6@kronodesk.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">quality</category><title>Test quality</title><description>&#xD;
Hello &#xD;
&#xD;
There is no specific quality control team in my organisation, unlike the big giants who do have dedicated support in this area. In my situation, I am dealing with a third-party supplier, a start-up organisation who are developing a software system. As the technical lead on the project, I am managing the test environment in SpiraTeam. As I have not witnessed any quality control mechanisms in place on the supplier's end, I want to define a quality dimension for the project to evaluate the system quality and the test quality. I am seeking advice on whether there are ways of implementing this quality dimension within SpiraTeam, and on any other best practices you use to evaluate these parameters. Thanks for your time in advance. Regards, Praveen </description><pubDate>Wed, 20 Nov 2013 15:34:37 -0500</pubDate><a10:updated>2013-11-26T15:10:13-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/763.aspx</link></item><item><guid isPermaLink="false">threadId=790</guid><author>Michael Long (michaellong77@gmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">perl</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> api</category><title>Connecting to Spiratest with PERL</title><description>&#xD;
Hi,&#xD;
&#xD;
Can you give an example of how to connect to SpiraTest with Perl? This is what I have:

use SOAP::Lite;

my $user = "username";
my $pass = "password";
my $schema_uri = 'http:// .com/SpiraTest/Services/v3_0/ImportExport.svc?wsdl';
my $client = SOAP::Lite-&gt;new;
my $response = $client-&gt;service($schema_uri)
                      -&gt;Connection_Authenticate($user, $pass);

I get this error:

Service description '.../SpiraTest/Services/v3_0/ImportExport.svc?wsdl' can't be loaded: 503 Service Unavailable

Thanks for your time </description><pubDate>Wed, 18 Dec 2013 14:41:11 -0500</pubDate><a10:updated>2013-12-23T16:03:59-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/790.aspx</link></item><item><guid isPermaLink="false">threadId=823</guid><author>Cuong Nguyen (cuong.nguyen@s3corp.com.vn)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">junit</category><title>Spiratest and JUnit Automated Tests</title><description>&#xD;
I'm trying to set up a connection between SpiraTest and my automated tests (Thucydides/Selenium/JUnit), but I have no connection so far. If I debug my test, it doesn't reach the code that is responsible for the connection:

import static org.junit.Assert.*;

import junit.framework.JUnit4TestAdapter;

import org.junit.Test;
import org.junit.runner.JUnitCore;

import com.inflectra.spiratest.addons.junitextension.SpiraTestCase;
import com.inflectra.spiratest.addons.junitextension.SpiraTestConfiguration;
import com.inflectra.spiratest.addons.junitextension.SpiraTestListener;

@SpiraTestConfiguration(
    url = "http://XXXX/SpiraTest",
    login = "administrator",
    password = "XXXX",
    projectId = 12,
    releaseId = -1,
    testSetId = -1
)
public class Test1 {

    @Test
    @SpiraTestCase(testCaseId = 91)
    public void IsStandardArticlePresent() {
        boolean status = false;
        assertTrue(status);
    }

    public static void main(String[] args) {
        // Instantiate the JUnit core
        JUnitCore core = new JUnitCore();
        // Add the custom SpiraTest listener
        core.addListener(new SpiraTestListener());
        System.out.println("Test main");
        // Finally run the test fixture
        core.run(Test1.class);
    }

    public static junit.framework.Test suite() {
        return new JUnit4TestAdapter(Test1.class);
    }
}
&#xD;
&#xD;
</description><pubDate>Wed, 12 Feb 2014 12:57:46 -0500</pubDate><a10:updated>2014-07-24T06:47:52-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/823.aspx</link></item><item><guid isPermaLink="false">threadId=975</guid><author>Bart Van Raemdonck (bart.vanraemdonck@gmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">sprint</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> release</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> requirements</category><title>Working with releases/sprints</title><description>
Hi, in our project we are working with releases and sprints. Each release consists of some sprints. Is there a way to add requirements to a sprint AND a release? Right now our requirements are attached to a sprint, but if I want an overview of a whole release, there is no way to do that. Kind regards, Bart

</description><pubDate>Wed, 20 Aug 2014 07:42:40 -0400</pubDate><a10:updated>2014-09-03T14:11:57-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/975.aspx</link></item><item><guid isPermaLink="false">threadId=809</guid><author>Thomas Forest (thomas.forest@bicworld.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">assignments</category><title>Tests set and Multiple users</title><description>Hello, I am a beginner with SpiraTest. With the documentation I understood how to create requirements and test cases, and I created a test set attached to test cases. Now I want to assign this test set to many users so they can run it. I have searched a lot; I am sure it is obvious. Please help! :) Thanks, Thomas &#xD;
&#xD;
</description><pubDate>Thu, 23 Jan 2014 16:01:34 -0500</pubDate><a10:updated>2014-01-24T11:09:19-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/809.aspx</link></item><item><guid isPermaLink="false">threadId=964</guid><author>David J (adam.sandman+support@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">automated testing</category><title>Running automated TC's from Rapise in SpiraTest</title><description>&#xD;
&#xD;
Hello,&#xD;
&#xD;
I have installed SpiraTest and Rapise (trial version). I've created several test cases in SpiraTest and configured the relationship between SpiraTest &amp; Rapise - all is fine. However, when I try to [Execute] a test case from Spira, it generates the test set launch (on my local machine), but I receive an error message: "The server could not be contacted to retreive test information. Please go into the settings and verify the server and user information, then run the test file again." I've checked all the settings -&gt; all is fine. Any ideas? Br, Daniel Condurache &#xD;
&#xD;
</description><pubDate>Fri, 08 Aug 2014 09:58:28 -0400</pubDate><a10:updated>2014-08-13T18:17:01-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/964.aspx</link></item><item><guid isPermaLink="false">threadId=1516</guid><author>Kat A (elise.brooks@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">test cases</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> test sets</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> owners</category><title>Assign test cases within a test set for execution</title><description>&#xD;
Hi,&#xD;
&#xD;
I have some testers who have created a test set containing ~40 test cases. They've executed the test set to begin the process, but they want only certain test cases within the test set to be accessible to certain users. We've tried setting the owner of a test case (in the test set overview) to that user, to see if it would allow them to resume the test set on that test case, but the resume option isn't showing up for that tester on their home page. Is there a way we can do this? A quick example:

Test Case Name | Owner  | Status
Test Case 1    |        | Passed
Test Case 2    |        | Failed
Test Case 3    |        | Passed
Test Case 4    | Tester | Not Run

In this example Test Case 4 is set to Tester, but they can't resume the test set that has already been started. </description><pubDate>Fri, 29 Jul 2016 13:29:59 -0400</pubDate><a10:updated>2017-03-14T14:41:57-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1516.aspx</link></item><item><guid isPermaLink="false">threadId=1366</guid><author>Inflectra Sarah (donotreply6@kronodesk.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">erp testing</category><title>What are best practices for testing processes in ERP environment</title><description> &#xD;
&#xD;
&#xD;
We have started using SpiraTest as the test tool for our ERP upgrade activities. As part of the testing, we also execute integration tests where we track a specific transaction through its different stages (e.g. submit a purchase order to a supplier, register the material receipt for this purchase order, enter the payables invoice, match the invoice to the receipt, pay the invoice). One way to put this in SpiraTest is to define a test set, where the test set represents the complete process. Then we assign individual test cases to this test set, where each test case represents a specific process step (e.g. Create Purchase Order), and optionally we assign these test cases to different individuals/roles so the process can flow from Procurement to Materials to Finance. Are there maybe other ways of configuring these process tests that somebody has come across in their test environment? </description><pubDate>Tue, 16 Feb 2016 13:42:01 -0500</pubDate><a10:updated>2022-10-03T17:40:51-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1366.aspx</link></item><item><guid isPermaLink="false">threadId=1367</guid><author>Inflectra Sarah (donotreply6@kronodesk.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">test execution</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> owners</category><title>How to notify next tester that a test case has been completed</title><description> &#xD;
&#xD;
&#xD;
When there are multiple test cases assigned to a test set, and the test cases are assigned to different testers, is there an easy way for a tester to be notified by email that the previous test case has been completed? Thanks </description><pubDate>Tue, 16 Feb 2016 13:44:02 -0500</pubDate><a10:updated>2016-02-23T18:03:42-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1367.aspx</link></item><item><guid isPermaLink="false">threadId=1345</guid><author>Inflectra Sarah (donotreply6@kronodesk.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">test conditions</category><title>Test Conditions</title><description>In order to comply with some testing standards, we need to introduce test conditions into our projects and want to incorporate this into Spira. Is there currently a best-practice way of doing this? Is it possible to add another option alongside test cases, sets and runs? Or is it possible to modify the 4th option (Automation Hosts) to a similar setup as test cases? We are currently running SpiraTeam v4.2.0.8. Thanks </description><pubDate>Wed, 13 Jan 2016 18:10:39 -0500</pubDate><a10:updated>2016-03-10T15:45:03-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1345.aspx</link></item><item><guid isPermaLink="false">threadId=1548</guid><author>Ashutosh Singh (sammuegal@hotmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">keyboard shortcuts</category><title>Shortcut to Save the test case updates, add new step, save the current step</title><description>&#xD;
Please let us know if there are any keyboard shortcuts to save test case updates, add a new step, or save the current step. It would be very helpful if there are. &#xD;
&#xD;
</description><pubDate>Thu, 06 Oct 2016 07:02:57 -0400</pubDate><a10:updated>2020-04-17T09:48:18-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1548.aspx</link></item><item><guid isPermaLink="false">threadId=1494</guid><author>Test Management (QualityAssurance@tryg.dk)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">installation</category><title>Upgrade from 4.2.0.10 to 5.0.0.5</title><description> &#xD;
&#xD;
&#xD;
Hello! I am currently running SpiraTest version 4.2.0.10 and preparing to upgrade to 5.0.0.5. Is there any guide or best practices to follow, or is it pretty straightforward? I have connected the application to an external database running on another server; does the database need any adjustments? I am new to this forum, so please do not hesitate to ask if you need more information. </description><pubDate>Tue, 12 Jul 2016 11:07:32 -0400</pubDate><a10:updated>2018-11-15T19:06:34-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1494.aspx</link></item><item><guid isPermaLink="false">threadId=1593</guid><author>manali kadam (kadammanali987@gmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">test sets</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> test cases</category><title>Creating Test Sets from multiple Test Case folders</title><description>Hi,  We have an extensive array of folders and subfolders for test cases covering different parts of different applications. Manually creating a test set for each subfolder is a very time-consuming task. Is there some way to create the same test set structure from the test case folders?  Thanks, Filip &#xD;
&#xD;
</description><pubDate>Wed, 21 Dec 2016 14:54:51 -0500</pubDate><a10:updated>2019-01-29T06:35:05-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1593.aspx</link></item><item><guid isPermaLink="false">threadId=2557</guid><author>David J (adam.sandman+support@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">datasync</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> jira</category><title>Is it possible to map multiple Jira statuses to the same Requirement status?</title><description> Dear all,  I am working in a Kanban project with Jira; the requirements are stored in Jira, and the Kanban process has several stages. I have set up a one-way DataSync server (issues are only imported from Jira into SpiraTest) and it works fine; I am able to import Jira issues as requirements. Now I would like to map the Jira issue status to the requirement status. Since several different Jira statuses have the same meaning for the testing project, I would like to map them all to the same SpiraTest requirement status. Is that possible somehow? If I try to set the Jira IDs as a comma-separated list, it does not work for me.  Thank you very much for your support!
Kind regards,  Istvan </description><pubDate>Fri, 14 Jan 2022 15:29:48 -0500</pubDate><a10:updated>2024-08-16T17:57:32-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2557.aspx</link></item><item><guid isPermaLink="false">threadId=1980</guid><author>Jim R (donotreply5@kronodesk.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">backwards compatibility</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> cross project</category><title>Test Cases which involve multiple software projects</title><description> We have what we believe to be a very common situation: a server which provides various REST APIs; various client apps (iOS, Android, Web, Desktop) which communicate with the server; the server and the four client apps are technically 5 different software projects with their own releases and builds; we use test automation; most of our Test Cases involve at least two of the above software projects (server + one of the apps), e.g. the login tests, where the apps are expected to check the credentials against the server; some Test Cases involve three or more of the above software projects, e.g. the iOS app sending a broadcast which should be received by the three other apps; and we want to test all kinds of compatibility scenarios, such as: can iOS app v18.3 log in against server v19.1? Can Android app 18.3 receive a broadcast sent from iOS app 19.2?  How would you do the above in SpiraTest? We're quite excited about the Test Configuration feature, where we could use Test Case parameters like Client1, Client2, Server and create custom lists with all the currently supported app and server versions, but then we miss out on the traceability via Release and Build.  Alternatively, we could account for the fact that each of our 5 software projects above has its own releases and builds by creating 5 corresponding SpiraTest projects.
However, how would we deal with Test Cases which involve multiple software projects (see above)? We could use the cross-project feature introduced in SpiraTest 5.1 by expressing our expectations for such multi-project scenarios as SpiraTest Requirements, copying the same Test Case into all SpiraTest projects representing the involved software projects, and linking all those Test Cases to the one Requirement. However, how would we execute all those Test Cases? Using automation, we're scheduling Test Sets and having RemoteLaunch execute them. How would that work if we're running a test which involves multiple software projects, represented by multiple Test Case clones across the corresponding SpiraTest projects?  Any suggestions very welcome! </description><pubDate>Fri, 01 Feb 2019 07:04:49 -0500</pubDate><a10:updated>2020-07-29T11:09:50-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1980.aspx</link></item><item><guid isPermaLink="false">threadId=1884</guid><author>Jim R (donotreply5@kronodesk.net)</author><title>Best way to group test cases for multiple applications into different areas?</title><description> I have 60 applications to test. A few are big enough that I can split each into a separate SpiraTest project, but many are not so big, and all of them are in one huge bag. I split those applications into the different areas they relate to: Area 1 - app A, app B; Area 2 - app C, app B; Area 3 - app D. Sometimes app B relates to Area 2 as well (with slightly different requirements). My question is: how does a single tester (the person who will do the testing, who has no previous experience with SpiraTest) know which requirements/test cases to test? Using a simple filter by owner won't work... Maybe group them into Test Sets, so that all test cases related to one application are in one Test Set? But then I will have 20 Test Sets just for the applications, plus other general tests like login, and it will be confusing.  Any suggestions?
Regards  Aleksandra </description><pubDate>Mon, 16 Jul 2018 14:23:48 -0400</pubDate><a10:updated>2018-07-19T11:30:08-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1884.aspx</link></item><item><guid isPermaLink="false">threadId=1974</guid><author>Atif Hafeez (hafeez.atif@gmail.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">spira</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> backup</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> migrate</category><title>How to backup Spira projects</title><description> One of our clients is planning to migrate from Spira to Jira, so we need to either port or back up the data.  1. Do we have any tool which can help in migrating the project data (requirements, test cases, attachments and incidents) to Jira?  2. If a Jira migration tool is not available, is there a way to back up the complete project data to MS Word or HTML for future offline reference (without Spira)?  Please suggest.  Thanks,  Jagadish </description><pubDate>Wed, 23 Jan 2019 06:18:27 -0500</pubDate><a10:updated>2019-03-07T11:20:58-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/1974.aspx</link></item><item><guid isPermaLink="false">threadId=2132</guid><author>Ilia Poliakov (ilya.polyakov@edetek.com)</author><title>Monthly Server Testing</title><description> I have almost 150 servers that are regularly patched and tested after patching. I am planning an initial round of updates/testing for all 150 servers, and will subsequently do regular monthly updates for subsets of them. I would like to set up Releases in SpiraTest, add the servers to the appropriate release, associate test scenarios and scripts with the servers, test and document the results, and then reuse the test scripts in the next release. I tried setting up each server as a requirement, but once a requirement is added to a release it cannot be added to another release. Do I load up the set of servers again for the next release? Or would it be better to create the servers as a custom list and add them to releases? Or is using components a better pathway? Has anyone else set up repetitive testing for a set of servers? </description><pubDate>Fri, 27 Dec 2019 16:19:47 -0500</pubDate><a10:updated>2022-02-23T20:26:20-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2132.aspx</link></item><item><guid isPermaLink="false">threadId=2111</guid><author>Kat A (elise.brooks@inflectra.com)</author><title>Notification event based on test run reassignment</title><description> Hello,  Is it possible to trigger a notification event by reassigning a test run? My client would like to have a test set executed by multiple operators, and I'm scratching my head about how to do this in SpiraTest.
Thank you for your help </description><pubDate>Wed, 20 Nov 2019 12:22:11 -0500</pubDate><a10:updated>2019-11-20T18:30:38-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2111.aspx</link></item><item><guid isPermaLink="false">threadId=2565</guid><author>Thomas Sheaffer (Thomas.Sheaffer@hmhs.com)</author><title>Same test case - Multiple pre-requisites</title><description> On a suggestion from another forum post, I'm trying to avoid generating a ton of test cases with the same steps but different data, by using parameters and adding the same test case to a test set multiple times, editing the parameters for each instance.  I can't figure out how to insert images here, so let me try to explain with an example.  Test case - Move a trailer from the yard to a dock door; the parameters are trailerID and dockdoor.  It works fine adding multiple cases to a test set and changing the parameters, but the parameters are not visible during test execution, and if I go to Test Case -&gt; Test Runs, I cannot show a column called Parameters, so I get no overview of which cases are related to which parameters.  Is there a way to show parameters during test execution, and also a way to get an overview of test runs vs. parameters?
</description><pubDate>Wed, 02 Feb 2022 09:33:19 -0500</pubDate><a10:updated>2022-07-22T17:07:14-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2565.aspx</link></item><item><guid isPermaLink="false">threadId=2738</guid><author>David J (adam.sandman+support@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">test case linkage</category><title>How to delete the linkage of a test case from a Test Set</title><description> Hi, I am using SpiraPlan version 7.2.0.0. By mistake, I linked a Test Case in a Test Set to a test case that is in another Release. Having one test case in two different Test Sets is not allowed, because if the test case is executed from one Test Set it will appear as Executed in the other one as well.  I was told to create the same test case (with the same name and details) in the correct folder and location of the Test Cases artifact, link it to the Test Set it should belong to, and afterwards delete the linkage of the duplicate from the Test Set.  Do you know how to do this in SpiraPlan? </description><pubDate>Wed, 04 Jan 2023 19:16:38 -0500</pubDate><a10:updated>2023-01-04T21:06:51-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2738.aspx</link></item><item><guid isPermaLink="false">threadId=2891</guid><author>David J (adam.sandman+support@inflectra.com)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">different products</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> different programs</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> combined overview</category><title>Combine the overview of test set status across different products</title><description> How can we create an overview / one-stop shop to see the test set and test execution status, and the incidents, from different products / programs in SpiraTest?
We have different teams working on different products, each defining its test cases in a different product / program in SpiraTest. But we perform the testing together against the same release, and would like the reporting on a single page, to get the whole overview of the release in one place. </description><pubDate>Wed, 17 Jan 2024 16:32:33 -0500</pubDate><a10:updated>2024-01-23T13:44:45-05:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2891.aspx</link></item><item><guid isPermaLink="false">threadId=2897</guid><author>kim vijle (kvij@idirect.net)</author><category domain="http://www.inflectra.com/kronodesk/thread/tag">filter</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> test set</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> incident</category><category domain="http://www.inflectra.com/kronodesk/thread/tag"> test case</category><title>Filter function to add test cases with open incidents to a test set</title><description> For the next sprint we plan to retest only the test cases that failed before but have already been fixed. Is there any filter function to do that, so I can select all test cases which are linked to incidents with status Closed or Verification, and just add them all to a new test set? At the moment I have to check all the failed test cases manually one by one, go to each test run and check the incident, and only then can I add them to the new test set. </description><pubDate>Wed, 24 Jan 2024 07:28:18 -0500</pubDate><a10:updated>2024-08-16T17:56:04-04:00</a10:updated><link>/Support/Forum/spiratest/best-practices/2897.aspx</link></item></channel></rss>