Overview
Many of our customers seek to integrate automated test case execution with Spira so that the results are automatically reflected as new test runs.
In this example, we consider the case where a team has a Cucumber-based automation framework. In Cucumber, you can tag each test case (scenario). You then need to map these tags to the corresponding test cases in Spira dynamically, so that once you execute the regression pack, the results are automatically updated in Spira against each test case.
Consider the following example Test Case mapping between Spira and Cucumber:
Test case in Spira | Test case in Cucumber feature | Result |
---|---|---|
TC_1 | tag name - CheckLogin | Pass |
TC_2 | tag name - CheckLogout | Pass |
TC_3 | tag name - CreateEmployee | Pass |
TC_4 | tag name - UpdateEmployee | Fail |
TC_5 | tag name - DeleteEmployee | Fail |
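One simple way to maintain this mapping in code is a lookup table from Cucumber tag name to Spira test case id. The following is a minimal sketch; the tag names and numeric ids are hypothetical and should be replaced with the values from your own project:

import java.util.HashMap;
import java.util.Map;

// A minimal sketch of the tag-to-test-case mapping shown in the table above.
// The tag names and Spira test case ids are examples only.
public class SpiraTestCaseMap {
    private static final Map<String, Integer> TAG_TO_TEST_CASE = new HashMap<>();
    static {
        TAG_TO_TEST_CASE.put("CheckLogin", 1);      // TC_1
        TAG_TO_TEST_CASE.put("CheckLogout", 2);     // TC_2
        TAG_TO_TEST_CASE.put("CreateEmployee", 3);  // TC_3
        TAG_TO_TEST_CASE.put("UpdateEmployee", 4);  // TC_4
        TAG_TO_TEST_CASE.put("DeleteEmployee", 5);  // TC_5
    }

    /** Returns the Spira test case id for a Cucumber tag, or null if unmapped */
    public static Integer testCaseIdForTag(String tagName) {
        return TAG_TO_TEST_CASE.get(tagName);
    }
}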
Solution
The solution is to add code to your test framework so that, once the Cucumber scripts finish, the results are dynamically sent back to Spira.
The correct REST API call you should make is the following:
POST: projects/{project_id}/test-runs/record
Description
Records the results of executing an automated test.
You need to use this overload when you want to be able to set Test Run custom properties.
How to Execute
To access this REST web service, you need to use the following URL, making sure to replace any parameters (e.g. {project_id}) with the relevant value (e.g. 1):
http://api.inflectra.com/Spira/Services/v5_0/RestService.svc/projects/{project_id}/test-runs/record
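For example, to record a test run in project 1 on the hosted demo server, you would POST to: http://api.inflectra.com/Spira/Services/v5_0/RestService.svc/projects/1/test-runs/record. Note that you also need to supply the credentials of the user recording the run; the sample code below passes the username and api-key as querystring parameters.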
Request Parameters
Name | Description |
---|---|
project_id | The id of the current project |
Request Body
Property | Description |
---|---|
TestRunFormatId | The format of the automation results (1=Plain Text, 2=HTML) stored in the 'RunnerStackTrace' field |
RunnerName | The name of the external automated tool that executed the test |
RunnerTestName | The name of the test case as it is known in the external tool |
RunnerAssertCount | The number of assertions/errors reported during the automated test execution |
RunnerMessage | The summary result of the test case |
RunnerStackTrace | The detailed trace of test results reported back from the automated testing tool |
AutomationHostId | The id of the automation host that the result is being recorded for |
AutomationEngineId | The id of the automation engine that the result is being recorded for |
AutomationEngineToken | The token of the automation engine that the result is being recorded for (read-only) |
AutomationAttachmentId | The id of the attachment that is being used to store the test script (file or url) |
Parameters | The list of test case parameters that have been provided |
ScheduledDate | The datetime the test was scheduled for |
TestRunSteps | The list of test steps that comprise the automated test. These are optional for automated test runs. The status of the test run steps does not change the overall status of the automated test run; they are used simply to make reporting clearer inside the system. They will also update the status of the appropriate Test Step(s) if a valid test step id is provided. |
TestRunId | The id of the test run |
Name | The name of the test run (usually the same as the test case) |
TestCaseId | The id of the test case that the test run is an instance of |
TestRunTypeId | The id of the type of test run (automated vs. manual) |
TesterId | The id of the user that executed the test; the authenticated user is used if no value is provided |
ExecutionStatusId | The id of the overall execution status for the test run (Failed = 1; Passed = 2; NotRun = 3; NotApplicable = 4; Blocked = 5; Caution = 6) |
ReleaseId | The id of the release that the test run should be reported against |
TestSetId | The id of the test set that the test run should be reported against |
TestSetTestCaseId | The id of the unique test case entry in the test set |
StartDate | The date/time that the test execution was started |
EndDate | The date/time that the test execution was completed |
BuildId | The id of the build that the test was executed against |
EstimatedDuration | The estimated duration of how long the test should take to execute (read-only); this field is populated from the test case being executed |
ActualDuration | The actual duration of how long the test took to execute (read-only); this field is calculated from the start/end dates provided during execution |
ProjectId | The id of the project that the artifact belongs to; the current project is always used for Insert operations for security reasons |
ConcurrencyDate | The datetime used to track optimistic concurrency to prevent edit conflicts |
CustomProperties | The list of associated custom properties/fields for this artifact |
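Putting these fields together, a minimal request body for recording a passed run of the second test case might look like the following (the ids and timestamps are illustrative; the dates use the serialized /Date(milliseconds-offset)/ format that the sample code below produces):

{
  "TestRunFormatId": 1,
  "RunnerName": "Cucumber",
  "RunnerTestName": "CheckLogout",
  "RunnerAssertCount": 1,
  "RunnerMessage": "Test passed",
  "RunnerStackTrace": "",
  "TestCaseId": 2,
  "ExecutionStatusId": 2,
  "StartDate": "/Date(1585000000000-0000)/",
  "EndDate": "/Date(1585000060000-0000)/"
}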
Sample Code
For those using Java to execute the Cucumber scripts, you can use the following Java sample code to send back the results:
import com.google.gson.Gson;

import java.io.DataOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.*;

/**
 * This defines the 'SpiraTestExecute' class that provides the Java facade for
 * calling the REST web service exposed by SpiraTest
 *
 * @author Inflectra Corporation
 * @version 5.0.0
 */
public class SpiraTestExecute {
    /**
     * The URL appended to the base URL to access REST. Note that it ends with a slash
     */
    private static final String REST_SERVICE_URL = "/services/v5_0/RestService.svc/";

    public String url;
    public String userName;
    public String token;
    public int projectId;

    SpiraTestExecute(String url, String userName, String token, int projectId) {
        this.userName = userName;
        this.token = token;
        this.url = url;
        this.projectId = projectId;
    }
    /**
     * Performs an HTTP POST request to the specified URL
     *
     * @param input The URL to perform the query on
     * @param body  The request body to be sent
     * @throws IOException if the connection fails or the server returns an error
     */
    public static void httpPost(String input, String body) throws IOException {
        URL url = new URL(input);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        //allow sending a request body
        connection.setDoOutput(true);
        connection.setRequestMethod("POST");
        //have the connection send and retrieve JSON
        connection.setRequestProperty("accept", "application/json; charset=utf-8");
        connection.setRequestProperty("Content-Type", "application/json; charset=utf-8");
        connection.connect();
        //used to send data in the REST request
        DataOutputStream outputStream = new DataOutputStream(connection.getOutputStream());
        //write the body to the stream
        outputStream.writeBytes(body);
        //send the OutputStream to the server
        outputStream.flush();
        outputStream.close();
        //read the response to force the request to complete; an IOException
        //is thrown here if the server returned an error status
        connection.getInputStream();
    }
    /**
     * Records a test run
     *
     * @param testCaseId        The test case being executed
     * @param releaseId         The release being executed against (optional)
     * @param testSetId         The test set being executed against (optional)
     * @param startDate         When the test run started
     * @param endDate           When the test run ended
     * @param executionStatusId The status of the test run (pass/fail/not run)
     * @param runnerName        The name of the automated testing tool
     * @param runnerTestName    The name of the test as stored in JUnit
     * @param runnerAssertCount The number of assertions
     * @param runnerMessage     The failure message (if appropriate)
     * @param runnerStackTrace  The error stack trace (if any)
     */
    public void recordTestRun(int testCaseId, Integer releaseId, Integer testSetId, Date startDate,
            Date endDate, int executionStatusId, String runnerName, String runnerTestName, int runnerAssertCount,
            String runnerMessage, String runnerStackTrace) {
        String url = this.url + REST_SERVICE_URL + "projects/" + this.projectId
                + "/test-runs/record?username=" + this.userName + "&api-key=" + this.token;
        Gson gson = new Gson();
        //create the body of the request, using Gson to escape the string values as valid JSON
        String body = "{\"TestRunFormatId\": 1, \"RunnerName\": " + gson.toJson(runnerName) + ",";
        body += "\"RunnerTestName\": " + gson.toJson(runnerTestName) + ",";
        body += "\"RunnerStackTrace\": " + gson.toJson(runnerStackTrace) + ",";
        body += "\"StartDate\": \"" + formatDate(startDate) + "\", \"EndDate\": \"" + formatDate(endDate) + "\",";
        body += "\"ExecutionStatusId\": " + executionStatusId + ", \"RunnerAssertCount\": " + runnerAssertCount + ",";
        body += "\"RunnerMessage\": " + gson.toJson(runnerMessage) + ",";
        body += "\"TestCaseId\": " + testCaseId;
        if (releaseId != null) {
            body += ", \"ReleaseId\": " + releaseId;
        }
        if (testSetId != null) {
            body += ", \"TestSetId\": " + testSetId;
        }
        body += "}";
        //send the request
        try {
            httpPost(url, body);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
    /**
     * Turns the date into the serialized format expected by Spira
     *
     * @param d the date to format
     * @return the date in serialized form, e.g. /Date(1585000000000-0000)/
     */
    private static String formatDate(Date d) {
        return "/Date(" + d.getTime() + "-0000)/";
    }

    /**
     * Sends a test run to Spira from the info in the given test run
     *
     * @param testRun the test run to record
     */
    public void recordTestRun(TestRun testRun) {
        Date now = new Date();
        recordTestRun(testRun.testCaseId, testRun.releaseId == -1 ? null : testRun.releaseId,
                testRun.testSetId == -1 ? null : testRun.testSetId, now, now, testRun.executionStatusId,
                "JUnit", testRun.testName, 1, testRun.message, testRun.stackTrace);
    }
}
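Note that TestRun in the overload above is assumed to be a simple POJO holding the fields being sent. To tie the whole flow together, a Cucumber @After hook along the following lines could report each scenario to Spira as it finishes. This is a hedged sketch, not a definitive implementation: the server URL, credentials, and project id are placeholders, the tag lookup uses the hypothetical SpiraTestCaseMap sketched earlier, and the hook assumes a recent Cucumber-JVM (older versions use cucumber.api.java.After and cucumber.api.Scenario instead):

import io.cucumber.java.After;
import io.cucumber.java.Scenario;

import java.util.Date;

// A sketch of a Cucumber hook that reports each scenario result to Spira.
// The URL, username, API key, and project id below are placeholders.
public class SpiraReportingHooks {
    private final SpiraTestExecute spira =
            new SpiraTestExecute("https://myserver/Spira", "fredbloggs",
                    "{XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX}", 1);

    @After
    public void reportToSpira(Scenario scenario) {
        for (String tag : scenario.getSourceTagNames()) {
            //Cucumber tags include the leading '@', so strip it before the lookup
            Integer testCaseId = SpiraTestCaseMap.testCaseIdForTag(tag.substring(1));
            if (testCaseId != null) {
                //ExecutionStatusId: Failed = 1, Passed = 2
                int statusId = scenario.isFailed() ? 1 : 2;
                Date now = new Date();
                spira.recordTestRun(testCaseId, null, null, now, now, statusId,
                        "Cucumber", scenario.getName(), 1, scenario.getStatus().toString(), "");
            }
        }
    }
}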