August 22nd, 2016 by Inflectra
In the first article in this two-part series we discussed some of the reasons why it might not be desirable to spend the time to write formal test cases. This might seem like heresy coming from a company that sells a test management system! However, to recap from last time, there are some good reasons.
Please read last week's post if you want to get more background on these!
Now in this week's article we're going to describe the session-based testing approach we took when testing SpiraTeam 5.0.
As described in this great article on session-based testing, the idea is to take the concept of exploratory testing, where testers 'explore' the application, following their intuition, logging bugs and finding problems, and apply some structure, without the straitjacket of scripted testing. You take an area of the system or an area of functionality and break it down into sessions of approximately 45 minutes to 1 hour, each covering a specific high-level objective, and give the tester the freedom to follow an unstructured path to find the problems and issues lurking in that part of the system.
These session metrics are the primary means to express the status of the exploratory test process:
- Number of sessions completed
- Number of problems found
- Function areas covered
- Percentage of session time spent setting up for testing
- Percentage of session time spent testing
- Percentage of session time spent investigating problems
The key is that instead of measuring the number of test cases, test steps and % completion, you are focusing on the number of high-level sessions and whether they turn up actual problems. If you have sessions that result in few problems, you may need to refocus on different objectives or parts of the system.
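To make these metrics concrete, here is a minimal sketch of how they might be computed from a list of completed sessions. The Session structure and its field names are our own invention for illustration; in practice SpiraTest's reports give you these numbers directly.

```python
# Sketch only: computing the core session metrics from completed sessions.
# The Session fields are invented for this example.
from dataclasses import dataclass

@dataclass
class Session:
    area: str       # functional area the session covered
    problems: int   # problems (potential defects) observed
    minutes: int    # session length

sessions = [
    Session("Drag and drop", problems=5, minutes=60),
    Session("Ordered grid", problems=2, minutes=45),
    Session("Release planning", problems=0, minutes=50),
]

completed = len(sessions)
total_problems = sum(s.problems for s in sessions)
areas_covered = {s.area for s in sessions}

print(f"Sessions completed:     {completed}")
print(f"Problems found:         {total_problems}")
print(f"Function areas covered: {len(areas_covered)}")
# A low problems-per-session ratio is the signal to refocus elsewhere.
print(f"Problems per session:   {total_problems / completed:.1f}")
```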
Normally when you use SpiraTest to do traditional scripted testing, you would have a business analyst create test cases with discrete test steps:
In this example, we have described each of the steps a tester would need to take, if they had never used the system before, to achieve a very specific action (e.g. to create a release). However, if all of the testers are familiar with the application and have used it before, it takes longer to write this test than to execute it. More importantly, it misses all the nuances and edge cases that may lurk outside the scenario described in the script. The best testers find the issues that the business analyst or developer cannot even dream of!
However, it turns out that SpiraTest is also a very good session-based and exploratory testing tool. With its ability to let a tester easily write down their observations, embed images into their descriptions and tie what they did back to the functional areas covered (i.e. the requirements), SpiraTest gives testers the freedom to explore, along with a powerful way to record their observations.
What we did with the testing of SpiraTeam 5.0 was to brainstorm a list of all the things that had changed and all of the areas that the developers felt were high risk, complex or "might contain a can of worms". We then wrote these into SpiraTeam as test cases:
Some of these were defined upfront; others were specified later based on the results of earlier tests. For example, one of the tests was to verify that the new drag and drop worked. When it was found that one type of grid (the ordered grid) wasn't behaving the same as the others, a new test case was written with the objective of focusing specifically on that grid.
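If you want to script this setup rather than enter the charters by hand, SpiraTest also exposes a REST API. The sketch below assumes the v5.0 REST endpoint layout, query-string authentication and field names (the base URL, credentials and TestCaseId field are placeholders, not verified values); check the API documentation for your installation before relying on it.

```python
# Sketch only: creating a session charter as a SpiraTest test case over REST.
# The base URL, credentials and field names below are assumptions/placeholders;
# consult the SpiraTest v5.0 API documentation for the exact contract.
import requests

BASE = "https://myserver/SpiraTest/Services/v5_0/RestService.svc"  # assumed path
AUTH = {"username": "testerlogin", "api-key": "{API-KEY-GUID}"}    # placeholders
PROJECT_ID = 1

charter = {
    "Name": "Session: drag and drop in the ordered grid",
    "Description": ("Focus on the ordered grid, which behaved differently "
                    "from the other grids in the earlier session."),
}

resp = requests.post(
    f"{BASE}/projects/{PROJECT_ID}/test-cases",
    params=AUTH,                            # authentication via query string
    json=charter,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
print("Created test case:", resp.json().get("TestCaseId"))  # field name assumed
```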
Now, just as important is how we write the "test steps" inside these session/exploratory "test cases":
In this example (which was written upfront during the design and development of that feature in the appropriate sprint), the steps are just an enumeration of the different cases that needed to be included when verifying the 'drag and drop' functionality. The test didn't specify exactly what needed to be tested, just that drag and drop in the listed modules had changed and could have lots of lurking issues.
In this second example, based on her experiences in the earlier testing activities, the tester wrote the steps almost as a checklist of things she wanted to dive deeper into, based on where she felt the problems would lie. This specific test was actually written down during the running of a previous one. That is the great thing about this approach: you can run one test and write the next one at the same time.
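The contrast between the two styles is easiest to see as data. Here is an illustrative sketch (all names invented) of a scripted test versus a session charter: the first prescribes actions and expected results, the second just lists areas to probe.

```python
# Illustration only (all names invented): a scripted test prescribes each
# action and expected result; a session charter just lists areas to probe.

scripted_test = {
    "name": "Create a new release",
    "steps": [  # (action, expected result) -- the tester follows these exactly
        ("Go to Planning > Releases", "Release list page is displayed"),
        ("Click 'New Release'", "New release dialog opens"),
        ("Enter a name and dates, click Save", "Release appears in the list"),
    ],
}

session_charter = {
    "name": "Session: drag and drop",
    "checklist": [  # how to probe each item is left to the tester
        "Reordering rows in the ordered grid",
        "Dragging across pages of a long list",
        "Dropping onto a collapsed folder",
        "Undo after an accidental drop",
    ],
}

for item in session_charter["checklist"]:
    print("explore:", item)
```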
Now that we have these tests defined, it's time to run them.
Unlike in scripted testing, with exploratory / session-based testing the tester doesn't need to follow a specific script or set of actions. Instead, she uses the test case as a high-level set of objectives and a checklist of areas that need particular attention.
During the testing, the tester simply runs the test using the test execution wizard. However, unlike traditional testing, where the tester records a simple Pass or Fail and a short description of what failed, with exploratory testing she writes a more detailed account of what she's seeing, using SpiraTest's ability to embed images quickly and easily into the result.
As you can see in the example test run, the result looks a lot like a set of individual defect reports joined together in a sequence. In this example, she found a bunch of issues with the drag and drop functionality. At this point we didn't want to log 5 separate defects because the system wasn't yet ready for that; it was much easier to keep track of this one testing artifact than 5 disconnected defects. At this stage of the sprint the functionality wasn't finished yet, so it was very helpful for the developers to get this early look at what issues were lurking.
Once the developers had finished the drag and drop functionality in the sprint and had made sure they had addressed all the identified problems in the test run, our tester could re-execute the same session test. Unlike pure exploratory testing, the structure of session-based testing allows you to reuse a previous session as a regression test.
When the tester executed the same test a second time later in the sprint, the test run looked quite different:
In this case the functionality pretty much worked correctly: the tester couldn't find any issues in most browsers, even after spending considerable time testing the various features and trying all kinds of extreme cases. The exception was IE9, which had some issues. In a traditional test we'd have needed to create a master test case with parameters and pass in the browser names for each combination. With session-based testing we can simply add the list of browsers to the test case description, or just write them into the release description (this release needs to support IE9+, Firefox, Chrome, etc.), and the tester knows to test all the options. This lets us spend more time testing and less time documenting.
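One way to think about re-running a session is as a simple diff of the problems observed in each run. Here is a toy sketch (the issue names are invented, loosely following the two runs described above):

```python
# Toy sketch (issue names invented): comparing two runs of the same session.
first_run = {
    "ghost row left behind after drop",
    "wrong insert position in the ordered grid",
    "scroll jump while dragging in IE9",
    "folder won't expand on hover",
    "undo leaves a duplicate row",
}
second_run = {
    "scroll jump while dragging in IE9",  # the one remaining IE9 issue
}

print("Fixed since last run:", sorted(first_run - second_run))
print("New issues this run: ", sorted(second_run - first_run) or "none")
```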
Finally, one of the great benefits of using session-based testing rather than pure exploratory testing is that you can still tie the tests back to the requirements and features being tested, so you still get test coverage metrics and know which parts of the system carry the most risk:
Furthermore, one of the metrics for session-based testing we mentioned at the beginning was the number of function areas covered. When you use SpiraTest to manage your requirements and perform session-based testing, the requirements coverage tracking gives you this metric out of the box.
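That "function areas covered" metric is essentially the fraction of requirements touched by at least one session. As a back-of-the-envelope illustration (the requirements and mapping below are invented), it can be computed like this:

```python
# Back-of-the-envelope illustration (requirements and mapping invented):
# "function areas covered" as the fraction of requirements touched by a session.
requirements = [
    "REQ-1 Drag and drop",
    "REQ-2 Release planning",
    "REQ-3 Ordered grid",
    "REQ-4 Reporting",
]

# requirement -> session test cases that cover it
coverage = {
    "REQ-1 Drag and drop": ["Session: drag and drop"],
    "REQ-3 Ordered grid": ["Session: drag and drop in the ordered grid"],
}

covered = [r for r in requirements if coverage.get(r)]
pct = 100 * len(covered) / len(requirements)
print(f"Function areas covered: {len(covered)}/{len(requirements)} ({pct:.0f}%)")
for r in requirements:
    print(f"  {r}: {'covered' if coverage.get(r) else 'NOT covered'}")
```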
As always in software development, it is important to use the right tool for the right job. As part of a comprehensive test strategy, you should ideally have unit testing, functional testing and performance testing designed and built into your process. However, an important tool that is often overlooked is exploratory, free-form testing. In the past it was hard to measure, difficult to reproduce and poorly communicated. With SpiraTest and a session-based testing approach, you can apply a minimum amount of structure to free-form testing and get significant rewards. With session-based testing you write your tests as they are performed, you build up a library of experience and future testing ideas, you have the ability to do regression testing, and, for the team leadership, you have clear objectives, metrics and results that can be evaluated.