October 22nd, 2014 by Adam Sandman
To Report?
I don't argue against reports. In fact, they are one of the most valuable resources for judging the overall health of an organization's software delivery. For example…
- Are a significant number of issues coming from a particular module?
- Is one tester better at identifying grammatical issues?
- Who is better at finding things off the beaten path?
- Where do we need to invest time?
- Are there areas that are being over-tested vs. under-tested?
There are many more. All of these are good reasons to look at, and perform, reporting. But the amount of time it takes to review testing reports and correlate them can be especially arduous.
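Take the first question above: spotting whether a significant number of issues come from a particular module is just an aggregation over issue records. A minimal sketch in Python, assuming a hypothetical list-of-dicts export (real test-management tools will expose the data differently):

```python
from collections import Counter

def issues_by_module(issues):
    """Count reported issues per module to spot defect hotspots.

    `issues` is a hypothetical list of dicts with a "module" key;
    the actual field names depend on the tool exporting the data.
    """
    return Counter(issue["module"] for issue in issues)

# Example data: most issues cluster in one module.
issues = [
    {"id": 1, "module": "checkout"},
    {"id": 2, "module": "checkout"},
    {"id": 3, "module": "checkout"},
    {"id": 4, "module": "search"},
]
counts = issues_by_module(issues)
print(counts.most_common(1))  # the hotspot module and its issue count
```

The point is not the code itself but that every question in the list is this kind of correlation, and doing it by hand across many reports is where the time goes.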
To Not Report?
Time and effort are not the only arguments against reporting, for example:
- There may be a tendency to test to the report metrics
- Those reading the reports may not understand what they are analyzing
- Reactionary responses to negative metrics can inhibit honest reporting
The arguments against reports can mostly be dealt with by training and by ensuring those viewing them have the appropriate frame of reference. So beyond time, the cost is just training.
In Conclusion
If reports are not a bad thing, but the time suck they create is, there has to be a better way. After all, if we can find issues when only testing 30% of the time, imagine what we could do if that ratio were flipped and we could spend 70% of our time testing instead of creating reports.
I would propose that you get all your reports in a complete management solution, such as SpiraTest, and let the reporting and documentation take care of itself.
Now you have flipped the ratio.