Archive for June 2021

Webinar Recap: Rapise 7 - What's Covered in the New Release

June 30th, 2021

On June 29th, Inflectra hosted a webinar titled Rapise 7.0 Is Here. Denis Markovtsev, Inflectra's Principal Software Developer, covered the following topics in the live webinar:

  1. New Spira Dashboard

  2. Support for flaky tests and an automatic rerun of failed tests

  3. Soft assertions and attaching files to test runs

If you missed the webinar, we have a full recap below 👇

Read More

Spotlight on Rapise 7.0 - New Flaky Testing Support

June 29th, 2021

Two of the major features in our most recent release of Rapise are the completely overhauled Spira Dashboard and the new support for handling the bane of test automation engineers everywhere - flaky tests! In this article we shed some light on how Rapise 7.0 can make flaky tests a thing of the past.

Read More

#TestBowl 2.0 - It's A Wrap, Folks

June 28th, 2021

On June 24, Inflectra hosted its second virtual software testing competition of 2021 - #TestBowl 2.0. If you missed this fun event, here is a very quick recap!

Read More

Spotlight on Rapise 7.0 - New Spira Dashboard

June 28th, 2021

Two of the major features in our most recent release of Rapise are the completely overhauled Spira Dashboard and the new support for handling the bane of test automation engineers everywhere - flaky tests! In this article we shed some light on the new dashboard in Rapise 7.0.

Read More

Crowd-Test Your App or Website at #TestBowl 2.0

June 24th, 2021

Do you have a new app, software, or website that is ready for testing? Are you looking to avoid turning your customers into testers? We hear you, and we have a fun solution for you - bring your app/software or website to #TestBowl 2.0!

As part of #TestBowl 2.0, the Inflectra-run software testing competition on June 24, 2021, we are offering to crowd-test your software for free!

All you need to do is apply before June 18, 2021!

Read More

The Perils of Agile Estimation (Part 2)

June 7th, 2021

In the previous post, we examined why and how agile software estimation goes wrong so often, and we identified three principles that must support any good estimation method:

  1. Reliability - An estimate must reflect the actual work performed, within a reasonable confidence interval.
  2. Objectivity - The same estimate should be given for a task, regardless of who is estimating it.
  3. Consistency - Given that nothing else has changed, an estimate should not change over time.

I call these the ROC principles. In this post, we'll start looking at techniques and methods we can apply to our estimation process to implement these principles. In order to properly ROC-ify our estimates, the first thing we need to grok is the difference between complex and complicated tasks.

Read More