Cutting Costs and Managing Software Quality

March 14th, 2019 by Thea Maisuradze


The concept of quality has been discussed for decades. Numerous quality gurus have advanced simple frameworks that have stood the test of time and remain applicable today.

Quality is a Planned Event

One of the first definitions of quality lies in conformance to requirements, a construct advanced by Phil Crosby around 1979. Almost three decades earlier, Joseph Juran emphasized that the quality of any product or service is a function of how well it is fit for use! One must always remember that without a structured process for producing a product or service, quality can't be guaranteed. Even agile, which advances the notion of “individuals and interactions over processes and tools,” came up with guidelines on estimation, practices for games, a list of four specific ceremonies with suggested time limits, and somewhat prescriptive questions to maintain progress in the daily standup.


This concept of process guidance, so that we don't focus all our energy on too much upfront planning, was promoted by W. Edwards Deming with his Plan-Do-Check-Act (PDCA) cycle. Deming's PDCA, sometimes also called Plan-Do-Study-Act, goes back to the late 1950s. So the concepts of iterative and incremental cycles already existed in the middle of the last century, laying the foundation for the progressive elaboration that the Project Management Institute (PMI) has adopted since the first edition of the Project Management Body of Knowledge (PMBOK). A few years later, Kaoru Ishikawa, who is best known for the root cause analysis diagram also called the fishbone or Ishikawa diagram, suggested that management form a cross-functional committee responsible for promoting new ideas for building quality. As you can see, quality is not an afterthought but a planned event: cross-functional representation, with a focus on both product quality and process quality evaluated iteratively in small increments, was sown decades ago as the seed for building great products.


So, how does ISO 9000 define managing quality?

“The total amount of activities of the general management function which determine the policy in the field of quality, to implement the objectives and the responsibilities in the quality system by specific means, such as quality planning, quality control, quality assurance, and quality improvement.” Therefore, the quality management framework aims at achieving organizational objectives through continuous improvement, customer satisfaction, and innovation. So, the quality of any output or outcome delivered by a project is not focused on testing alone. On the contrary, “Project Quality is all about the application and evaluation of protocols, policies, and procedures to ensure validation and verification of project deliverables meeting the project requirements.”


Managing Quality

First, we should encourage a systems approach to value creation. In the last webinar, we discussed systems thinking, evaluating the customer value-add, business value-add, technical value-add, and process value-add. Such a holistic approach to value creation is critical in product development, product management, product marketing, project management, and program management alike.


Then, the project scope, or the state of the backlog, should represent the customer's voice of quality. From the “needs assessment” to the “solution evaluation,” quality should be represented both by conformance to requirements and by continued fitness for use. This means there is traceability from the commercial business requirements to the technical design specifications. So, when the boundary conditions around the design are challenged, the system should be robust enough to handle them or to shut down gracefully.
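
As a minimal sketch of what that robustness can look like in code (the order function and its limit below are hypothetical illustrations, not from the webinar), a design that validates its boundary conditions rejects bad input with a clear, traceable error instead of failing unpredictably downstream:

```python
class OrderError(ValueError):
    """Raised when an order violates a documented business requirement."""

MAX_QUANTITY = 1_000  # hypothetical boundary taken from the requirements

def place_order(quantity: int) -> str:
    """Accept an order only within the boundaries the requirements define.

    Challenging the boundary conditions yields a clear, traceable error
    (a graceful shutdown of the operation) rather than undefined behavior.
    """
    if not isinstance(quantity, int):
        raise OrderError(f"quantity must be an integer, got {type(quantity).__name__}")
    if not 1 <= quantity <= MAX_QUANTITY:
        raise OrderError(f"quantity must be between 1 and {MAX_QUANTITY}, got {quantity}")
    return f"order accepted for {quantity} units"

print(place_order(10))   # order accepted for 10 units
# place_order(5_000)     # raises OrderError, traceable back to the requirement
```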


Such a focus will not come from one group alone; it can only come from a cross-functional team. Unless the project, the product, and the organizational business environment are conducive to people actively engaging without the fear of failure, cross-functional knowledge sharing will not be realized. As people try experiments, or “spikes” as agile calls them, developing prototypes or proofs of concept, there should be a focus on both continuous improvement and operational excellence. When we discussed the four important ceremonies for delivering agile projects better, we covered the need for review and retrospective sessions where the focus is on both the product and the process.


Finally, quality should be proactive, and so quality management should advance innovative approaches to continuously doing more, better, faster, and cheaper. Our internal benchmarks should be constantly re-evaluated so that we become operationally better, raising the standard of quality excellence. When we do this, we can reach the quality guru's dream of quality being free and resulting in zero defects.


Engineering Practices

If the reason for poor quality is the linear waterfall thinking associated with plan-driven approaches to software development, then how much has agile really solved poor software quality? As synthesized from the annual State of Agile survey published in 2018 by VersionOne and illustrated below, agile adoption is credited with increased software quality by only 46% of respondents, even while it enables accelerated software delivery amid changing priorities. Within agile teams, productivity is still not greater, and alignment between business and IT is not at its highest.

[Figure: benefits of agile adoption, from the 2018 State of Agile survey]

Nevertheless, we should credit agile for putting the spotlight on the engineering practices required to build quality by design. The concepts of technical debt maintenance and refactoring, for instance, have long been part of agile approaches to software development; a small refactoring sketch follows below. One agile approach, eXtreme Programming (XP), even emphasized that continuous attention to technical excellence and good design allows the team to respond to change rapidly. However, even the State of Agile survey didn't report on these until the last two years. The engineering practices that contribute to good software development are: unit testing, coding standards, continuous integration, refactoring, continuous deployment, pair programming, test-driven development, automated acceptance testing, collective code ownership, sustainable pace, behavior-driven development, and emergent design.
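
As a minimal illustration of refactoring (the pricing code is hypothetical, not from the survey), the structure of the code improves while its behavior stays exactly the same:

```python
# Before: the tax rule is duplicated as a magic number, accumulating technical debt.
def price_with_tax(price):
    return price + price * 0.08

def total_with_tax(prices):
    return sum(p + p * 0.08 for p in prices)

# After: the same behavior, refactored so the tax rule lives in one place.
TAX_RATE = 0.08  # hypothetical rate, now a single named source of truth

def apply_tax(price: float) -> float:
    """Return the price with tax applied; the one place the tax rule is defined."""
    return price + price * TAX_RATE

def total_with_tax_refactored(prices: list[float]) -> float:
    return sum(apply_tax(p) for p in prices)

assert total_with_tax([10.0, 20.0]) == total_with_tax_refactored([10.0, 20.0])
```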

[Figure: engineering practice adoption trends, 2017 vs. 2018, from the State of Agile survey]

Looking at the trends between 2017 and 2018 in these graphs, you can see some patterns emerge. Ignoring trends that are almost flat, within a tolerance of plus or minus 2%, you can see that our focus seems to be on continuously integrating code frequently and taking the time to refactor it. Now, refactoring is an approach to make the code more maintainable; it does not necessarily have a pronounced effect on functionality or performance. Then, there is a shift toward test-driven development, where the code is written to pass the tests. I would like you to recall my reference to “testable” in the CURTAIN approach to requirements definition. Sustainable pace is more about the timeboxing approach, applied even in engineering design solutions.
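
As a minimal sketch of that test-first rhythm (the discount parser and its tests are hypothetical examples, not from the survey), the tests are written first as executable requirements, and the code is written only to make them pass:

```python
import unittest

def parse_discount(code: str) -> int:
    """Return the percentage encoded in a discount code like 'SAVE15'.

    Written after the tests below, and only to make them pass.
    """
    if not code.startswith("SAVE"):
        raise ValueError(f"unrecognized discount code: {code!r}")
    return int(code[len("SAVE"):])

class TestParseDiscount(unittest.TestCase):
    """These tests came first; they are the executable requirements."""

    def test_valid_code(self):
        self.assertEqual(parse_discount("SAVE15"), 15)

    def test_invalid_code_is_rejected(self):
        with self.assertRaises(ValueError):
            parse_discount("FREESHIP")

if __name__ == "__main__":
    unittest.main()
```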


While these are good trends, you can also see some challenges that do not contribute to quality and hence increase costs. For instance, the trend in unit testing is declining. Automation is not necessarily a natural substitute for unit testing. Additionally, we seem to be moving away from recommended best practices in coding standards: naming conventions for variables, classes, and functions; proper formatting of code for readability; proper and logical exception handling; and unit test cases that support test-driven development.
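
A minimal before-and-after sketch (hypothetical code) of what those coding standards buy us, contrasting cryptic naming and a silent bare except with descriptive names and explicit exception handling:

```python
import logging

logger = logging.getLogger(__name__)

# Non-compliant: cryptic name, bare except, failure silently swallowed.
def f(x):
    try:
        return 1 / x
    except:
        return None

# Compliant: descriptive name, specific exception, failure logged and surfaced.
def invert_rate(rate: float) -> float:
    """Return the reciprocal of a rate, rejecting zero explicitly."""
    try:
        return 1 / rate
    except ZeroDivisionError:
        logger.error("invert_rate called with rate=0")
        raise
```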


Moreover, another trend you can see is that although continuous integration is increasing, continuous deployment is not. Now, this could be because of the technical challenges in implementing the continuous-integration and continuous-deployment (often called CI/CD) pipeline across the development, test, and production environments. Or it could be the continued preference for tools that lack visibility into all of these areas within a single application lifecycle management tool. Whatever the reason, if we don't continuously deploy binaries to a production-like environment, the opportunity for integration and regression testing of all the modules is not realized. This is where automation truly helps; if we increase automated testing but do not increase continuous deployments, the threat of defects continues to thrive.
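
As a sketch of such a deployment gate (assuming a pytest-based test suite; the stage names and the echo deployment step are placeholders, not a real pipeline), a build is promoted to a production-like environment only after the automated suites pass:

```python
import subprocess
import sys

def run(step: str, command: list[str]) -> None:
    """Run one pipeline stage, aborting the pipeline on the first failure."""
    print(f"--- {step} ---")
    result = subprocess.run(command)
    if result.returncode != 0:
        sys.exit(f"{step} failed; stopping before deployment.")

if __name__ == "__main__":
    run("unit and integration tests", ["python", "-m", "pytest"])
    run("regression suite", ["python", "-m", "pytest", "tests/regression"])
    # Placeholder: replace with your real deployment command or tooling.
    run("deploy to staging", ["echo", "deploying build to production-like environment"])
```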

Finally, if we reduce or eliminate testers because we have automated testing, we also eliminate the opportunity for testers to act as another pair of eyes.


So, as you can see, accommodating change and accelerating time to market have taken priority over quality. When customers reject the product, requiring rework, extending the project duration, or requiring additional iterations to fix the defects, the costs of product development increase. How, then, are we promoting quality by design if we fail to accommodate all of these engineering best practices?


So, the more defects we allow, the more we allow costs to increase. This leads us to think about how much a defect costs. Needless to say, the earlier we detect a defect, the less we pay to correct it. For instance, if we detect an incorrect or ambiguous requirement early, it is easy to correct: only the use case or user story is modified. If we have already implemented a prototype before discovering that the analysis or design is wrong, we have increased the cost of the misunderstanding that allowed the defect. As the defect travels through the software lifecycle, its impact in the customer's hands is high: requirements, test cases, design, development, and deployment must all now be corrected. Not to mention the customer dissatisfaction or loss of trust!
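
A worked sketch of that escalation follows; the base cost and multipliers are purely illustrative assumptions, not figures from the webinar or any study:

```python
# Illustrative only: hypothetical fix-cost multipliers chosen to show the
# escalation pattern as a defect travels through the lifecycle.
PHASES = [
    ("requirements", 1),    # edit a user story
    ("design",       5),    # rework the design and the story
    ("development", 10),    # rework design, story, and code
    ("testing",     25),    # add failed test cycles and retesting
    ("production", 100),    # hotfix, redeployment, and lost customer trust
]

BASE_COST = 100  # hypothetical dollar cost of a fix at requirements time

for phase, multiplier in PHASES:
    print(f"Defect found in {phase:<12} costs about ${BASE_COST * multiplier:,}")
```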

Cost of Quality

These discussions lead us to the notion of the cost of quality: the total cost of all the efforts related to quality throughout the product life cycle. Now, what we often see is only the part of the iceberg above the water. These above-the-surface issues manifest as defects, scrap work, time spent on inspection, and so on. But what scares us is the size of the iceberg under the water! Time spent maintaining operations, workarounds devised to address design and product deficiencies, time lost in fire drills for production issues or field failures, tools used beyond their shelf life because of personal preferences, and work rushed without enough time to do a quality job!


So, what are the elements of this cost of quality? What are the steps to build the House of Quality and incorporate quality by design? Please view the webinar for more details.


For more information on these areas, please review the webinar recording on YouTube.



Dr. Sriram Rajagopalan is a project management guru with extensive software development and project management experience across many industries. Dr. Rajagopalan leads Inflectra's agile project management training course: Journey Into Agile With Inflectra - A Free Webinar Course.
