Is Agile Development Better than Waterfall Development Overall?

June 12th, 2019 by inflectra


A user on Quora recently asked the potentially incendiary question: "Is agile development better than waterfall?" Rather than getting into a religious war about software development methodologies (almost as contentious as deciding on coding standards in C++: curly braces on the same line or the next line? anyone?), I thought this question would be a good place to discuss some of the benefits of each approach (agile vs. waterfall) and ways to choose which parts of each methodology make sense for you and your project.

Why Did Waterfall Become Prevalent?

We have to remember the evolution of the software development industry. In the 1970s, computers were large mainframes: you had to rent time on the computer, encode your software onto punched cards, and feed them in for batch processing. A scary thought is that some of the systems we rely on today (banks, airline reservation systems) are, at their core, still based on such systems. Why else would credit card transactions get batched up and then captured for payment at 8:00pm each day? Behind every whiz-bang web service and REST API there is often a 1970s COBOL program with more patches, steps, and workarounds than a Rube Goldberg device... anyway, I digress!

[Creative Commons: https://www.flickr.com/photos/jurvetson/10438860]

In this environment, the cost to make any changes was prohibitive. You wrote the requirements, designed the software specifications, wrote the code, sent it for processing, and tested the results to make sure it worked as expected. You repeated the testing and fixing until the results were correct. Then you repeated the same steps on the production environment with live data. Hence the waterfall methodology was born.

If you wanted to change things (for example handle a new type of bank account), you had to write up a change request, meet with the requirements analysis team, submit the design for review, code the changes, test on the development environment, and finally (after much fixing and retesting) go live.

This was all based on some assumptions that were true at the time:

  • It is cheaper and faster to catch a potential issue when defining the requirements. If you could get the requirements "right" and think of all the potential side effects, constraints, and edge cases, you could avoid having to redo work later.
  • It is cheaper to catch and fix an issue in design than in development. So you had lots of design reviews, architecture sessions, UML tools, models, and risk analysis to find all potential flaws. This tended to create 'over-design'.
  • It is cheaper to prevent bugs in development than to fix them in test. So you had lots of code reviews, you wrote very modular, extensible code, and you used OO frameworks and other ways to encourage reuse and prevent bugs. Of course, this added a lot of code that might never be used or needed.
  • It is cheaper to fix bugs in testing than after they have gone live into production. You could not rapidly deploy in this environment. Code changes would take weeks to put into production, so a bug fix in production was expensive (and disruptive).

As technology improved in the late 1990s, these assumptions were no longer valid (faster compilers, concurrent source control tools like CVS, continuous integration tools like CruiseControl, automated unit test frameworks), but the methodologies in use (RUP, V-model) were still holdovers from the past.

Things have changed a lot since then: the cost of making changes is now much lower than the cost of building features that no one needs or that don't meet the real needs of users (as opposed to what they think they want). That being said, there are benefits to some upfront analysis and requirements gathering, and in many industries you cannot simply "fail fast" (imagine doing that with a bank or an airline flight-safety system), so there are things we can learn from both approaches.

The Agile Revolution

With the publication of the Agile Manifesto and the adoption of agile methodologies such as Extreme Programming, Scrum, and Kanban (and now scaled frameworks), the methodologies finally caught up with the technology. When you also throw in DevOps, and the ability to take a new requirement (aka user story), have it designed, developed, tested, integrated, and deployed into production many times each day (a minimal pipeline sketch follows the list below), you can see how far we have come, and why agile is now the dominant methodology. However, to be able to use such an approach, there is a new set of assumptions that need to be true:

  • You can actually make a change to the system after release - this may sound obvious, but agile assumes that changes to the system in production are possible and not cost-prohibitive. For example, if you are writing software for a satellite launch, where you won't be able to easily change the code after it is "live", this may not be a valid assumption. Similarly, software being burnt onto a ROM and shipped as an embedded system cannot be changed as easily as a web-based consumer application.
  • There aren't external costs or constraints to changing the design - if you are working in life sciences, you have to validate all changes, with regulatory signoff for each one. That doesn't mean that agile is not a valid approach, but it does mean that you may need to deploy into a 'proxy-production' environment and adjust what 'agile' means.
  • You have users who are willing to provide feedback and guide the design - agile assumes that you have engaged users who care about the system, can look at a minimum viable product (MVP), and give useful feedback so that it can evolve into a better product. Imagine you are building a piece of software to meet the needs of a legal document and no users are allowed to suggest changes. That sadly creates a sub-optimal system, but sometimes it is the reality.
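To make the DevOps loop mentioned above a little more concrete, here is a minimal, hypothetical sketch (in Python, purely for illustration) of the kind of automated build-test-deploy flow that lets a user story reach production many times a day. The pytest command, the build step, and the deploy.sh script are assumptions, not any particular product's pipeline; in practice a CI server such as Jenkins would drive these stages on every commit rather than a hand-rolled script.

```python
# Hypothetical sketch of an automated build-test-deploy loop.
# The commands below (pytest, python -m build, ./deploy.sh) are assumptions
# for illustration only; a CI server (e.g. Jenkins) would normally run these
# stages automatically on every commit.
import subprocess
import sys

STAGES = [
    ("unit tests", ["pytest", "-q"]),              # fail fast on broken code
    ("build package", ["python", "-m", "build"]),  # produce a deployable artifact
    ("deploy", ["./deploy.sh", "production"]),     # hypothetical deployment script
]

def run_pipeline() -> None:
    for name, command in STAGES:
        print(f"--- {name} ---")
        if subprocess.run(command).returncode != 0:
            sys.exit(f"Stage '{name}' failed - stopping the pipeline.")
    print("All stages passed - change is live.")

if __name__ == "__main__":
    run_pipeline()
```

The key point is not the script itself but the fact that every stage is automated, so a failing test stops a bad change long before it reaches users.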

That being said, even if your situation precludes fully embracing agile, there are practices that are good software engineering regardless:

  • Test as much as you can, with an emphasis on automated testing at the unit, module, and system levels (see the sketch after this list).
  • Ensure that you develop early, experiment, and test as you go. Don't rely solely on paper designs that have never been tried out.
  • Have your testers act as surrogate users, and have them provide feedback.
  • Use modern source code management (e.g. Git) and CI tools (e.g. Jenkins).
  • Avoid paper documentation (and we're including you, Excel and Word, in that list - we know who you are); use a modern tool such as SpiraTeam for your planning.
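As an illustration of the first point, here is a minimal sketch of an automated unit test written in Python with the standard unittest module. The calculate_interest function is a hypothetical example, not part of any product mentioned here; the value comes from running tests like this automatically on every change (for example, from a CI tool such as Jenkins), not from the test itself.

```python
# Minimal sketch of an automated unit test using Python's standard unittest
# module. calculate_interest is a hypothetical function used only to
# illustrate the idea; a CI tool would run tests like this on every commit.
import unittest

def calculate_interest(balance: float, annual_rate: float) -> float:
    """Hypothetical example: simple interest accrued over one year."""
    if balance < 0:
        raise ValueError("balance must be non-negative")
    return balance * annual_rate

class CalculateInterestTest(unittest.TestCase):
    def test_typical_balance(self):
        self.assertAlmostEqual(calculate_interest(1000.0, 0.05), 50.0)

    def test_zero_balance_accrues_nothing(self):
        self.assertEqual(calculate_interest(0.0, 0.05), 0.0)

    def test_negative_balance_is_rejected(self):
        with self.assertRaises(ValueError):
            calculate_interest(-100.0, 0.05)

if __name__ == "__main__":
    unittest.main()
```

Run it with `python -m unittest` locally or as part of whatever CI pipeline you use.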

Spira Helps You Deliver Quality Software, Faster and with Lower Risk.

Get Started with Spira for Free

And if you have any questions, please email us or call us at +1 (202) 558-6885
