The definition of Waterfall

Waterfall is a software development methodology. It was the primary methodology used by software teams and companies until the mid-2000s, when it was largely displaced by Agile methodologies.

It is a sequential, linear approach to software development where each phase of the development process occurs in a specific order and is completed before moving on to the next phase.

The general process includes the following phases:

  1. Requirements gathering
  2. Design
  3. Implementation
  4. Testing
  5. Deployment
  6. Maintenance

As mentioned above, each phase is completed before the next begins. Once a phase is completed, it is typically not revisited without restarting the entire process: the rigid, linear flow builds heavy dependencies on the original requirements, so revisiting an earlier phase invalidates the work done downstream.
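The forward-only flow described above can be sketched as a small state machine. This is a hypothetical illustration, not a real library; the `Phase` and `WaterfallProject` names are made up for this example:

```python
from enum import IntEnum


class Phase(IntEnum):
    """The six waterfall phases, in their fixed order."""
    REQUIREMENTS = 1
    DESIGN = 2
    IMPLEMENTATION = 3
    TESTING = 4
    DEPLOYMENT = 5
    MAINTENANCE = 6


class WaterfallProject:
    """Hypothetical model of a strictly sequential waterfall process."""

    def __init__(self):
        # Every project starts at requirements gathering.
        self.phase = Phase.REQUIREMENTS

    def complete_phase(self):
        # Phases may only advance in order; there is nothing after Maintenance.
        if self.phase is Phase.MAINTENANCE:
            raise RuntimeError("Process finished; start a new project to revisit phases")
        self.phase = Phase(self.phase + 1)

    def revisit(self, earlier: Phase):
        # Going back is disallowed in this model, mirroring the methodology:
        # you cannot climb back up the waterfall.
        raise RuntimeError("Cannot revisit an earlier phase; restart the project")
```

Note that `revisit` always raises: in this simplified model, the only way back to an earlier phase is to construct a new project and start over.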

In the waterfall analogy, this would be like trying to climb back up the waterfall. A key assumption of waterfall is that the requirements and design will not change significantly during the development process. Thus, there is heavy focus on getting everything right in the early stages before costly mistakes are made.

Despite some obvious drawbacks, the waterfall model can still be useful in situations where requirements are well-defined and knowable upfront and the design is unlikely to change. A good example is operating system development. However, it is generally less effective, and can even be detrimental, in situations where requirements are not well understood, where there are many unknowns, or where changes are likely to occur during development, since waterfall is not inherently flexible or adaptable.