Seeing is believing.
Iteration Review

The Iteration Review is a cadence-based event in which each team inspects the increment at the end of every iteration to assess progress and then adjusts its backlog for the next iteration.
During the Iteration review, each Agile Team measures and then demonstrates its progress by showing working stories to the Product Owner and other stakeholders to get their feedback. Teams demonstrate the significant new behavior and knowledge gained from the iteration’s Stories, Spikes, Refactors, and Nonfunctional Requirements (NFRs). The preparation for the iteration review begins during Iteration Planning, where teams start thinking about how they will demo the stories to which they have committed. ‘Beginning with the end in mind’ facilitates iteration planning and alignment, fostering a more thorough understanding of the functionality needed, ahead of iteration execution.
The iteration review provides a way to gather immediate, contextual feedback from the team’s stakeholders on a regular cadence. It serves three important functions:
- It brings closure to the iteration timebox, in which many individuals have collaborated to provide new value to the business
- It allows team members to demonstrate the contributions they have made and to take some satisfaction and pride in their work
- It provides an opportunity for the team to receive feedback to improve the solution under development
The iteration review starts by going over the Iteration Goals and discussing their status. It then proceeds with a walk-through of all the committed stories. Each completed story is demoed as part of a working, tested system—preferably in a staging environment that closely resembles the production environment. Spikes are demonstrated via a presentation of findings. Stakeholders provide feedback on the stories that are demoed, which is the primary goal of the review process.
After the demo, the team reflects on which stories were not completed, if any, and why the team was unable to finish them. This discussion usually results in the discovery of impediments or risks, false assumptions, changing priorities, estimating inaccuracies, or over-commitment. These findings may lead to further study in the Iteration Retrospective and the identification of improvements to support better planning and execution in future iterations. Figure 1 shows an iteration review in action.
In addition to reflecting on how well it performed in this latest iteration, the team also determines how it’s progressing toward its Team Program Increment (PI) Objectives. It finishes the event by refining the Team Backlog, based on the feedback received, before the next iteration planning event.
Attendees at the iteration review include:
- The Agile team, which includes the Product Owner and the Scrum Master
- Stakeholders who want to see the team’s progress, which may also include other teams
Below are some tips for running a successful iteration review event:
- Limit preparation by team members to about one to two hours.
- Timebox the event to about one to two hours.
- Minimize the use of slides. The purpose of the iteration review is to get feedback on working software functionality, hardware components, etc.
- Verify completed stories meet the Definition of Done (DoD).
- Demo incomplete stories, too, if enough functionality is available to get feedback.
- If a significant stakeholder cannot attend, the Product Owner should follow up to report progress and get feedback.
- Encourage providing constructive feedback and celebration of the team’s accomplishments.
Teams that are practicing Continuous Delivery or Continuous Deployment should also do more frequent story or Feature reviews. Once functionality has reached the ready-for-deployment state, key stakeholders should review it.
Learn More

- Leffingwell, Dean. Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise. Addison-Wesley, 2011.
- Leffingwell, Dean. Scaling Software Agility: Best Practices for Large Enterprises. Addison-Wesley, 2007.
Last update: 10 February 2021