Seeing is believing.
The Iteration Review is a cadence-based event where each team inspects the increment at the end of every iteration to assess progress, and then adjusts its backlog for the next iteration.
The purpose of the Iteration Review is for each Agile Team to measure and then demonstrate its progress by showing working stories to the Product Owner (PO) and other stakeholders to get their feedback. Teams demonstrate every new Story, Spike, Refactor, and Nonfunctional Requirement (NFR). Preparation for the review begins during Iteration Planning, where teams start thinking about how they will demo the stories they’re committing to. “Beginning with the end in mind” facilitates iteration planning and alignment, fostering a more thorough understanding of the functionality needed before iteration execution.
The importance of the Iteration Review cannot be overstated: it provides the only way to gather immediate, contextual feedback from the team’s stakeholders. The Iteration Review serves three important functions:
- It brings closure to the Iteration timebox, to which many individuals have contributed to provide new value to the business
- It gives teams an opportunity to show the contributions they have made to the business, and to take some satisfaction and pride in their work and progress
- It allows stakeholders to see working stories and provide feedback
The Iteration Review starts by going over the Iteration Goals and discussing their status. It then proceeds with a walk-through of all the committed stories. Each completed story is demoed in a working, tested system, preferably in a staging environment that closely resembles production. Spikes are demonstrated via a presentation of findings. Stakeholders provide feedback on the demoed stories, which is the main goal of the review.
After the demo, the team reflects on which stories were not completed, if any, and why the team was unable to finish them. This discussion usually results in the discovery of impediments or risks, false assumptions, changing priorities, estimating inaccuracies, or over-commitment. These findings may result in further discussion in the Iteration Retrospective about how the next iterations can be better planned and executed. Figure 1 illustrates an iteration review in action.
In addition to showing what it accomplished in the latest iteration, the team also reflects on how it’s progressing toward its PI Objectives. It finishes the event by refining the Team Backlog before the next iteration planning.
Attendees at the Iteration Review include:
- The Agile Team, which includes the Product Owner and the Scrum Master.
- Stakeholders who want to see the team’s progress, such as members of other teams.
Although ART stakeholders may attend, their interests and level of detail are usually better aligned with the System Demo.
Below are some tips for a successful Iteration Review:
- Limit demo preparation by team members to about one to two hours.
- Timebox the meeting to about one to two hours.
- Minimize the use of PowerPoint slides. The purpose of the Iteration Review is to get feedback on working software functionality, hardware components, etc.
- Demo incomplete stories, too, if enough functionality is available to get feedback.
- Verify that completed stories meet the Definition of Done (DoD).
- If a major stakeholder cannot attend, the Product Owner should follow up to report progress and get feedback.
- Make sure feedback is provided in a constructive way, and that the team celebrates its accomplishments.
Teams practicing continuous delivery or continuous deployment should also hold more frequent story or feature reviews. Once functionality has reached the ready-for-deployment state, it should be reviewed with the key stakeholders most interested in it.
Last update: 8 June 2017