What if we found ourselves building something that nobody wanted? In that case, what did it matter if we did it on time and on budget?
Lean User Experience (Lean UX) design is a mindset, a culture, and a process that embraces Lean-Agile methods. It implements functionality in minimum viable increments and determines success by measuring results against a benefit hypothesis.
Lean UX design extends the traditional UX role beyond merely executing design elements and anticipating how users might interact with a system. Instead, it encourages a far more comprehensive view of why a Feature exists, the functionality required to implement it, and the benefits it delivers. By getting immediate feedback to understand if the system will meet the real business objectives, Lean UX provides a closed-loop system for defining and measuring value.
Generally, UX represents a user’s perceptions of a system—ease of use, utility, and the effectiveness of the user interface (UI). UX design focuses on building systems that demonstrate a deep understanding of end users. It takes into account what users need and want while making allowances for the user’s context and limitations.
A common problem with Agile methods is how best to incorporate UX design into a rapid Iteration cycle that results in a full-stack implementation of new functionality. When teams attempt to resolve complex, seemingly subjective user interactions while simultaneously developing incremental deliverables, they often churn through many designs, which can become a source of frustration with Agile.
Fortunately, the Lean UX movement addresses this by combining Agile development with Lean Startup implementation approaches. The mindset, principles, and practices of SAFe reflect this thinking. This process often begins with the Lean Startup Cycle described in the Epic article and continues with the development of Features and Capabilities using the Lean UX process described here.
As a result, Agile teams and Agile Release Trains (ARTs) can leverage a common strategy to generate rapid development, fast feedback, and a holistic user experience that delights users.
The Lean UX Process
In Lean UX, Gothelf and Seiden describe a model that we have adapted to our context, as Figure 1 illustrates. It follows SAFe’s Continuous Delivery Pipeline but focuses more on team-level activities.
The Lean UX approach starts with a benefit hypothesis: Agile teams and UX designers accept the reality that the ‘right answer’ is unknowable up-front. Instead, teams apply Agile methods to avoid Big Design Up-front (BDUF), focusing on creating a hypothesis about the feature’s expected business result, and then they implement and test that hypothesis incrementally. The hypothesis statement typically includes:
- Feature – A short phrase giving a name and context
- Benefit hypothesis – The proposed measurable benefit to the end user or business
Outcomes are measured during Release on Demand, ideally using leading indicators (see Innovation Accounting) to evaluate how well the new feature meets its benefit hypothesis. For example, “We believe the administrator can add a new user in half the time it took before.”
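A leading-indicator check for that example hypothesis might look like the following sketch; the task durations, sample data, and the 50% target are hypothetical, and real telemetry would feed the lists:

```python
from statistics import mean

def hypothesis_holds(baseline_times, new_times, target_ratio=0.5):
    """Check the (hypothetical) benefit hypothesis:
    'adding a user takes half the time it took before'."""
    return mean(new_times) <= target_ratio * mean(baseline_times)

# Admin task durations in seconds, as might be gathered from telemetry
before = [120, 150, 130, 140]
after = [60, 70, 55, 65]
print(hypothesis_holds(before, after))  # True: mean time was at least halved
```

Encoding the target as an executable check keeps the hypothesis measurable rather than aspirational, and the same function can run against fresh telemetry after each release.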
Traditionally, UX design has been an area of specialization. People who have an eye for design, a feel for user interaction, and specialty training were often entirely in charge of the design process. The goal was ‘pixel perfect’ early designs, done in advance of the implementation. Usually, this work was done in silos, apart from the very people who knew the most about the system and its context. Success was measured by how well the implemented user interface complied with the initial UX design. In Lean UX, this changes dramatically:
“Lean UX literally has no time for heroes. The entire concept of design as a hypothesis immediately dethrones notions of heroism; as a designer, you must expect that many of your ideas will fail in testing. Heroes don’t admit failure. But Lean UX designers embrace it as part of the process.” 
Continuous Exploration takes the hypothesis and facilitates a continuous and collaborative process that solicits input from a diverse group of stakeholders – Architects, Customers, Business Owners, Product Owners, and Agile Teams. This further refines the problem and creates artifacts that clearly express the emerging understanding including personas, empathy maps, and customer experience maps.
Principle #9 – Decentralize decision-making provides additional guidance for the Lean UX process: Agile teams are empowered to do collaborative UX design and implementation, which significantly improves business outcomes and time-to-market. Another important goal, however, is to deliver a consistent user experience across various system elements or channels (e.g., mobile, web, kiosk), or even across different products from the same company. Making this consistency a reality requires some centralized control (an exception that Principle #9 itself allows) over certain reusable design assets. A design system is a set of standards that contains whatever UI elements the teams find useful, including:
- Editorial rules, style guides, voice and tone guidelines, naming conventions, standard terms, and abbreviations
- Branding and corporate identity kits, color palettes, usage guidelines for copyrights, logos, trademarks, and other attributions
- UI asset libraries, which include icons and other images, templates, standard layouts, and grids
- UI widgets, which include the design of buttons and other similar elements
These assets are an integral part of the Architectural Runway, which supports decentralized control while recognizing that some design elements need to be centralized. After all, these decisions are infrequent and long-lasting, and they provide significant economies of scale, as described in Principle #9, Decentralize decision-making.
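One way such centralized assets might be shared is as a versioned token module that every team imports. A minimal sketch, with illustrative names and values (real design systems typically publish tokens in a platform-neutral format):

```python
# Hypothetical centralized design tokens, shared across teams so that
# decentralized feature work still yields a consistent user experience.
DESIGN_TOKENS = {
    "color": {
        "primary": "#0057B8",   # brand palette (illustrative values)
        "error": "#D32F2F",
    },
    "spacing": {"sm": 4, "md": 8, "lg": 16},  # grid units in pixels
    "font": {"body": "Inter, sans-serif"},
}

def token(path: str):
    """Resolve a dotted path like 'color.primary' into a token value."""
    node = DESIGN_TOKENS
    for key in path.split("."):
        node = node[key]
    return node

print(token("color.primary"))  # "#0057B8"
```

Because the tokens live in one place, a palette or spacing change propagates to every team’s UI without each team re-deciding the value locally.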
With a hypothesis and design in place, teams can proceed to implement the functionality in a Minimum Marketable Feature (MMF). The MMF should be the minimum functionality that the teams can build to learn whether the benefit hypothesis is valid or not. By doing this, the ARTs apply SAFe Principle #4 – Build incrementally with fast, integrated learning cycles, to implement and evaluate the feature. Teams may choose to preserve options with Set-Based Design, as they define the initial MMF.
In some cases, early designs can be extremely lightweight and not even functional (e.g., paper prototypes, low-fidelity mockups, simulations, API stubs). In other cases, a vertical thread (full stack) of just a portion of an MMF may be necessary to test the architecture and get fast feedback at a System Demo. In still other instances, functionality may need to proceed all the way through deployment and release, where application instrumentation and telemetry provide feedback data from production users.
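As a sketch of the lightest-weight option above, an API stub (the function name and payload shape here are hypothetical) can return canned data so a partial MMF can be demoed and tested before the real service exists:

```python
# A minimal API stub: returns canned data shaped like the eventual
# production response, so front-end work and demos can proceed early.
def get_user_report_stub(user_id: int) -> dict:
    """Stand-in for a real reporting service that does not exist yet."""
    return {
        "user_id": user_id,
        "report": "sample",
        "generated_by": "stub",  # flag so demo data is never mistaken for real
    }

print(get_user_report_stub(42)["generated_by"])  # stub
```

When the real service arrives, callers swap the stub for the production client without changing the surrounding code, which keeps the feedback loop short during early iterations.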
MMFs are evaluated as part of deploying and releasing (where necessary). There are a variety of ways to determine if the feature delivers the right outcomes. These include:
- Observation – Wherever possible, directly observe actual usage of the system; it’s an opportunity to understand the user’s context and behaviors.
- User surveys – When direct observation isn’t possible, a simple end-user questionnaire can obtain fast feedback.
- Usage analytics – Lean-Agile teams build analytics right into their applications, which helps validate initial use and provides the application telemetry needed to support a Continuous Delivery model. Application telemetry offers constant operational and user feedback from the deployed system.
- A/B testing – A form of statistical hypothesis testing that compares two samples, acknowledging that user preferences are unknowable in advance. Recognizing this is truly liberating, as it eliminates endless arguments between designers and developers—who likely won’t use the system anyway. Teams follow Principle #3 – Assume variability; preserve options to keep design options open as long as possible. Wherever it’s practical and economically feasible, they should implement multiple alternatives for critical user activities and test those options with mockups, prototypes, or even full-stack implementations. In the latter case, differing versions may be deployed to multiple subsets of users, perhaps sequenced over time and measured via analytics.
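The A/B comparison above can be sketched as a two-proportion z-test using only the standard library; the conversion counts are illustrative, not from any real experiment:

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: do variants A and B differ in conversion
    rate? Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 120/1000 users completed the task with design A, 160/1000 with design B
z, p = ab_test(120, 1000, 160, 1000)
print(f"z={z:.2f}, p={p:.4f}")  # p < 0.05 suggests a real difference
```

In practice teams often rely on an analytics platform for this arithmetic, but the point stands: the decision between designs rests on measured user behavior, not opinion.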
In short, measurable results deliver the knowledge teams need to refactor, adjust, redesign—or even pivot to abandon a feature, based solely on objective data and user feedback. Measurement creates a closed-loop Lean UX process that iterates toward a successful outcome, driven by actual evidence of whether a feature fulfills the hypothesis, or not.
Implementing Lean UX in SAFe
Lean UX differs from the traditional, centralized approach to user experience design. The primary difference is that hypothesis-driven designs are evaluated by implementing the code, instrumenting it where applicable, and gaining actual user feedback in a staging or production environment. Implementing new designs is primarily the responsibility of the Agile Teams, working in conjunction with Lean UX experts.
Of course, this shift, like so many others with Lean-Agile development, can cause significant changes to the way teams and functions are organized, enabling a continuous flow of value. For more on coordinating and implementing Lean UX —and more specifically how to integrate Lean UX in the PI cycle—read the advanced topic article Lean UX and the Program Increment Lifecycle.
Learn More
- Ries, Eric. The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses. Random House, Inc. Kindle Edition.
- Gothelf, Jeff, and Josh Seiden. Lean UX: Designing Great Products with Agile Teams. O’Reilly Media, 2016.
- Leffingwell, Dean. Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise. Addison-Wesley, 2011.

Note: Telemetry is an automated process by which measurements and other data are collected at remote or inaccessible points and transmitted for monitoring and analysis. Telemetry can apply to both technical and business aspects of the functionality (e.g., performance monitoring of solution components, automated A/B testing).
Last update: 30 August 2019