“It’s just what I asked for, but not what I want”
—The Night Before Implementation poem, Author Unknown
Behavior-Driven Development
Behavior-Driven Development (BDD) is a Test-First, Agile Testing practice that provides Built-In Quality by defining (and potentially automating) tests before, or as part of, specifying system behavior. BDD is a collaborative process that creates a shared understanding of requirements between the business and the Development Team. Its goal is to help guide development, decrease rework, and increase flow. Rather than focusing on internal implementation, BDD tests are business-facing scenarios that describe the behavior of a Story, Feature, or Capability from the user's perspective.
When automated, these tests ensure that the system continuously meets the specified behavior even as the system evolves. That, in turn, enables Release on Demand. Automated BDD tests can also serve as the definitive statement regarding the as-built system behavior, replacing other forms of behavioral specifications.
Align on System Behavior
Aligning on precisely what to build is a challenge when developing innovative systems, and new ideas are difficult to communicate to the diverse set of stakeholders responsible for system implementation. Figure 1 illustrates the three perspectives (called the 'triad') required to clearly define solution behavior:
- Customer-centric stakeholders understand customer and business needs and the relative desirability and viability of a new requirement
- Development-centric stakeholders understand the solution space and technological feasibility
- Test-centric stakeholders consider the exceptions, edge cases, and boundary conditions for the new behavior
Together, this group reaches alignment on exactly what to build to reduce the rework associated with building the wrong thing and to accelerate the flow of value.
The Behavior-Driven Development Process
The BDD process moves through three phases (discovery, formulation, and automation) in which acceptance criteria are transformed into acceptance tests that are later automated. The process begins in the discovery phase, where the Product Owner or Product Manager creates acceptance criteria as part of writing a story or feature (see the confirmation part of the 3Cs in 'Writing Good Stories'). Discovery is collaborative, and team members contribute additional criteria as they uncover them.
As a backlog item moves closer to implementation, the formulation phase solidifies acceptance criteria by creating acceptance tests. Initial acceptance criteria are often described with ambiguous, general terms. The formulation phase resolves these ambiguities by turning the scenarios into detailed acceptance tests that are specific, clear, unambiguous examples of behavior.
The automation phase automates the acceptance tests, so they can be run continuously and validate that the system always supports the new behavior.
BDD’s goal is to express requirements in unambiguous terms, not simply to create tests. The output may be viewed as an expression of requirements or as a test, but either way it serves the same purpose: acceptance tests record the decisions made in the conversation between the team and the Product Owner, so that the team understands the specific intended behavior. Three alternative labels describe this detailing process:
- Behavior-Driven Design (BDD)
- Acceptance Test–Driven Development (ATDD)
- Specification by Example (SBE)
Although slight differences exist in these approaches, they all emphasize understanding requirements before implementation.
A Behavior-Driven Development Example
Behavior description begins with a story, feature, or capability specified by its acceptance criteria. All of these are defined using terms from the customer’s domain, not the implementation. Consider an example story for a self-driving car that must obey the speed limit. Its acceptance criteria can be written in ‘Given-When-Then’ (GWT) format as shown below:
Given a speed limit
When the car drives
Then it is close to the speed limit but not above it
Even then, elaborated acceptance criteria are typically insufficient to code the story. To remove ambiguity, formulate the scenario into one or more examples that specify the details of the behavior, resulting in a specific acceptance test:
Given speed limit is 50 mph
When the car drives
Then its speed is between 49 and 50 mph
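Once the test is this specific, it can be expressed directly in code. The sketch below is illustrative only: `CruiseControl` is a hypothetical stand-in model (not from any real framework), and the control policy of settling 0.5 mph under the limit is an assumption made to show how each Given-When-Then step maps onto an executable check:

```python
class CruiseControl:
    """Hypothetical cruise-control model, for illustration only."""

    def __init__(self, speed_limit_mph):
        # Given: a speed limit is in effect
        self.speed_limit_mph = speed_limit_mph
        self.speed_mph = 0.0

    def drive(self):
        # When: the car drives, the controller settles just under the limit
        # (assumed policy: hold 0.5 mph below the posted limit).
        self.speed_mph = self.speed_limit_mph - 0.5


def test_speed_close_to_but_not_above_limit():
    # Given speed limit is 50 mph
    car = CruiseControl(speed_limit_mph=50)
    # When the car drives
    car.drive()
    # Then its speed is between 49 and 50 mph
    assert 49 <= car.speed_mph <= 50


test_speed_close_to_but_not_above_limit()
```

Note how each comment traces back to one line of the GWT scenario, keeping the test readable by the whole triad.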
In collaboration with the team (the triad), additional acceptance criteria and scenarios will emerge, for example: ‘When the speed limit changes, the speed changes without excessive force.’
This criterion results in an additional test (or tests) stipulating how much deceleration is acceptable:
Given speed limit is 50 mph
When speed limit changes to 30 mph
Then deceleration rate should be less than 5 feet/sec/sec
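This deceleration test can be automated in the same style. The sketch below is purely illustrative: `SpeedLimitResponder` is a hypothetical model, and the bounded-braking policy it uses is an assumption introduced so the test has something concrete to check:

```python
class SpeedLimitResponder:
    """Hypothetical model that records the deceleration used when the limit drops."""

    def __init__(self, speed_limit_mph):
        self.speed_mph = speed_limit_mph  # cruising at the current limit
        self.last_decel_ft_per_s2 = 0.0

    def change_limit(self, new_limit_mph):
        # When the limit drops, slow down using a gentle, bounded deceleration.
        drop_mph = self.speed_mph - new_limit_mph
        if drop_mph > 0:
            # Assumed control policy: deceleration scales with the size of the
            # drop but is capped at 4 ft/s/s, below the 5 ft/s/s threshold.
            self.last_decel_ft_per_s2 = min(4.0, drop_mph * 0.2)
            self.speed_mph = new_limit_mph


def test_limit_drop_does_not_brake_excessively():
    # Given speed limit is 50 mph
    car = SpeedLimitResponder(speed_limit_mph=50)
    # When speed limit changes to 30 mph
    car.change_limit(30)
    # Then deceleration rate should be less than 5 feet/sec/sec
    assert car.last_decel_ft_per_s2 < 5.0
    assert car.speed_mph == 30


test_limit_drop_does_not_brake_excessively()
```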
Figure 3 illustrates the BDD process, which begins with a story and details its specification in two dimensions. Horizontally, additional acceptance criteria detail the story’s requirements. Vertically, additional acceptance tests detail each acceptance criterion.
Automating Acceptance Tests
Automating these business-facing tests is an important reason to use the Given-When-Then format. Frameworks such as Cucumber and FIT (Framework for Integrated Test) support this syntax. To support regression testing and continuous delivery, tests should be automated wherever possible.
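In Cucumber, for example, the scenarios live in plain-text ‘feature’ files written in the Gherkin syntax; the framework parses each Given/When/Then line and binds it to step-definition code in the team’s implementation language. A sketch of how the scenarios above might look as a feature file (file contents and wording are illustrative, adapted from the examples in this article):

```gherkin
Feature: Cruise control respects the speed limit

  Scenario: Car holds speed just under the limit
    Given speed limit is 50 mph
    When the car drives
    Then its speed is between 49 and 50 mph

  Scenario: Limit drop does not brake excessively
    Given speed limit is 50 mph
    When speed limit changes to 30 mph
    Then deceleration rate should be less than 5 feet/sec/sec
```

Because the feature file is readable by every member of the triad, it doubles as both the requirements record and the executable regression suite.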
Story acceptance tests are written and executed in the same iteration as the code development. If a story does not pass its tests, the team does not receive credit for that story. Features and capabilities have their own acceptance tests that show how several stories work together in a broader context. Typically, these tests represent the behavior of larger workflow scenarios and should run during the Iteration in which the feature or capability is finished.
Learn More

- Pugh, Ken. Lean-Agile Acceptance Test-Driven Development: Better Software Through Collaboration. Addison-Wesley, 2011.
- Leffingwell, Dean. Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise. Addison-Wesley, 2011.
Last update: 12 September 2018