Sprint 0
The objective of Sprint 0 is to unblock development for all team members working on the project. In Sprint 0 roles are not strictly defined. All team members augment the product manager’s workflow and contribute to the stories—from clarification to establishing acceptance criteria. This frees the product manager to spend valuable time with the team or to extract needed information from the client.
The whole team helps the product manager refine the backlog stories: estimate, ask questions, and further refine the backlog using the INVEST principle (independent, negotiable, valuable, estimable, small, testable).
Suggestions for allocating Sprint 0 activities according to each role are as follows:
ROLE | RESPONSIBILITY |
---|---|
Product manager | Refines the backlog by creating high-priority stories that meet the definition of ready (DoR) during week one of Sprint 0; arranges introductions between the client team and the Devbridge working team; and establishes recurring sprint ceremony invites. |
Technical project lead | Creates high-level architecture documentation (involves other team members if necessary) and updates the Confluence page with engineering materials. |
Testers | Help the PM refine the backlog and write acceptance criteria. The testing lead starts work on the testing strategy document. |
Designers | Draft and own a style guide for the project and prepare assets for the stories planned in Sprint 1. If sales materials exist, designers refine or repurpose assets to enable the front-end developers. If necessary, they can also run parallel research efforts for the discovery element of dual-track scrum. |
Developers | Set up the project structure, environment access, continuous integration, tools, and common components. At a minimum, a development environment should be ready by Sprint 1; ideally, a staging environment will be ready as well. Integration spikes are carried out to validate technical feasibility (see the spike sketch after this table). |
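To make the last point concrete, here is a minimal sketch of what an integration spike could look like: a short script that confirms a third-party REST endpoint is reachable and returns parseable JSON before any stories are built on top of it. The URL, header, and token are placeholders for illustration, not part of any real client integration.

```python
# Hypothetical integration spike: verify that an external REST endpoint is
# reachable and answers with parseable JSON before committing stories to it.
# The URL, header name, and token are placeholders, not a real integration.
import json
import urllib.request

SPIKE_URL = "https://api.example.com/v1/status"   # placeholder endpoint
API_TOKEN = "replace-with-sandbox-token"          # placeholder credential

def run_spike() -> bool:
    request = urllib.request.Request(
        SPIKE_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            payload = json.loads(response.read().decode("utf-8"))
    except Exception as error:  # network, auth, or parsing failures end the spike
        print(f"Spike failed: {error}")
        return False
    print(f"Spike succeeded; sample payload keys: {list(payload)[:5]}")
    return True

if __name__ == "__main__":
    run_spike()
```

A spike like this is typically throwaway code: the deliverable is the go/no-go answer (and any surprises) fed back into the backlog, not the script itself.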
Implementing product and technical best practices
One of the reasons for this book’s existence is to spread awareness of the unique configuration of our process: the things that make us successful, keep our clients continuously impressed with our work, and help our teams constantly grow as professionals. We have often found ourselves discussing and comparing two similar projects, trying to determine why one was a poster child and the other a recovery effort from doom. Products are complex, and many of the moving pieces are subjective and human in nature.
We decided that each project could be evaluated based on a set of criteria involving product health and technical maturity.
The product health score focuses on our best practices around process, product management, and product design. The aggregate score represents stakeholder attendance in sprints, consistency of retrospectives, standup participation from the full team, existence of definitions of ready and done, availability of users for product testing, and many other variables.
The technical maturity score, on the other hand, represents our adherence to best practices from a software engineering perspective. The score represents implementation of best practices such as CI/CD, testing automation, environment access, performance testing, and many others.
Both scores are expressed as percentages, and higher numbers are better: project A might sit at 75 percent technical maturity, project B at 95 percent. Please don’t feel that this rating applies to your individual contribution or that lower scores somehow indicate you’re “failing”; quite the opposite. We leverage these ratings during a state of the union with the client to force the most painful issues into the open: tracking, understanding, and communicating unreasonable bottlenecks (such as limited access to a staging environment or reluctance to attend demos). The scores are a lever we can use to improve your quality of life on the project, not to mention the general outcome of the product.
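As a concrete illustration of how such a percentage could be produced, here is a minimal sketch that treats a score as the share of tracked practices a project currently meets. The practice names and the equal weighting are assumptions made for the example; they are not the actual scoring model.

```python
# Hypothetical sketch of an aggregate score: the percentage of tracked
# practices a project currently meets. Practice names and equal weighting
# are illustrative assumptions, not the actual scoring model.

def aggregate_score(practices: dict[str, bool]) -> float:
    """Return the share of practices met, as a percentage."""
    if not practices:
        return 0.0
    met = sum(1 for achieved in practices.values() if achieved)
    return 100.0 * met / len(practices)

technical_maturity = aggregate_score({
    "ci_cd_pipeline": True,
    "test_automation": True,
    "environment_access": False,   # e.g., staging access still blocked
    "performance_testing": False,
})
print(f"Technical maturity: {technical_maturity:.0f}%")  # 50%
```

In practice, the same items that drag the number down (blocked environment access, missing stakeholders at demos) are exactly the bottlenecks worth raising in the state-of-the-union conversation.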