Why does a scrum team need a definition of done (DoD)? It’s simple: everyone involved in a project needs to know and understand what "done" means. A DoD is a clear, concise list of requirements a software increment must meet before a user story or sprint is considered complete, or before it is considered ready for release. For organizations just starting to apply Agile methods, however, a full DoD may be impossible to reach immediately. Your organization needs to identify its problems and work as a team to build the version of the DoD that solves them.
The following conversation occurs during your daily standup:
Product Manager (PM): "Is the user story done?"
Developer (Dev): "Yes!"
Quality Assurance (QA): "Okay, we will execute our manual and/or automated tests today."
Later that same day:
QA: "We found several issues. Did Dev perform any code reviews or write any unit tests?"
PM (to Dev): "QA found several issues. Did you do any code reviews or unit testing?"
Dev: "No, the code was simple. It would have taken too much time to write unit tests."
Has this ever happened to you?
In traditional development practices, Dev finishes the code and hands it off to QA. QA then spends hours, days, or sometimes weeks reviewing documentation, executing test cases, and holding bug-bash parties. This methodology may have seemed efficient at first, but many organizations eventually realize it isn’t working as expected. The problems start when developers deliver code late in the sprint, leaving too little time for code reviews, testing, and bug fixes. The result is undone work that compounds over multiple sprints and can cripple a release.
(Image source: https://www.scrumalliance.org/community/articles/2014/january/why-using-a-definition-of-done-in-an-agile-project)

No one should neglect the importance of getting things done, and everyone needs to share a clear definition of “done” as an organization.
Determining the definition of “done” is an essential conversation every development team should have, and a couple of practices can help paint a clear picture of what “done” means for your organization.

First, seriously consider writing lean user stories. Lean stories let Dev code-complete small, testable slices of a story’s functionality. This is a game changer: QA receives small chunks of completed code that can be tested throughout the entire sprint, rather than waiting until the end of the sprint. I truly believe QA needs to be embedded and involved early, working alongside Dev with a clear understanding of the sprint deliverables. To make this efficient, Dev and QA must work on the same thing at the same time. Embedded QA has several benefits: it creates transparency, helps build in quality early, provides daily feedback, and more. Done well, it eliminates the tradition of QA waiting for Dev to finish coding before starting its own development and testing tasks. Until you change this, you are still doing waterfall.

Second, by following at least some of the guidelines listed below, your organization can start defining its own meaning of DoD:
Quality of Work - inconsistent standards lead to bugs, unhappy customers, and poor maintainability. The DoD effectively becomes a team’s declaration of values.
Transparency - everyone understands the decisions being made and the work needed to complete a releasable increment.
Generate Feedback - as progress towards “done” is made on a releasable increment, the opportunity for feedback should be built in. This can be accomplished in many ways, including code review, architecture review and automated testing.
Clear Communication - progress is easy to track and report. The remaining work to do is clear.
Expectation Setting - common understanding among all developers, product owners and quality assurance. When we say a task is done, everyone on the team knows what that means. When tasks are planned, they are estimated to account for the entire DoD.
Better Decisions and Planning - work is planned to accommodate the DoD. Extra time can be estimated in the interest of ensuring a task is completed to the standards of the DoD.
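Of the guidelines above, generating feedback is the easiest to automate: even "simple" code deserves a unit test, so that feedback arrives on every change rather than at the end of the sprint. A minimal sketch in Python, using the standard `unittest` module (the function and its rules are hypothetical examples, not from the article):

```python
import unittest

def apply_discount(price, percent):
    """Return price reduced by percent, rounded to cents.

    Hypothetical example function; raises on an invalid percentage.
    """
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()
```

A few minutes spent on tests like these is what makes "peer review performed" and "all testing completed" checkable facts instead of opinions at stand-up.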
The team owns, validates, and iterates over "done." What elements need a checklist for DoD?
User Story - story or product backlog item
Sprint - collection of features developed within a sprint
Release - potentially shippable state
The key principle of the DoD is to have a predefined checklist for the user story, sprint, and release that the team agrees on. It is important to understand that everyone's checklist will be different. The lists below are only samples, not definitive answers, as each project may require its own definitions. When is your Team “Done” with a User Story in a Sprint?
Acceptance criteria are met
Peer review has been performed
Code is checked in
All types of testing are completed
Any other tasks and specified "Done" criteria are met
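One lightweight way to keep such a checklist from living only in people's heads is to encode it as data and ask it which items remain. A sketch in Python, using the sample story-level items above (the item names and functions are illustrative, not a prescribed tool):

```python
# Sample user-story Definition of Done, encoded as a simple checklist.
# The items mirror the sample list above; adapt them to your own team.
USER_STORY_DOD = [
    "acceptance criteria met",
    "peer review performed",
    "code checked in",
    "all testing completed",
]

def undone_items(checklist, completed):
    """Return the DoD items not yet marked complete, in checklist order."""
    done = {item.lower() for item in completed}
    return [item for item in checklist if item.lower() not in done]

def is_done(checklist, completed):
    """A story is 'done' only when every checklist item is complete."""
    return not undone_items(checklist, completed)
```

With this in place, the stand-up conversation from earlier changes shape: instead of "Is it done?" / "Yes!", the team can list `undone_items(USER_STORY_DOD, completed)` and see exactly what remains.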
When is your Team "Done" with a Feature in a Release?
Story planning is complete
All code reviews have been performed
Bugs are resolved
All types of testing are complete, with a 100% success rate
All appropriate documentation is in place
Any other specified tasks and "Done" criteria are met
When is your team "Done" with a Release?
Sprint(s) are complete to the team's satisfaction
Deployment to staging is complete
All types of testing are complete, with a 100% success rate
Rollback / remediation planning is in place
Deployment to production is complete
Production sanity checks are met
Release notes are complete
Training has been performed
The DoD is a comprehensive checklist that adds value to the activities that assert the quality of a feature. It captures activities the team can commit to, which leads to an improved product and processes, minimized risk, and much clearer communication at each level (story, sprint, release), among other benefits. Look for ways to grow your story DoD so that you can consistently build and release quality software quickly.

Greg Sypolt (@gregsypolt) is a senior engineer at Gannett and co-founder of Quality Element. He is a passionate automation engineer seeking to optimize software development quality, coaching team members on how to write great automation scripts, and helping the testing community become better testers. Greg has spent most of his career working on software quality, concentrating on web browsers, APIs, and mobile. For the past five years he has focused on the creation and deployment of automated test strategies, frameworks, tools, and platforms.