As pointed out above, the DoD is never a one-size-fits-all set of rules; it depends heavily on the specifics of the project. Hence, it will (and should) differ between a quick-and-dirty Proof of Concept and a complex, quality-critical system. Ultimately, experience and common sense should determine what a helpful DoD looks like.
The following is an example of a Definition of Done agreement at the feature level in a medium-sized, Scrum-based project with high quality requirements. For convenience, it is divided into three parts.
Make sure that User Stories follow the INVEST criteria and are estimated in Story Points. Both the Team and the Product Owner must share a common understanding of the requirements,
Check if there is any pending Code Review, if so – do that first,
Put the ID of the corresponding ticket in your commit messages,
Comment frequently on your progress and log the time spent on the task in your ticketing system,
Write unit tests where applicable,
For bugfixes: always cover the bugged path with a unit test to prevent regression,
Follow the logging guidelines, so it’s easy to track down problems in the future,
Consider static code analysis feedback,
Consider configuration, deployment and Continuous Integration adjustments,
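The rule about putting the ticket ID in commit messages is easy to enforce automatically with a Git commit-msg hook. The sketch below assumes a Jira-style ticket format such as PROJ-123; the pattern is a hypothetical convention and should be adjusted to whatever your tracker uses.

```python
import re
import sys

# Hypothetical ticket ID convention, e.g. "PROJ-123"; adjust to your tracker's format.
TICKET_PATTERN = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

def has_ticket_id(message: str) -> bool:
    """Return True if the commit message references a ticket ID."""
    return bool(TICKET_PATTERN.search(message))

# When installed as .git/hooks/commit-msg, Git passes the path of the
# commit message file as the first argument and aborts the commit on a
# non-zero exit code.
if __name__ == "__main__" and len(sys.argv) > 1:
    with open(sys.argv[1], encoding="utf-8") as f:
        if not has_ticket_id(f.read()):
            print("Commit message must reference a ticket ID, e.g. PROJ-123")
            sys.exit(1)
```

Saved as an executable `commit-msg` file in `.git/hooks/`, this rejects any commit whose message lacks a ticket reference, so the convention holds without relying on memory.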
Let your peers review your code,
If possible, let someone else write a User Acceptance Test for the new feature,
Don’t merge your patch to the main branch unless the build is green and you have an hour to react in case something breaks,
Let the CI server run all levels of tests and ensure they are all green,
If applicable, let someone else perform manual tests of your User Story,
Write down the testing steps and their results in the ticket, so that it’s easy to repeat the tests when needed,
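To illustrate the point about unit-testing the bugged path: the test should pin down the exact input that triggered the original defect, so the bug cannot silently return. A minimal sketch, using a hypothetical `average_order_value` function that used to crash on an empty list:

```python
import unittest

def average_order_value(orders):
    """Hypothetical fixed function; it previously raised
    ZeroDivisionError when the orders list was empty."""
    if not orders:  # the formerly bugged path, now guarded
        return 0.0
    return sum(orders) / len(orders)

class AverageOrderValueRegressionTest(unittest.TestCase):
    def test_empty_orders_returns_zero(self):
        # Exercises the exact input that triggered the original bug,
        # so any regression is caught the moment the guard is removed.
        self.assertEqual(average_order_value([]), 0.0)

    def test_normal_path_still_works(self):
        self.assertEqual(average_order_value([10.0, 30.0]), 20.0)
```

Run with `python -m unittest`; a test like the first one documents the bug as much as it prevents its return.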
It should be clearly communicated to all team members that any disobedience will be severely punished and no mercy whatsoever will be shown! ;-)
What’s the definition of done in your project? In what ways does it differ?