Marcello is considered a brilliant programmer by the team, willing to tackle any task set upon him, and he writes lots of very complicated code in little time. "It will be done in a couple of days", he likes to declare right before diving into his magic. A few days later you can find him typing effortlessly on the keyboard, immersed in his thoughts: "A couple more days, it's going to be very cool".
Some time later, he finally announces triumphantly: "It's done!". The new code is immediately committed to the main line, ready to be tried out by the designers, eager to get their hands on the shiny new piece of technology they have been impatiently waiting for.
The new code crashes the build.
This story is far too common in our world, but I wouldn't blame the programmer, who is often very passionate about his work and eager to please the hungry designers. The pressure to deliver features is often enormous: buggy and untested features get released, and more time is spent fixing past features than developing new ones. And the pressure piles up.
I blame the lack of a clear Definition of Done, one that defines without ambiguity when a feature is ready to be deployed to the rest of the team and go into production.
The final goal of the Definition of Done is to increase the rate at which working features are produced by the programming team in the long term.
The most common Definition of Done that I've encountered in the industry (and often used myself) is something like:
It's done when I wrote the code and it works on my machine.
This Definition of Done is far from unambiguous: in fact, it means something different to each member of the team. To begin with, it's not clear what we mean here by "it works". Have I tried every possible condition in the game? Or does it work just in my testbed, and I'm assuming it works in the game as well? Has someone from QA done a play-through? Have they smoke-tested the entire game or just a level? Has someone from the design team tried to use the feature? Does it actually do what the designers need? Do all automated tests pass? Are there any automated tests at all?
And what about code quality: can the code be changed easily if something doesn't work (and it will not work) or if something needs to be modified later (and it will need to be modified)?
These are too many open questions without clear, unique answers for this definition to be safe for the team. When a feature is "done", the content creators and other engineers should be able to work on it with confidence, knowing that the chances of it crashing or not doing what was asked for are minimized. Not zero. Just minimized.
A more useful Definition of Done is a checklist of activities that captures, more or less closely, the following concepts:
Uniqueness. A feature is "done" at the same stage of development regardless of who has been working on it.
Automation. Criteria can be checked automatically by tools, so the impact on actual production work is minimal.
Quality. Code quality and implementation quality are guaranteed.
Agreeability. The entire team has to agree on and stick to the Definition of Done, including designers and management.
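To make the Automation concept concrete, here is a minimal sketch of what an automated Definition of Done gate could look like. The check names and the checks themselves are hypothetical placeholders: a real team would wire in its own test suite, smoke tests, and static analysis tools.

```python
# A minimal sketch of an automated Definition of Done gate.
# The checks below are hypothetical placeholders; each team would
# substitute calls to its own build, test, and analysis tools.

def run_checks(checks):
    """Run each named check; return (all_passed, per-check report)."""
    report = {}
    for name, check in checks.items():
        try:
            report[name] = bool(check())
        except Exception:
            # A crashing check counts as a failure, not as "done".
            report[name] = False
    return all(report.values()), report

if __name__ == "__main__":
    done, report = run_checks({
        "unit tests pass": lambda: True,        # e.g. run the test suite
        "smoke test passes": lambda: True,      # e.g. scripted play-through
        "static analysis clean": lambda: True,  # e.g. run a linter
    })
    for name, ok in report.items():
        print(f"{'OK  ' if ok else 'FAIL'} {name}")
    print("Definition of Done:", "met" if done else "not met")
```

The point is not the script itself but the shape of the gate: every criterion in the checklist becomes a check that a tool can run, so "done" is decided by the same criteria regardless of who worked on the feature.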