When is it done?
Different teams have different definitions of when a feature is done. Is something done when it's deployed? When it's merged? Or only when users are actively using it? The definition varies greatly depending on your product.
As web developers we have it easy. We can push to a server and update the code to be used by all our users instantly. However, back when I was building desktop applications, the feedback loop was much longer. We would release a new version every 6 months.
If something is only done when someone uses it, shipping like this means features are only "done" every six months.
Obviously, this was not the case. Teams building desktop applications have a vastly different definition of "done." For instance, shipping a bug is far less costly for a web developer than letting one creep into a desktop application's release; a web developer can simply fix the issue in minutes by deploying new code.
It is an interesting thought experiment to list what your team needs to consider a feature "done."
How we do it
For us at Rainforest, this is what I came up with.
- The code is unit tested
- All the tests pass
- The code was reviewed
- The code was merged to our "develop" branch, which gets automatically pushed to staging
- A Rainforest Test was written for this feature
- Instrumentation was added in at least one of the tools we use (NewRelic, Librato, Mixpanel, etc.)
- The Rainforest tests have passed against our staging server
- The Rainforest tests did not trigger an exception caught by our exception monitoring tool
- The code is pushed to production
- Monitored metrics are within expected range (our monitoring tools are configured to alert us if the metrics deviate from expected values)
- There are no performance regressions
- Users are using the new feature
- There are no exceptions raised in production
- Support requests for this feature have been fully addressed
- The task is closed in Asana
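A checklist like this can be encoded as data so it is reviewable and machine-checkable like any other artifact. Here is a minimal sketch in Python; the criterion names and the `is_done` helper are illustrative, not part of our actual tooling:

```python
# Illustrative only: a "definition of done" expressed as data,
# so completeness can be checked programmatically.
DEFINITION_OF_DONE = [
    "unit_tested",
    "tests_pass",
    "code_reviewed",
    "merged_to_develop",
    "rainforest_test_written",
    "instrumented",
    "staging_tests_pass",
    "no_staging_exceptions",
    "deployed_to_production",
    "metrics_in_range",
    "no_performance_regression",
    "feature_in_use",
    "no_production_exceptions",
    "support_requests_addressed",
    "task_closed",
]

def is_done(status):
    """A feature is done only when every criterion is met.

    Returns (done, missing) where `missing` lists unmet criteria.
    """
    missing = [c for c in DEFINITION_OF_DONE if not status.get(c)]
    return (len(missing) == 0, missing)
```

The point of making the list explicit is that "done" becomes a yes/no answer plus a concrete list of what is still outstanding, rather than a judgment call.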
This process obviously adapts to the reality of the situation. Some features are trivial enough not to need monitoring; others are complex enough to require feature flags and an incremental rollout.
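Incremental rollout is commonly implemented by hashing a stable user id into a bucket and comparing it to a rollout percentage. A hedged sketch of that pattern (the feature name, user id, and percentage below are made up for illustration):

```python
import hashlib

def in_rollout(feature: str, user_id: str, percent: int) -> bool:
    """Deterministically bucket a user into [0, 100) for a feature.

    The same user always gets the same bucket for the same feature,
    so raising `percent` only ever adds users, never flips them off.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# Example: enable "new-editor" for roughly 10% of users.
enabled = in_rollout("new-editor", "user-42", 10)
```

Because the bucketing is deterministic, a user who sees the feature keeps seeing it as the rollout percentage grows, which keeps the experience consistent while you watch your metrics.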
"Done" plus moving fast
One thing I find very interesting in this brave new world of "ship often" is that things are not "done" simply because they are deployed. We still have to measure usage, watch for production issues, and handle support requests, any of which may mean deploying more code. Features are now completed over several incremental deploys.
What is your definition of done?