Various teams have different definitions of when a feature is done. Is something done when it's deployed? Or when it's merged? Or is it done when it's being used actively by users? The definition will vary greatly depending on your product.
As web developers we have it easy. We can push to a server and instantly update the code for all our users. However, back when I was building desktop applications, the feedback loop was much longer. We would release a new version every 6 months.
If something is only done when someone uses it, shipping like this means features are only "done" every six months.
Obviously, this was not the case. Teams building desktop applications have a vastly different definition of "done." For instance, shipping a bug is probably not as bad for a web developer as letting a bug creep into a desktop application's release; a web developer can simply fix the issue in minutes by deploying new code.
It is an interesting thought experiment to list what your team needs to consider a feature "done."
For us at Rainforest, this is what I came up with.
This process obviously varies to adapt itself to the reality of the situation. Some features might not need monitoring because they are too trivial. Some features might require the additional complexity of feature flags and incremental rollout because they are too complex or risky.
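To make the feature-flag idea concrete, here is a minimal sketch of a percentage-based incremental rollout. The flag name, user ids, and rollout mechanics are hypothetical illustrations, not a description of Rainforest's actual setup; the key idea is that hashing gives each user a stable bucket, so the same users stay enabled as the rollout percentage grows.

```python
import hashlib

def feature_enabled(flag_name: str, user_id: str, rollout_percent: int) -> bool:
    """Deterministically decide whether a user sees a flagged feature.

    Hashing flag name + user id assigns each user a stable bucket
    (0-99) per flag, so widening the rollout only adds users and
    never flips existing ones off.
    """
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_percent

# Example: roll a hypothetical "new-editor" feature out to 10% of users,
# then widen to 50% once monitoring looks healthy.
print(feature_enabled("new-editor", "user-42", 10))
```

Because the bucketing is deterministic, a user's experience is consistent across requests, and rolling back is as simple as setting the percentage to zero.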
One thing that I find very interesting in this brave new world of "ship often" is that things are not "done" simply because they are deployed. We still have to measure usage, watch for production issues, and handle support requests, and addressing those may mean deploying more code. Features are now completed over several incremental deploys.
What is your definition of done?