Practices for high-quality software delivery
Ways of working help to define how we approach our work.
By developing a consistent set of practices, organisations can enable learning across projects and make it easy for teams to share knowledge and insights.
A well-defined process helps to ensure success and helps teams self-manage across varied types of project.
Our ways of working are based on our experience at Fluxus, but are always open to improvement. We share them here as a baseline for others to work from. In the second of this two-part series we look at delivery and learning.
Kanban
Our kanban board measures the progress of features. It measures how much value has been delivered, not how much work has been done, or who did it.
The board shouldn’t include bugs or technical tasks, because these don’t give us a good idea of overall progress.
Each card should represent one feature from the impact map.
Each person should be assigned to only one card at a time. Multiple people may be assigned to the same card if they're working together.
In-progress cards should be moved to done before starting any new cards.
Kanban is:
• A system for tracking feature development
• A visible marker of progress
• A signal for when activity is needed on a feature
• There to tell the product owner how the things they care about are progressing
Kanban is not:
• A to-do list of personal tasks
• A list of everything that needs to happen
• A holding-pen for opinions or ideas
• There to explain why features aren’t being worked on
• Progress is measured in features delivered.
• Similarly sized features are easy to substitute without changing scope.
• Anything that isn’t a feature is an implementation detail.
• Technical dependencies are checklist items on the relevant card.
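The card rules above can be sketched in code. This is a minimal illustration, not a real tool; the names (`Card`, `can_start`, the features and people) are all hypothetical:

```python
# Minimal sketch of the board rules: a card is one feature from the
# impact map, technical dependencies live on its checklist, and a
# person may only start a new card once their current card is done.
from dataclasses import dataclass, field

@dataclass
class Card:
    feature: str                                    # one feature from the impact map
    assignees: set = field(default_factory=set)     # may hold several people pairing
    checklist: list = field(default_factory=list)   # technical dependencies
    done: bool = False

def can_start(person: str, board: list) -> bool:
    """A person may start a new card only if none of their cards is in progress."""
    return not any(person in c.assignees and not c.done for c in board)

board = [Card("Export report as PDF", {"alice"}, ["set up PDF library"])]
print(can_start("alice", board))  # False: alice already has a card in progress
print(can_start("bob", board))    # True: bob is free to pick up a card
```

Note that bugs and technical tasks never appear as cards here, only as checklist items on the feature they belong to.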
Definition of Done
Before starting work on a feature, you need to know that you can deliver it: what "done" means, and how you'll know when you have finished.
This starts with a conversation between the person delivering the feature and the product owner (or stakeholder) who will sign it off. As part of this you will create and discuss acceptance criteria. These should be focused on the business impact and be in plain English. These can be used as the basis for BDD tests.
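As a sketch of how a plain-English acceptance criterion can become an automated test: the feature here (`apply_discount`) and the criterion itself are invented for illustration, and the Given/When/Then comment mirrors the BDD style mentioned above:

```python
# Hypothetical feature under test: 10% off with discount code "SAVE10".
def apply_discount(total: float, code: str) -> float:
    return round(total * 0.9, 2) if code == "SAVE10" else total

# Acceptance criterion, as agreed with the product owner:
#   Given an order totalling 50.00,
#   when the customer applies the code SAVE10,
#   then the total becomes 45.00.
def test_save10_gives_ten_percent_off():
    assert apply_discount(50.00, "SAVE10") == 45.00

# A second criterion: an unknown code leaves the total unchanged.
def test_unknown_code_leaves_total_unchanged():
    assert apply_discount(50.00, "XYZ") == 50.00

test_save10_gives_ten_percent_off()
test_unknown_code_leaves_total_unchanged()
print("acceptance criteria pass")
```

The test names and assertions stay close to the business language, so the product owner can read them and agree they capture the intent.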
In addition to the acceptance criteria for an individual feature, the project will have a wider 'Definition of Done' which applies to every feature. This should be tailored to each project, but typically, to be considered "done", a feature must:
1. Have acceptance criteria agreed with the product owner
2. Pass peer review of code
3. Have automated tests, and these must pass, including regression tests
4. Meet the acceptance criteria
5. Be approved by the product owner
6. Be releasable without manual steps
Continuous delivery
Continuous delivery means that as soon as a feature is completed, we are ready to deploy it to production.
To do this, our codebase needs to stay in a releasable state at all times.
Once a feature meets the Definition of Done, it should be releasable immediately to production. Automated deployment procedures should ensure that there is no disruption or downtime except in rare cases.
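The release path with no manual steps can be sketched as a single script. Everything here is a stand-in: `run_tests`, `build_artifact` and `deploy` are stubbed as shell functions standing in for whatever your real pipeline runs:

```shell
#!/bin/sh
# Sketch of an automated release gate: test, build, deploy, in order,
# stopping at the first failure. All three steps are stubs here.
set -eu

run_tests()      { echo "tests passed"; }     # stand-in for the full automated suite
build_artifact() { echo "artifact built"; }   # stand-in for the build step
deploy()         { echo "deployed $1"; }      # stand-in for zero-downtime deployment

release() {
  run_tests        # a release candidate must pass every test, including regressions
  build_artifact   # produce one artifact per release
  deploy "v1.2.3"  # no human intervention between "done" and production
}

release
```

Because `set -e` aborts on the first failing step, a broken build can never reach the deploy stage, which is what keeps the codebase releasable at all times.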
Continuous improvement
Everything about our ways of working is subject to continuous improvement.
We should aim to deliver faster, with higher quality, greater reliability and lower cost over time.
This means that we must learn from our experiences. Improvements made on one project must be spread to others.
To do this, we will need to share technology, processes and techniques across projects, and meet regularly to discuss how to diffuse any improvements made.
Further reading
• Impact Mapping by Gojko Adzic
• Lean UX by Jeff Gothelf with Josh Seiden
• Lean Startup by Eric Ries
• Continuous Delivery by Jez Humble and David Farley
• FIRE by Dan Ward
• The Goal by Eli Goldratt
Further suggestions welcome!