The following article has been published in ComputerWeekly and TechTarget.
“Plandek uses proprietary algorithms to synthesise complex fuzzy data sets to provide actionable insights designed to improve productivity today and early-warning signs to mitigate against the problem projects of tomorrow.”
Ponsonby writes as follows…
Now that Agile is officially mainstream, development teams must not allow old, ingrained habits to resurface and dilute their potential. This is a genuine risk.
Agile is, after all, a relative term and fairly meaningless unless qualified. So do you know how agile your development is? One way to embed the culture change required to answer that key question is through self-improvement (SI) processes underpinned by the right agility metrics.
Agile is already closely linked to SI — let’s remember that the Agile Manifesto states: “At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.”
In other words, Agile is about continuous, team-driven SI. The fact that retrospectives are among the top five Agile techniques underscores SI’s importance (source: State of Agile report).
Nevertheless, SI efforts regularly fail due to inadequate leadership and follow-through. Teams either don’t have the right tools to collect the data, or they set the wrong metrics. The latter is especially problematic when Agile development projects are scaling.
At Plandek and in former positions, our leadership team has had the opportunity to work hands-on with a wide variety of Agile engineering teams – from start-ups to very large, distributed enterprises. Based on these experiences, we have identified five critical areas for effective Agile team SI.
1. Commitment & sponsorship
Agile teams are almost always busy and resource-constrained. As a result, the intention to always improve (in a structured and demonstrable way) often loses out to the pressures of the day job – delivering to the ceaseless demands of the business.
SI calls for a serious commitment from engineering leadership through a structured, long-term and well-implemented SI programme. This includes establishing a monthly cycle of team target-setting, implementation and review, supported by robust tools that provide the necessary metrics and reporting.
2. Agreed metrics
Selecting and agreeing on metrics is often the most contentious issue. Failing to reach consensus and buy-in on metrics is a common reason Agile programmes fail.
As the Agile author Scott M. Graffius put it: “If you don’t collect any metrics, you’re flying blind. If you collect and focus on too many, they may obstruct your field of view.”
By nature, Agile encourages a myriad of different methodologies and workflows that vary from team to team. However, this does not make it impossible to agree on a set of meaningful Agile metrics on which to build a self-improvement programme. Our initial research across more than 100 projects over 12 months has found the following Agile metrics to be predictive of better outcomes, and all of them can be tracked and improved in any team:
- The key enabler metrics – best practice and tool use. Some team members argue that tool usage (e.g. Jira) is so inconsistent that data collected from it is not meaningful (garbage in, garbage out). However, it is both possible and useful to measure how closely team members adhere to the simple processes that form part of your software development lifecycle.
- Sprint disciplines and consistent delivery of sprint goals (Scrum Agile).
- The proportion of time spent/velocity/efficiency of writing new features (productive coding).
- Quality and failure rates.
- The proportion of time spent/efficiency of bug fixing and rework.
- Teamwork, team wellness and the ability to collaborate effectively.
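To make the list above concrete, here is a minimal sketch of how two of these metrics – sprint-goal completion and the proportion of time spent on bug fixing – might be computed from ticket data. The `Ticket` fields and values are illustrative assumptions, not Plandek’s actual data model.

```python
# Illustrative sketch only: field names and data shapes are assumptions.
from dataclasses import dataclass

@dataclass
class Ticket:
    ticket_type: str      # "feature" or "bug"
    committed: bool       # included in the sprint commitment
    completed: bool       # done by sprint end
    hours_logged: float

def sprint_completion_rate(tickets):
    """Share of committed tickets finished within the sprint."""
    committed = [t for t in tickets if t.committed]
    if not committed:
        return 0.0
    return sum(t.completed for t in committed) / len(committed)

def bug_fix_time_share(tickets):
    """Proportion of logged hours spent on bug fixing vs. all work."""
    total = sum(t.hours_logged for t in tickets)
    if total == 0:
        return 0.0
    bug_hours = sum(t.hours_logged for t in tickets if t.ticket_type == "bug")
    return bug_hours / total

sprint = [
    Ticket("feature", True, True, 8.0),
    Ticket("feature", True, False, 5.0),
    Ticket("bug", False, True, 3.0),
    Ticket("feature", True, True, 4.0),
]
print(sprint_completion_rate(sprint))  # 2 of 3 committed tickets done
print(bug_fix_time_share(sprint))      # 3.0 of 20.0 hours spent on bugs
```

The point is not the arithmetic – it is that once metrics like these are agreed, they can be computed mechanically from data the team already produces.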
3. Data context
A metric on its own is only meaningful when viewed in the right context. This means you need to harvest and combine the right data sources. To gather meaningful insights into the team processes, data should come from systems that developers use, including workflow management software like Jira; code repositories like GitHub, Bitbucket, TFS or Gitlab; code quality tools like Sonarqube; and time tracking systems like Harvest or Tempo.
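As a hedged illustration of why combining sources matters: joining ticket records (from a workflow tool such as Jira) with commit timestamps (from a repository such as GitHub) yields context that neither source provides alone. All keys, timestamps and field shapes below are hypothetical.

```python
# Hypothetical example: combine workflow-tool data with repository data.
from datetime import datetime

tickets = {  # ticket key -> when work started (from the workflow tool)
    "DEV-101": datetime(2024, 3, 1, 9, 0),
    "DEV-102": datetime(2024, 3, 2, 10, 0),
}
commits = [  # (ticket key, commit timestamp) from the code repository
    ("DEV-101", datetime(2024, 3, 1, 15, 0)),
    ("DEV-101", datetime(2024, 3, 3, 11, 0)),
    ("DEV-102", datetime(2024, 3, 2, 16, 30)),
]

def hours_to_last_commit(tickets, commits):
    """For each ticket, hours from work starting to the final commit --
    a simple measure only visible once the two sources are joined."""
    result = {}
    for key, started in tickets.items():
        stamps = [ts for k, ts in commits if k == key]
        if stamps:
            result[key] = (max(stamps) - started).total_seconds() / 3600
    return result

print(hours_to_last_commit(tickets, commits))
```

In practice a metrics tool would pull this data via the systems’ APIs; the sketch simply shows the kind of join that puts a metric in context.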
4. Tracking metrics in near real-time
Once metrics are agreed upon, making them available to teams in as near to real-time as possible without creating added work is essential. This means not expecting people to collect data manually. Besides creating extra work, the data will be retrospective and less useful. Look for tools that give teams the metrics they need in near real-time without requiring manual input.
5. Celebrate success
Once you’ve agreed on what success looks like, it’s important to pause and celebrate when teams reach key milestones. You want people to regard SI as a positive and motivating process. SI lends itself well to gamification and to recognising the most-improved teams, competence leaders, centres of excellence, and so on. Celebrating success shouldn’t be seen as a ‘fluffy’ optional extra, but as essential to reinforcing the right behaviours and raising morale. Teams that measure SI also gather the data necessary to win agile development awards.
Well-implemented SI programmes can deliver a profound and sustained improvement in metrics that are indicative of productivity and timing accuracy. Typical results that we see are:
- 10%+ velocity improvement
- 10%+ improvements in flow efficiency
- 15%+ reduction in return rates and time spent reworking tickets (returned from QA)
- 30%+ improvement in sprint completion accuracy (Scrum Agile)
- Greatly improved team collaboration and team wellness.
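Flow efficiency, mentioned in the results above, is commonly defined as the share of a ticket’s total cycle time spent in active, value-adding states rather than waiting. A minimal sketch, with assumed state names:

```python
# Illustrative only: state names and durations are assumptions.
ACTIVE_STATES = {"in_progress", "in_review"}

def flow_efficiency(state_durations):
    """state_durations: list of (state, hours) for one ticket's lifecycle.
    Returns active time as a fraction of total cycle time."""
    total = sum(hours for _, hours in state_durations)
    if total == 0:
        return 0.0
    active = sum(hours for state, hours in state_durations
                 if state in ACTIVE_STATES)
    return active / total

ticket = [("backlog", 40.0), ("in_progress", 16.0),
          ("waiting_for_qa", 20.0), ("in_review", 4.0)]
print(flow_efficiency(ticket))  # 20 of 80 hours active -> 0.25
```

A 10% improvement in this ratio typically comes from reducing the waiting states, not from coding faster.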
Without question, SI calls for major cultural change and sustained effort. Technology leadership needs to give teams the tools (and encouragement) to drive the process. Without ongoing support, many teams will not have the time or inclination to drive their own SI as they strive to meet their short-term delivery objectives.
Fortunately, new tools are becoming available that help teams get SI programmes right and genuinely monitor and improve agility. The results we have seen in practice show it is well worth the effort.