The following article was posted by InfoQ: https://www.infoq.com/articles/metrics-agile-teams/

The importance of metrics to Agile teams

We are fortunate to have the opportunity to work with a great variety of engineering teams – from those in start-ups to very large, distributed enterprises.

Although definitions of “engineering excellence” vary in these different contexts, all teams aspire to it. They also share the broad challenge of needing to balance the “day job” of delivering high quality, high value outcomes against the drive to continually improve.

Continuous Improvement (CI) inherently requires metrics against which to measure progress. These need to be balanced and meaningful (i.e. deterministic of improved outcomes). This creates two immediate issues:

We view CI as vital in healthy and maturing Agile environments. Hence metrics to underpin this process are also vital. However, CI should be owned and driven by the teams themselves so that teams become self-improving. Ergo, CI programmes become SI (Self-Improvement) programmes.

This article focuses on how teams can implement a demonstrably effective SI programme even in the fastest moving and most resource constrained Agile environments, so that they remain self-managing, deliver value quickly, and continue to improve at the same time.

The Size of the Prize

The concept of CI has been around for a long time. It was applied perhaps most famously in a business context in Japan and became popularised with Masaaki Imai’s 1986 book “Kaizen: the Key to Japan’s Competitive Success.”

The CI principle is highly complementary to core Agile principles. Indeed, the Agile Manifesto states:

At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.

There are two key themes here – firstly, CI and secondly, that CI is driven by the teams themselves (SI). This raises the question of what role leadership should take in this improvement process.

Our evidence shows that the size of the prize is very significant. Well implemented SI programmes can deliver significant and sustained improvement in metrics that underpin your time to value (TTV) – for example:

However, achieving these goals is hard and requires sustained effort. Technology leadership needs to give teams the tools (and encouragement) to own and drive the self-improvement process. Without constant support, many teams will not have the time or inclination to drive their own self-improvement as they strive to meet their short-term delivery objectives.

The tools needed for effective Agile team Self-Improvement

The principle of team Self-Improvement (SI) is simple and powerful, but very hard to deliver effectively. It requires four important things:

  1. A serious long-term commitment and sponsorship from both the leadership team and the teams/squads themselves, with effort and resources applied over a prolonged period of time to realise iterative improvement
  2. An agreed, objective set of metrics to track progress – making sure that these metrics are actually the right ones, i.e. deterministic of the desired outcome
  3. A means for teams to easily track these metrics and set targets (with targets calibrated against internal and external benchmarks)
  4. An embedded process within teams to make the necessary changes, celebrate success and move on.

Agile teams are almost always busy and resource-constrained. As a result, the intention of always improving (in a structured and demonstrable way) often loses out to the pressures of the day job – delivering to the evolving demands of the business.

In our experience, successful SI requires coordination and stewardship by the technology leadership team, whilst empowering teams to own and drive the activities that result in incremental improvement. Therefore this needs to be in the form of a structured, long-term and well implemented SI programme.

Implementing an effective team Self-Improvement programme

Self-Improvement needs a serious commitment from the leadership team within engineering to provide teams with the tools they need to self-improve.

This will not be possible if the organisation lacks the BI tools to provide the necessary metrics and reporting over the full delivery lifecycle. Firstly, the reporting found within common workflow management tools like Jira is not optimised to provide the level of reporting that many teams require for an effective SI programme. Secondly, teams use a number of tools across the delivery cycle, which often leaves data sitting in silos, not integrated to give a full view of end-to-end delivery.
Teams should seek out BI tools that address these challenges. The right tools will give product and engineering teams meaningful metrics and reporting around which to build robust SI programmes.

Metrics for SI

As mentioned in the intro, selecting and agreeing metrics is often the most contentious issue. Many programmes fail simply because teams could not agree or gain buy-in on meaningful sets of metrics or objectives.

By its very nature, Agile encourages a myriad of different methodologies and workflows which vary by team and company. However, this does not mean that it’s impossible to achieve consensus on metrics for SI.

We believe the trick is to keep metrics simple and deterministic. Complex metrics will not be commonly understood and can be hard to measure consistently, which can lead to distrust. And deterministic metrics are key as improving them will actually deliver a better outcome.

As an example – you may measure Lead Times as an overall proxy of Time to Value, but Lead Time is a measure of the outcome. It’s also important to measure the things that drive/determine Lead Times, levers that teams can actively manage in order to drive improvements in the overarching metric (e.g. determinant metrics like Flow Efficiency).
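To make this concrete, Flow Efficiency can be derived from a ticket’s status-transition history: the share of total lead time spent in “active” statuses rather than sitting in queues. A minimal sketch in Python (the status names and transition records here are illustrative assumptions, not any particular tool’s data model):

```python
from datetime import datetime

# "Active" statuses count as work time; everything else ("To Do", "Blocked",
# "Awaiting Review", etc.) counts as wait time. These names are hypothetical.
ACTIVE_STATUSES = {"In Progress", "In Review"}

def flow_efficiency(transitions):
    """transitions: list of (datetime, status) pairs, ordered oldest first.

    Returns active time / total lead time, i.e. Flow Efficiency.
    """
    active = total = 0.0
    # Each consecutive pair of transitions bounds one interval spent in
    # the earlier status.
    for (start, status), (end, _) in zip(transitions, transitions[1:]):
        duration = (end - start).total_seconds()
        total += duration
        if status in ACTIVE_STATUSES:
            active += duration
    return active / total if total else 0.0

ts = datetime.fromisoformat
history = [
    (ts("2023-03-01T09:00"), "To Do"),
    (ts("2023-03-03T09:00"), "In Progress"),
    (ts("2023-03-04T09:00"), "Blocked"),
    (ts("2023-03-06T09:00"), "In Progress"),
    (ts("2023-03-07T09:00"), "Done"),
]
# Lead time is 6 days; active time is 2 days, so flow efficiency is ~33%.
print(f"{flow_efficiency(history):.0%}")
```

A team watching this number can then target the specific wait states (e.g. time spent blocked or awaiting review) rather than the lead-time outcome itself.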

The deterministic metrics we advocate are designed to underpin team SI, in order to steadily improve Agile engineering effectiveness.

The (determinant) metrics are grouped into six key areas.

The key enabler is best practice and tool use. A common push-back is that tool usage (e.g. Jira) is so inconsistent that the data collected from within it is not meaningful (the old adage of “garbage in, garbage out”). However, there are some simple disciplines, which can themselves be measured, that greatly improve data quality.

In addition to these best practice “hygiene” metrics, teams can build their self-improvement initiatives around five further determinant metric sets:

  1. Sprint disciplines and consistent delivery of sprint goals (Scrum Agile)
  2. Proportion of time spent/velocity/efficiency of writing new features (productive coding)
  3. Quality and failure rates, and therefore…
  4. Proportion of time spent/efficiency of bug fixing and rework
  5. Teamwork, team wellness and the ability to collaborate effectively.
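To illustrate the measurable “hygiene” disciplines mentioned above, a team could track the proportion of tickets that are properly filled in before work starts. A minimal sketch (the field names and checks here are illustrative assumptions, not a prescribed standard):

```python
# Illustrative "hygiene" check: the share of tickets that carry an estimate,
# a description and an assignee - simple disciplines that keep workflow data
# (and hence any metrics built on it) meaningful.
def ticket_hygiene(tickets):
    """tickets: list of dicts; returns the fraction passing all checks."""
    def is_clean(ticket):
        return (
            ticket.get("estimate") is not None
            and bool(ticket.get("description", "").strip())
            and ticket.get("assignee") is not None
        )
    if not tickets:
        return 0.0
    return sum(is_clean(t) for t in tickets) / len(tickets)

backlog = [
    {"estimate": 3, "description": "Add login audit", "assignee": "amy"},
    {"estimate": None, "description": "Fix flaky test", "assignee": "raj"},
    {"estimate": 5, "description": "", "assignee": "amy"},
]
# Only the first of the three tickets passes all checks -> ~33%.
print(f"{ticket_hygiene(backlog):.0%}")
```

Because the check itself is measurable, a team can set a simple target (e.g. 90% of tickets clean) and track it alongside the outcome metrics it underpins.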

From these six areas, we believe these are some of the most common and meaningful metrics around which a team can build an effective self-improvement programme:

In our experience, a highly effective Agile SI programme can be built around these metric sets. We’ve also found that an integrated view of the full delivery cycle across the right tools, underpinned by these core metrics, reveals key areas that can be optimised – i.e. low-hanging fruit that can materially improve Time to Value.

Metrics should be available in near real-time to the teams, with minimal effort. If teams have to collect data manually, the overall initiative is likely to fail.

A sample SI Dashboard

When all team members have a near real-time view of the metrics that they’ve signed up to, these become a core part of daily stand-ups and sprint retrospective reviews.

The aim is not to compare these metrics across teams – instead the key aim is to track improvement over time within the team itself. Leadership teams need to remain outcome focused, whilst enabling and empowering teams to identify and make incremental improvements that will improve those outcomes.

Running the SI programme

Team SI is unlikely to take place consistently and sustainably across teams without committed leadership. The SI programme needs to be formally established on a monthly cycle of team target-setting, implementation, review, and celebration of success (see below).

Team Leaders and Scrum Masters need to strike the right balance of sponsoring, framing and guiding the programme with giving teams the time and space they need to realise improvements.

SI is designed to be a positive and motivating process – and it is vital that it is perceived as such. A key element of this is remembering to celebrate success. It’s easy to “gamify” SI and find opportunities to recognise and reward the most-improved teams, competence leaders, centres of excellence, and so on.

Target setting

Questions often arise around target setting and agreeing what success looks like. Some organisations opt only to track individual teams’ improvement over time (and deliberately not make comparisons between teams). Others find benchmarks useful and divide them into three categories:

  1. Internal benchmarks (e.g. measures taken from the most mature Agile teams and centres of excellence within the organisation)
  2. External competitor/comparator benchmarks – some tools provide anonymised benchmarks across all metrics from similar organisations
  3. Agile best-practice benchmarks – these are often hard to achieve but are obvious targets as the SI programme develops.

The SI programme leader/sponsor can view progress against these benchmarks and look back over the duration of the programme to view the rate of improvement.

In summary:

At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.

Delivering on this principle requires:

  1. formal sponsorship by technology leadership, in the form of recognition and a suitable framework to manage the long-term process;
  2. a set of meaningful and agreed Agile metrics that underpin the process of SI and track performance improvement over time; and crucially
  3. a means to surface these metrics in near real time, with minimum/no effort involved for the teams themselves.

The following article was published in JaxEnter:

https://jaxenter.com/second-age-agile-159373.html

As all LOTR fans will know, the Second Age in Middle Earth lasted 3,441 years – from the defeat of Morgoth to the first defeat of Sauron. Lots happened in Middle Earth during the period and many chickens came home to roost (metaphorically speaking).

In many ways, Agile is entering a similar age. It’s been more than 15 years since the Agile Manifesto was conceived and adoption has been very rapid. It is estimated that 88 percent of all US businesses are involved in some form of Agile methodology in some part of their technology operations.

As such, Agile finds itself approaching the top of the “S” of the adoption curve (see below). As with all innovations approaching late-adoption maturity, the honeymoon period is over and businesses working in Agile are under increasing pressure to demonstrate that their Agile transformations are successful and adding real business benefits.

The lack of Agile metrics platforms

Technology teams are very familiar with measuring output, performance and quality and are not short of quant data. Surprisingly, however, there are very few BI solutions available that aim to measure the overall effectiveness of Agile software development teams across the full delivery cycle – from ideation to deployment.

The solutions out there today tend to focus on one element within the overall Agile process – e.g. code quality tools (focused on coding itself), and workflow management plug-ins that look at certain aspects of the development process, yet often exclude pre-development and post-development stages.

Indeed, the “Agile metrics platforms” or “Agile BI” sector is so embryonic that analysts like Gartner do not yet track it. The closest related sector that Gartner analyses is “Enterprise Agile Planning Tools”, which, although related, is focused on planning rather than the efficiency and quality of the output.

Fortunately, newer solutions are emerging to answer this unmet need. To create a balanced set of Agile metrics that track overall effectiveness, look for systems that ingest and analyse data from the variety of tools that software development teams use in their everyday work.

What should you measure?

It is reasonable to assume that all Agile transformations broadly aim to deliver against the Agile Manifesto’s number one objective: the early and continuous delivery of value. As the Manifesto states:

“Our highest priority is to satisfy the customer through early and continuous delivery of valuable software”.

The Manifesto’s subsequent aims support this guiding principle and can be summarised as:

The challenge and key question is how do you measure these objectives and demonstrate that “Agile is working”? This opens up the contentious subject of Agile metrics.

Navigating the politics of Agile metrics

Why are Agile metrics contentious? There are many protagonists within large technology teams. Each has their own distinct views as to:

This makes selecting Agile metrics extremely important. Unless the process involves the key protagonists (from teams to delivery managers to heads of engineering) the metrics may not be accepted or trusted. In those circumstances, there is little point in collecting metrics, as teams will not drive to improve them and show the desired progress.

Meaningful Agile metrics

This is our take on a meaningful set of metrics for Agile development teams to track and demonstrate improvement over time.

As the table shows, some of the metrics are used by the team only and will not be compared across teams. Some can be aggregated across teams in order to give managers an overall view of progress.

Team view Agile metric set

These metrics are by no means definitive and readers will doubtless disagree with some. Since they have been shown to be deterministic of outcomes, however, they provide a very useful starting point for development teams in this ‘Second Age of Agile’.

Plandek is delighted to confirm that it now provides an integration with CI/CD tools (such as Jenkins, CircleCI, GoCD and TeamCity) to provide yet more insight to its growing customer base.

Plandek now offers the most complete end-to-end view of the software delivery cycle to enable Agile software delivery teams to greatly reduce delivery risk and improve delivery efficiency.

Most organisations apply an Agile methodology to some or all of their software development, and there is growing recognition of the need for improved Agile governance. Strong Agile governance ensures that the end-to-end Agile development process is transparent, with meaningful metrics to quantify and mitigate delivery risk and ensure optimal efficiency.

The Plandek BI and analytics platform is unique in providing the largest library of delivery, engineering and Agile metrics suitable for all levels within the technology organisation – from the teams themselves, to the CIO.

Plandek, the rapidly growing SaaS provider of software delivery metrics and analytics, was co-founded in 2017 by Dan Lee (founder of Globrix) and Charlie Ponsonby (founder of Simplifydigital). It has struck a chord with its unique take on end-to-end software delivery metrics and analytics, accessed in near real-time via the Plandek dashboard.

Plandek is growing very rapidly and works with global organisations such as Reed Elsevier, News Corporation, Autotrader – as well as a growing portfolio of European technology-led businesses such as Preqin and Secret Escapes.

Dan Lee, Co-CEO of Plandek commented on the new development: “We are delighted to announce Plandek’s integration with the full suite of CI/CD tools.  It gives our customers a unique and powerful view of the end-to-end software delivery cycle – enabling them to greatly reduce software delivery risk”.

For enquiries please contact Darina Lysenko: dlysenko@plandek.com.

Getting the Agile balance right

The author of this guest blog was Director of Quality Engineering at a global information analytics business specialising in science and health. The author was responsible for overall Quality Engineering strategy across one of the business units before recently taking a position closer to home. 

 

Whether you’re talking about governments or Agile development, the decision to centralise power and aim for consistency or to let individual groups self-govern can be polarising. As ever, there are no black and white answers.

We have learned through experience that different groups move at different speeds and work in different ways for very legitimate reasons. We accept this as a good thing and it reflects the spirit of Agile. Equally, there needs to be some degree of consistency for other legitimate reasons. Getting that balance right is both a challenge and an opportunity.

My team covers quality engineers across various locations and various stages of Agile maturity. Most teams hold the usual Scrum ceremonies and do two-week sprints. Other Agile practices vary and teams are empowered to find ways of working that work best for them. Another thing that varies is our use of tools. We have consistency across the board for some, such as the use of Jira, but the tools in use for CI/CD, version control, static code analysis etc. can vary.

“Plandek was the only tool we found that let us integrate data from multiple tools and Jira instances into a single dashboard.”

This brings me to why we decided to work with Plandek for Agile metrics. Other alternatives we looked at required teams to work in a consistent way, using the same tools and even a single Jira instance – clearly not practical for us. Plandek was the only tool we found that let us integrate data from multiple tools and Jira instances into a single dashboard.

Since we’re running multiple Jira instances and a couple of hundred different ‘projects’, or workstreams, just having a tool like Plandek that can integrate and present Jira data is proving extremely valuable. We started implementing it in our group late last year and have completed the initial rollout phase of over 100 projects.

In terms of metrics, the individual teams are experimenting to determine which metrics are best suited to the way they work. Quite a few squads, for example, work very true to the Kanban style. Anything to do with velocity, or that requires estimation, is not relevant to them because they don’t estimate story points. Teams are free to use metrics that are useful to them. We also have a handful of rolled-up metrics that we report on monthly, and Plandek has helped reduce what was previously a lot of time-consuming manual work to gather the data across the group.

“Just like Agile itself, people and teams embrace [Plandek] at different speeds. Some immediately see great value in being able to measure certain things and identify the bottlenecks.”

Before Plandek we had no way of gathering metrics in a rolled-up fashion at all, so we’re definitely seeing value. However, introducing Agile metrics is not without its challenges. Just like Agile itself, people and teams embrace it at different speeds. Some immediately see great value in being able to measure certain things and identify the bottlenecks. Others are concerned about being overly monitored, so we have to reassure those people that we aren’t using metrics as a stick or viewing data out of context. To do this, we agreed as a management team to turn off the ability to drill down into individuals’ data and then let individual teams decide if they want to turn it back on again.

What’s helped drive metrics adoption most successfully are the engagement sessions we’ve held with Plandek. In these, Plandek works closely with a particular team that volunteers to showcase their actual (not demo) data to others in the company. Those teams really gain a lot of value from being able to learn from the Plandek consultants, and they also get visibility for their work across the company.

Going back to my original point, there’s nothing black and white about Agile. Trying to achieve total consistency goes against the whole Agile ethos. At the same time, some level of consistency, especially when you’re scaling, is desirable because it provides common ground for sharing best practices and continuous improvement. It’s much easier for people to adjust to a gain than a loss, so I would advise other companies in our situation to hang onto some reins of consistency while empowering teams with the flexibility to adapt Agile – and metrics – to their unique requirements.

 


February 2019 – Plandek debuts new UK advertising campaign

Plandek, the rapidly growing SaaS provider of Agile BI, analytics and reporting, debuted its first UK advertising campaign in Canary Wharf, London in February.

The “FrAgile” campaign dramatizes the challenge of implementing large-scale Agile software development methodologies in large organisations. It alludes to the need for meaningful metrics to measure progress on the journey towards Agile “engineering excellence”. Without meaningful metrics down to the team level, teams are unable to self-diagnose and self-improve their processes – with the result that Agile can become Fragile.

The digital outdoor campaign is focused on the Greater London area and supports a direct marketing campaign targeted at large enterprise CTOs and CIOs.

Charlie Ponsonby, Co-CEO of Plandek commented:

“We are delighted to be launching our first commercials in the UK market.  Plandek is really striking a chord with CTOs under pressure to ensure that their Agile software delivery teams deliver great results. And the campaign dramatizes how Plandek’s innovative Agile BI platform can help meet that challenge.”

Plandek (www.plandek.com) is an Agile BI, analytics and reporting platform that helps technology teams deliver Agile software development projects more productively and predictably.

Plandek’s big data platform mines the data history from dev teams’ tool sets (e.g. Jira, Git) to reveal and track levers, right down to the team and individual level, that are highly predictive of project productivity and predictability – in order to significantly improve Agile project outcomes.

Plandek is based in London and global clients include Reed Elsevier, News Corporation and Worldpay.