
Measurement and Agile – Oil and Water? (Part 3)

Welcome back! As we saw in parts one and two of this blog series, many Agilists shy away from measurements and metrics, but we discussed ways not only to make it safe for organizations to measure but also to derive great value from the insights gained.

In part three, we'll look at some useful metrics you might consider adopting in your organization, and at how you can automate or instrument some of the data gathering.

Measurements to Consider

Here are some good starting points and examples for measurements you might consider:

| Area | Question | Metric | Notes |
|---|---|---|---|
| Performance | How quickly do we deliver? | Feature cycle time | Time from commitment to delivery of a feature |
| | | Throughput | Number of work items completed per time unit (e.g. stories/week or features/month) |
| | | Velocity | Sum of story points delivered per sprint (or time unit) by a team |
| | How predictable are we? | % of planned story points (or # of work items) delivered compared to plan | Requires planning ahead/forecasting/commitment |
| | | PI/Program Predictability | For SAFe release trains: sum of actual business value delivered in a PI vs. plan |
| Quality | Are we releasing quality? | Defect escape rate | # of high/medium-priority defects within two weeks of a release (could be normalized against # of stories released) |
| | | Defect density | From SDPI: count of defects divided by man-days (team size times the number of workdays in a time bucket) |
| | How many quality issues do we have to deal with? | Size of defect backlog (time to resolve all) | Total # of high/medium-priority defects in the backlog divided by average team throughput |
| Business Outcomes | Are we delivering value to our customers? | Product NPS | Use standardized NPS measurements |
| | | Business value point velocity | Sum of business value points delivered per month (assuming business value points are captured) |
| Health | How healthy is our culture? | Engagement | Engagement survey |
| | How's our team morale? | Happiness | Regular measurements of team happiness, e.g. by sprint |
| | Are our teams stable? | Team churn (%) | Sum of team members added/removed (absolute) within a time period, divided by team size at the start of the period |
| | How mature are our teams? | Team maturity | Requires agreement on maturity stages and regular measurement |
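
To make a few of these formulas concrete, here is a minimal sketch of cycle time, throughput, and team churn, assuming work items are available as simple records. The field names and sample data are illustrative only and not tied to any particular tool:

```python
from datetime import date

# Illustrative work-item records; in practice these would be exported
# from your ALM. The field names here are assumptions, not a real schema.
work_items = [
    {"id": "F-101", "committed": date(2018, 3, 1), "delivered": date(2018, 3, 20)},
    {"id": "F-102", "committed": date(2018, 3, 5), "delivered": date(2018, 4, 2)},
    {"id": "F-103", "committed": date(2018, 3, 12), "delivered": date(2018, 4, 9)},
]

def feature_cycle_time_days(item):
    """Time from commitment to delivery of a feature, in days."""
    return (item["delivered"] - item["committed"]).days

def throughput(items, start, end):
    """Number of work items completed within the given period."""
    return sum(1 for i in items if start <= i["delivered"] <= end)

def team_churn_pct(added, removed, size_at_start):
    """Members added/removed (absolute) divided by team size at period start."""
    return 100.0 * (added + removed) / size_at_start

print([feature_cycle_time_days(i) for i in work_items])             # [19, 28, 28]
print(throughput(work_items, date(2018, 3, 1), date(2018, 3, 31)))  # 1
print(f"{team_churn_pct(added=1, removed=1, size_at_start=8):.1f}%")  # 25.0%
```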

Sources of Data

Now that we've seen a number of decent measurements, where should all this data come from? A lot of the quantitative data may already be available in your ALM, such as Jira, VersionOne, or CA Agile Central (a.k.a. Rally). That said, sometimes small configuration changes are required to make sure certain data is captured properly. In other cases, process changes may be needed (for example, when not all teams size stories in story points, or don't do so consistently), but don't underestimate the potential impact of process changes across all your teams and the effort required to implement them. In this example, you might be better off switching to a throughput metric that simply uses a count. Again, we don't want the need for certain measurements to dictate how teams should practice Agile.
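
If Jira is your ALM, a count-based throughput measure like this can often be pulled over its REST search API. The sketch below counts items resolved in a date range; the base URL, credentials, and JQL are placeholders you would adapt to your own instance:

```python
import requests

# A rough sketch of a count-based throughput pull from Jira's REST search
# endpoint. The instance URL and credentials below are placeholders.
JIRA_BASE = "https://your-company.atlassian.net"  # hypothetical instance
AUTH = ("metrics-bot@example.com", "api-token")   # placeholder credentials

def completed_count(project_key, start, end):
    """Throughput as a simple count of items resolved in a date range."""
    jql = (f'project = {project_key} AND resolutiondate >= "{start}" '
           f'AND resolutiondate <= "{end}"')
    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": jql, "maxResults": 0},  # we only need the total
        auth=AUTH,
    )
    resp.raise_for_status()
    return resp.json()["total"]

print(completed_count("ABC", "2018-03-01", "2018-03-31"))
```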

Your ALM may also offer various dashboards, visualizations, or “reports” that allow you to analyze the data. That said, take the output with a grain of salt. I’ve seen various situations where aggregating data across teams ran into challenges, for example:

  • Not all teams size their work items, or they size only some of them.
  • Some teams run sprints (with potentially different iteration lengths), others practice Kanban (without iterations).
  • The ALM’s project structure is different from how you need to aggregate the data.
  • Defects aren’t captured (or only partially) in the ALM but reside in a separate defect tracking system that is not integrated.
  • Defects found during development are mixed in with defects found in production.
  • Story or feature statuses are used differently across teams, which affects cycle-time calculations (one mitigation is sketched after this list).
  • Not all stories and defects roll up consistently to features or epics.
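
One way to address the status-mapping issue is to translate each team's statuses into a canonical workflow before computing anything. A minimal sketch, with invented team mappings standing in for each team's real ALM configuration:

```python
# Map team-specific status names onto one canonical workflow so that
# cycle times and counts can be aggregated across teams. The mappings
# below are invented examples.
STATUS_MAP = {
    "team-a": {"Backlog": "todo", "Doing": "in_progress", "Accepted": "done"},
    "team-b": {"New": "todo", "In Dev": "in_progress",
               "QA": "in_progress", "Closed": "done"},
}

def normalize_status(team, raw_status):
    """Translate a team-specific status into the canonical workflow."""
    try:
        return STATUS_MAP[team][raw_status]
    except KeyError:
        raise ValueError(f"No mapping for status {raw_status!r} on {team}")

# Only transitions into the canonical 'done' state count as delivery,
# regardless of what each team calls that state locally.
print(normalize_status("team-a", "Accepted"))  # done
print(normalize_status("team-b", "QA"))        # in_progress
```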

Larger organizations may operate a multitude of different ALM systems, and some teams might use only physical boards or lightweight, unstructured tools such as Trello.

A note on business outcome data: in practice, I find that this data is often not consistently stored and maintained in a system of record. ALM and Project Portfolio Management (PPM) systems tend to focus on work items (large and small), while HR and performance management systems may store organizational and individual goals, but these often don’t map cleanly to business outcomes.

Qualitative data is, for reasons also outlined in part one of this series, more difficult to come by, since it is usually not captured automatically by a system that’s already in place; this is especially true of health-related measurements. Ways to capture this data include team-based surveys, e.g. at the end of a sprint, PI, or quarter, with Scrum Masters or Agile Coaches recording the teams’ votes or responses in a consistent format, such as a spreadsheet. Consistency is important so that the data can later be aggregated across teams.
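
As a sketch of what that consistency buys you: if every Scrum Master captures votes in the same simple schema (here, hypothetical CSV files with team, sprint, and vote columns), a roll-up across teams becomes a few lines of code:

```python
import csv
from collections import defaultdict
from pathlib import Path
from statistics import mean

# Aggregate team happiness votes captured in per-team CSV files. The
# folder layout and column names (team, sprint, vote) are assumptions;
# the point is that a shared schema makes cross-team roll-ups trivial.
def load_votes(folder):
    votes = defaultdict(list)  # (team, sprint) -> [votes]
    for path in Path(folder).glob("*.csv"):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                votes[(row["team"], row["sprint"])].append(int(row["vote"]))
    return votes

def happiness_by_sprint(votes):
    """Average happiness per sprint across all teams."""
    by_sprint = defaultdict(list)
    for (_, sprint), vs in votes.items():
        by_sprint[sprint].extend(vs)
    return {s: round(mean(vs), 2) for s, vs in sorted(by_sprint.items())}

print(happiness_by_sprint(load_votes("happiness_surveys")))
```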

In my next post (Part Four), I will provide some more ideas and guidelines around measuring important things while still staying true to the Agile principles.

By René Rosendahl, AgilityHealth Product Strategist
