Measurement and Agile – Oil and Water? (Part 2)

In this blog series, we’re continuing to explore measurement and metrics in the Agile space to help organizations and teams be more successful. If you missed part one, you can read it here.

Below, you’ll find a few more recommendations for establishing and rolling out an organizational measurement and metrics framework.

3. Make sure you take a balanced view

Software development, organizational health, and performance are complex subjects and deserve a balanced view. Focusing on only one or a few specific areas may lead to local optimizations while creating systemic issues. If we only measured and optimized for system throughput, we might neglect overall quality, which often falls by the wayside when teams are pushed to deliver too much too quickly. People may also try to “game the system” because they’re incentivized to make only one specific metric look better.

I’m not promoting a full-blown implementation of the somewhat outdated balanced scorecard, but I do encourage covering most or all of the cornerstones of an organizational ecosystem, such as delivery performance, quality, culture, achievement of business outcomes, and customer satisfaction. That way, undesired consequences are likely to become visible, and we see the whole picture and understand the implications of our choices.

4. Gather data on a regular basis and in a consistent manner

Once suitable metrics are identified, gather the data in a consistent manner and on a regular basis. Consistency is important because it allows data to be trended over time and patterns to be identified. A regular cadence ensures the data is available at a resolution appropriate for both its velocity (the rate at which it changes) and the frequency of inspection. Processes like Scrum, SAFe, or Kanban often have natural cadences which lend themselves to regular data capture, such as every sprint, week, or program increment (PI). You don’t want to capture so frequently that you collect noise, or so infrequently that you miss important data points. Some measurements naturally require longer cycles: a customer net promoter score (NPS), for example, is likely more meaningful monthly or quarterly than weekly.
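To make this concrete, here’s a minimal Python sketch of cadence-based capture; the record fields and sprint naming are hypothetical rather than a prescription from any particular tool:

    # Capturing the same fields at the same cadence keeps the data
    # consistent, trendable, and comparable over time.
    from dataclasses import dataclass

    @dataclass
    class SprintMetrics:
        sprint: str            # cadence unit, e.g. "2024-S07"
        throughput: int        # work items completed this sprint
        escaped_defects: int   # defects found in production this sprint

    history: list[SprintMetrics] = []

    def record_sprint(sprint: str, throughput: int, escaped_defects: int) -> None:
        """Append one snapshot per sprint -- never overwrite past data."""
        history.append(SprintMetrics(sprint, throughput, escaped_defects))

    record_sprint("2024-S06", throughput=14, escaped_defects=2)
    record_sprint("2024-S07", throughput=17, escaped_defects=1)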

5. Get value from the data

All efforts up to this point will be meaningless if you can’t derive value from the data you’re gathering. So you’ll want to review the data on a regular basis (e.g. monthly) with key stakeholders. During these reviews, it’s important to not only look at the data, but to derive actionable insights from it. The more you can articulate what is happening, why it is happening, and “now what” (what actions should be taken), the more valuable these reviews become. It will often be helpful not just to study the current snapshot of the metrics, but to put them in context and understand the trend over time. For example, your actions could be quite different for a current NPS of -20 depending on whether it was -50 or +50 the last time you looked.
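Here’s a minimal Python sketch of that “trend over snapshot” idea; the NPS history values are made up for illustration:

    # Hypothetical NPS history: the current score alone (-20) doesn't
    # tell you whether things are recovering or deteriorating.
    def describe_trend(history: list[float]) -> str:
        """Compare the latest value against the previous one."""
        current, previous = history[-1], history[-2]
        delta = current - previous
        if delta > 0:
            return f"NPS is {current}, up {delta} from {previous}: likely recovering."
        if delta < 0:
            return f"NPS is {current}, down {abs(delta)} from {previous}: investigate what changed."
        return f"NPS is flat at {current}."

    print(describe_trend([-50, -20]))  # same -20, but clearly improving
    print(describe_trend([50, -20]))   # same -20, but a sharp decline

The same snapshot produces two very different “now what” conversations once the prior data point is in view.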

To avoid cognitive overload, limit the number of metrics and charts you’re sharing, while not summarizing at such a high level that the information loses meaning. Aggregate where necessary, as long as doing so doesn’t obfuscate key characteristics and nuances. This might also require you to normalize metrics and calculate ratios in order to compare apples to apples, e.g. use percentages instead of absolute numbers, or divide by time or team size to eliminate differences in time frames or team composition.
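As a hedged illustration of such normalization (the metric, team sizes, and numbers are hypothetical), dividing a raw count by team size and time frame can change the story it tells:

    # Hypothetical example: one team's throughput across two quarters of
    # different lengths and staffing. Raw counts alone would suggest a decline.
    def throughput_per_person_week(items_done: int, team_size: int, weeks: int) -> float:
        """Work items completed per person per week."""
        return items_done / (team_size * weeks)

    q1 = throughput_per_person_week(items_done=60, team_size=8, weeks=13)  # ~0.58
    q2 = throughput_per_person_week(items_done=45, team_size=5, weeks=12)  # 0.75

    print(f"Q1: {q1:.2f} items/person/week")
    print(f"Q2: {q2:.2f} items/person/week")
    # The absolute count dropped (60 -> 45), but the normalized rate improved
    # once team size and time frame are factored out.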

Give the audience time to digest the information and ask questions; many may not be familiar with the content or format at first and might need time to absorb it. Look for common patterns and for what seems to be working well, but stay away from comparing teams (or release trains, etc.) directly against each other.

The review meetings should not be about assigning blame, but about finding areas of opportunity, defining clear actions with owners, and following up in order to make the organization better and help it grow. If people feel they’re being punished for “poor” measurements, they’re incentivized to “fly below the radar” and do whatever they can to make things (artificially) look good, just so nobody calls them out. These negative behaviors are a breeding ground for unintended consequences[1], including gaming.

Lastly, let’s not forget to also celebrate successes instead of just looking for problems. Take what seems to be working and see if it can be shared and replicated.

6. Iterate on the measures and the process

“Inspect and adapt” applies in this domain as well: start small, get value quickly, and iterate on everything, from what is being measured, to how it’s captured, to the way you review. Don’t lose sight of the ultimate goal, which should be to derive value and help the organization and its teams get better.

Keep in mind that while measurements are helpful, they are always a simplification of a complex adaptive system and can therefore never capture every aspect. All measurement systems have gaps and can never fully describe an organizational context and its dynamics.

“Not everything that can be counted counts and not everything that counts can be counted.” (attributed to Albert Einstein)

Next Up

Hopefully these suggestions got you thinking about measurements, or even got you started. In part three of this series, we’ll take a closer look at some sample goals and measurements to consider, and at how you might go about automating the gathering and analysis of your data.

By René Rosendahl, AgilityHealth Product Strategist
Image source: Carlos Muza via Unsplash

[1] In one organization, the Help Desk staff was measured on not keeping support tickets open for more than a week. The intent was admirable, in that the technicians were supposed to help users in a timely manner and fix their problems. In practice, I was told by the Help Desk on several occasions that they would have to close my unresolved ticket because it had been open for too long and that I was supposed to just “resubmit” it as a new ticket, so they could continue to work on the issue. Obviously, they never measured the users’ satisfaction with their services.
