Remember that these metrics measure the health of a process but not the health of a product. Not a single one addresses customer outcomes, impact, retention, etc.
I completely agree. The ultimate indicator of success is that the customers are delighted by the product. Delight is hard to measure, though. Retention, etc., are indirect indicators at best because they do not point to specific actions. I'd rather have conversations with the customers. That's one of the reasons I think it's a good idea for engineers to rotate through tech support.
Are there precursor metrics for products that are not released? Pre-production metrics?
Any work (and an entire product is the worst-case scenario) that is not in your customer's hands is a liability on the books. The Lean folks call it "inventory." It is money spent that is not producing revenue—a net loss. The metrics that matter are measuring the complete system of work. Local optimization (a single person or team) doesn't work because making a team faster doesn't usually increase the speed of the entire system. That fast team will just be producing inventory, which cannot be taken up by the next (slower) team. The main exception is a fully autonomous cross-functional team that handles an entire story, doing everything from product discovery all the way through delivery to the customers. The other exception is a team that works so slowly that the downstream teams are starved for work, waiting for them (a "bottleneck").
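The "fast team just produces inventory" point can be made concrete with a toy model: system throughput is capped by the slowest stage, so speeding up any other stage only grows the pile of unfinished work queued in front of the bottleneck. A minimal sketch (the stage names and weekly rates are invented for illustration):

```python
# Toy pipeline: items/week each stage can finish. Hypothetical numbers.
rates = {"discovery": 10, "development": 6, "testing": 3, "release": 8}

# The whole system can deliver no faster than its slowest stage.
system_throughput = min(rates.values())  # 3 items/week (testing)

# Doubling development's speed doesn't change delivery at all...
rates["development"] = 12
assert min(rates.values()) == 3  # still 3 items/week

# ...it only grows the inventory piling up in front of testing.
inventory_growth = rates["development"] - rates["testing"]  # 9 items/week
print(system_throughput, inventory_growth)
```

The same arithmetic shows the two exceptions: a fully cross-functional team is a one-stage pipeline (its rate *is* the system rate), and raising the bottleneck stage's rate is the one local change that speeds up everything.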
Enjoyed the blurb, Allen. It made me realize that bickering over the “right and wrong” way of measuring is useless. The bottom line is, any measure that helps you ask questions that lead to better delivery of value is useful. Lead Time, Cycle Time, Throughput, WIP, Mean Time To Recovery, Complete & Accurate, etc. Every “Agile Guru” claims one set or another is “best” or “most useful.”
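For what it's worth, most of the flow metrics in that list fall out of the same per-item timestamps, which is part of why they're easy to collect and easy to argue about. A rough sketch, using made-up tickets with created/started/delivered dates:

```python
from datetime import date

# Hypothetical tickets: (created, work started, delivered). Dates are invented.
tickets = [
    (date(2024, 1, 1), date(2024, 1, 5), date(2024, 1, 12)),
    (date(2024, 1, 3), date(2024, 1, 4), date(2024, 1, 10)),
    (date(2024, 1, 8), date(2024, 1, 15), date(2024, 1, 20)),
]

# Lead time: request to delivery. Cycle time: start of work to delivery.
lead_times = [(done - created).days for created, _, done in tickets]
cycle_times = [(done - started).days for _, started, done in tickets]

avg_lead = sum(lead_times) / len(lead_times)    # (11 + 7 + 12) / 3 = 10.0
avg_cycle = sum(cycle_times) / len(cycle_times) # (7 + 6 + 5) / 3 = 6.0

# Throughput over the observed window, and WIP on any given day.
window_days = (date(2024, 1, 20) - date(2024, 1, 1)).days  # 19
throughput = len(tickets) / window_days  # items delivered per day
wip_on = lambda d: sum(1 for _, s, done in tickets if s <= d < done)
print(avg_lead, avg_cycle, wip_on(date(2024, 1, 6)))
```

Which of these numbers matters depends entirely on the questions you're trying to ask, which I take to be your point.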
I’d be willing to bet that, in your experience, you’ve seen the helpful ones be helpful. And that’s the point. A solid leader or team that asks questions about delivering value to their customers will eventually improve how they deliver that value.
And the debate will rage on. And the debate will be just as useless.
There is no "right" or "wrong," but there are plenty of metrics that don't measure anything useful or actionable. Throughput metrics like velocity and lines of code per day are actively destructive, for example, as are metrics like how often you hit an estimate. Given that what we need to improve is the entire system of work, the useful metrics are the ones that help us identify and pinpoint things like bottlenecks. Once a bottleneck is discovered, we might need metrics to understand what the problem is and how to fix it. I say "might" because, more often than not, metrics are unnecessary. The real issue, in other words, is using metrics at all in situations where they provide no value. Most companies collect far too many metrics. Software is created by humans (even if they use AI tools), and metrics applied to humans are rarely useful. As Deming said, we can only measure 3% of what's useful.
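On pinpointing bottlenecks: if you track how long items sit waiting in front of each stage, the likely bottleneck is simply the stage with the longest queue, and that one number often tells you where to look before any further measurement. A sketch with invented wait times:

```python
# Hypothetical average days an item waits in front of each stage.
wait_days = {"design": 1.0, "dev": 2.5, "code review": 6.0, "qa": 1.5}

# The likely bottleneck is wherever work queues up the longest.
bottleneck = max(wait_days, key=wait_days.get)

# Its share of total waiting suggests how much fixing it could help.
total_wait = sum(wait_days.values())           # 11.0 days of queueing
share = wait_days[bottleneck] / total_wait     # ~0.55 of all wait time
print(bottleneck)  # code review
```

Once the bottleneck is found, the fix is usually a conversation with the people involved rather than another dashboard, which is the point about metrics applied to humans.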