
Measuring Agility: It’s Not The Same


Metrics of the Past Have Taught Us A Lesson

Many traditional KPIs can instigate and proliferate bad behaviors in software development. In the past, programmers have measured their progress by how many lines of code they’ve written, quality was measured by the number of bugs fixed, and user needs were considered met by checking off a list of requirements.

Traditional software development approaches, the infamous Waterfall methods, measured several common KPI ingredients: performance, process, and delivery.

Performance goals → fast delivery
Process goals → lean and at the lowest cost
Product or service delivery goals → all or nothing requirements, error free and perfect

On the surface these all seem relevant and worthy of measuring. What could possibly go wrong?

Goals that drive bad behavior (each carrying its own unintentional outcomes):

Deliver things fast

Focus on low cost

Make processes lean

Aim for “error free” processes and products

Accept all or nothing

Metrics in themselves can lead people to figure out ways to play the “numbers game”. They will do anything to make the numbers look good. Unfortunately, bad metrics have found their way into the agile environment. It’s important to recognize them and repackage them into something productive. Metrics today need to create the kinds of behaviors that cultivate the “being agile” world many of us are striving toward.

Hard vs. Soft Metrics

The metrics of the past focused on what is known as hard metrics. The metrics that support the agile definition of success require soft metrics.

Hard metrics rely on verifiable data points. They use quantitative data points to evaluate a typical work routine and how well the employees carry out their assigned tasks.

Unlike hard metrics, soft metrics use subjective data and interactive responses to determine effectiveness. Soft metrics stress the impact that human capital has on business outcomes.

Hard metrics aren’t all bad, but soft metrics create a necessary balance. Here are three hard and soft metrics that balance each other and can set a very different tone for the working environment.

Hard Metrics → Balancing Soft Metrics

Performance goals (fast delivery) → Healthy teams

Process goals (lower cost) → Minimum, high-value products

Product/service goals (error free, perfection) → Learn fast and adjust

Another way to think about it: hard metrics indicate whether you are doing agile; soft metrics measure whether you are being agile.

Agile Redefines Success

The Agile approach fundamentally redefined success factors for software delivery. People no longer work in silos, and their success can no longer be measured in an individual capacity. If the product fails, the team fails, and everyone on that team fails. When success is measured by the team’s effort, the opportunity for finger-pointing is significantly reduced.

Let’s do a quick exercise to determine how you are measuring success. Make a check beside each criterion that you are using. Do you have more agile or legacy checks?

Agile Definitions of Success vs. Legacy Definitions of Success

[ ] Agile: Measure how well value was received
[ ] Legacy: Measure how well the plan was followed

[ ] Agile: Measure product success
[ ] Legacy: Measure project success

[ ] Agile: Measure if an outcome was achieved
[ ] Legacy: Measure if the documented requirements were followed

[ ] Agile: Measure if quick feedback was received and adjustments were made quickly
[ ] Legacy: Measure if changes were avoided and the solution was sold as-is

[ ] Agile: Measure collaboration
[ ] Legacy: Measure individual contributions to the team and target bottlenecks

[ ] Agile: Measure shared learning achievements
[ ] Legacy: Measure individualized learning achievements

[ ] Agile: Measure and reward team cohesiveness
[ ] Legacy: Measure and reward individual contributions

[ ] Agile: Measure the ROI (return on investment)
[ ] Legacy: Measure cutting costs

Agile project metrics are different because they are people-centric. They recognize that people drive the outcomes. In the agile mindset, success is more than delivering a product; it is about what the team did (or didn’t do) in its pursuit of creating that product together.

Consider the Values Defined in the Agile Manifesto

Individuals and Interactions Over Processes and Tools

By just counting lines of code, developers don’t need to collaborate with their colleagues or even talk to others in the development life cycle. As long as they are producing code, whether it’s good, bad, or even unusable, they’re meeting the goal.

The intent of an agile approach is for developers to collaborate with their colleagues, talk through designs, and transfer knowledge as they are producing code. This approach will likely feel slower at first.  In the long run, this team collaboration will actually enable faster (more complete) work because we deliver usable software, not just lines of code.

Working Software Over Comprehensive Documentation

Using bug fixes as a measurement misses the point: why are there so many mistakes in the first place? In the Agile mindset, instead of counting how many defects you close and how fast you close them, count the days of being “error free” (i.e., having delivered “working software”).

I’ve seen signs in factories that display the number of days without an accident. We should create similar information radiators to measure software quality. If a team can go 90, 60, or even 30 days with no reported urgent bugs, that’s a metric worth celebrating! If 30 days is not attainable, get to the root of what’s causing the urgent bugs in the first place.
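As a rough sketch of such a radiator (the defect-log structure and severity labels below are invented for illustration, not taken from any particular bug tracker), the streak could be computed like this:

```python
from datetime import date

# Hypothetical defect log: (date reported, severity) pairs.
bug_log = [
    (date(2024, 1, 5), "urgent"),
    (date(2024, 1, 20), "minor"),
    (date(2024, 2, 2), "urgent"),
    (date(2024, 3, 1), "minor"),
]

def days_without_urgent_bugs(log, today):
    """Days elapsed since the most recent urgent bug was reported."""
    urgent_dates = [reported for reported, severity in log if severity == "urgent"]
    if not urgent_dates:
        return None  # no urgent bugs on record at all
    return (today - max(urgent_dates)).days

streak = days_without_urgent_bugs(bug_log, today=date(2024, 4, 1))
print(f"{streak} days without an urgent bug")  # → 59 days without an urgent bug
```

A team dashboard could display this number daily; resetting it to zero whenever an urgent bug arrives makes the cost of poor quality visible without blaming any individual.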

Customer Collaboration Over Contract Negotiation

Don’t think that having a lot of meetings means that true collaboration is taking place. In many environments people walk out of a meeting in agreement but once the work starts and something doesn’t go as planned (or something unplanned gets identified), the finger pointing starts. More time is spent trying to recall what was or was not decided in the previous meeting rather than trying to move forward to find a solution.

I’ve also seen co-located teams sitting side by side, but all wearing headphones. They can go the whole day without talking. Co-location does not necessarily equal collaboration either.

A good way to measure collaboration is to consider how much time is spent paired with a teammate, how much time is spent working on the same work item, voluntary participation in group problem solving, and transparency about what you are working on: “working out loud”.
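One way to make the first of those signals concrete (the time-log format here is hypothetical, meant only to illustrate the idea) is to compute what share of a team’s logged hours was spent pairing:

```python
# Hypothetical time log: (team member, hours, partner) entries,
# where partner is None for solo work.
time_log = [
    ("Ana", 4.0, "Ben"),
    ("Ana", 2.0, None),
    ("Ben", 4.0, "Ana"),
    ("Ben", 3.0, None),
    ("Cho", 5.0, None),
]

def pairing_ratio(log):
    """Fraction of logged person-hours spent paired with a teammate."""
    total = sum(hours for _, hours, _ in log)
    paired = sum(hours for _, hours, partner in log if partner is not None)
    return paired / total if total else 0.0

print(f"Paired time: {pairing_ratio(time_log):.0%}")  # → Paired time: 44%
```

Like any soft-leaning metric, a number like this is a conversation starter, not a target; chasing it for its own sake would recreate the “numbers game” problems described earlier.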

Responding to Change Over Following a Plan

The concept of “responding to change” often leads novice teams to believe that planning is not necessary, and that they should simply deliver whatever the product owner has prioritized next. This can lead to significant churn on projects, as priorities seem to change on a daily basis.

Responding to change does not mean the team has no input or power to push back, especially when it’s justifiable. You want to use measurements that show the team is responding to change but that it’s not creating chaos. Capacity, risk, value, and impact on the team all need to be considered when a change is proposed.

What’s Next

Are you using metrics that are undermining your organization’s shift to business agility?  Maybe it’s time to make some changes.

Design a framework to architect and measure success for your agile organization in our Agile Metrics and Value Management course.

One of the benefits of Agile is learning quickly from new information and making the necessary adjustments as soon as possible. So, how do you measure success in an agile environment? Read Ali’s post on Metrics for Business Agility for ideas.

All the best,


Editor’s Note: This blog post was previously published by B2T on our previous website. Due to its popularity, Kathy has updated its content to be more comprehensive and accurate for the state of today’s environment.

Kathy Claycomb

Managing Partner, Lead Expert

Kathy Claycomb brings over 35 years of experience to the classroom. She has participated in all phases of solution development using everything from agile to waterfall methodologies (and quite a few in between). Before joining B2T, her career spanned roles from application developer to Senior Director of Services at various organizations. Kathy has a broad industry background including transportation, manufacturing, insurance, energy, healthcare, and banking.

Kathy’s first love is teaching, and throughout her career she has always managed to spend a portion of her time instructing. She has an engaging, highly interactive teaching style that ensures students leave the course with a thorough grasp of the material. Her students consistently praise her teaching abilities and her talent for drawing on her personal experience to enhance their learning.

Kathy served as the Technical Editor for Business Analysis for Dummies, 2nd Edition.