The Philosophy of
Agile Metrics
Being Lean with Metrics
For Agile metrics to be in line with
the Agile philosophy, the following
conditions should be met:
• Use the minimum number of metrics
that will provide all information
necessary to meet business goals
• Measure outcomes, not outputs
• Measure results, not activity
• Measure work items completed, not
time spent per task
• Assess trends, not snapshots
• Provide feedback at frequent and
regular intervals
• Automate, automate, automate
Delivering Business Value
In Agile, metrics are a means to deliver
business value to the client. Agile
therefore recommends a lean, minimal
set of metrics that genuinely serves the
purpose of the project.
Agile metrics should be simple, lean,
outcome-based, and easy to track and
report frequently.
The primary metric in an Agile
project is whether working
software actually exists and is
demonstrably suitable for its
intended purpose. This is
determined empirically, by
demonstration, at the end of every
single iteration and product
increment.
All teams and projects are
encouraged to focus most of their
measurement attention on this fact.
All other metrics are subordinate
to that objective and the overriding
goal of keeping the focus on rapid
delivery of quality, working
software.
Team Performance Metrics
Calculating Velocity
Velocity = ∑ story points fully delivered in a sprint

Note that partially delivered stories do not
count towards a team's velocity (just like
a half-delivered pizza!)
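As a minimal sketch, the velocity sum can be expressed in a few lines of Python; the story records and their fields are illustrative assumptions, not taken from the original text:

```python
# Sketch: velocity counts only fully delivered ("done") stories.
# The story dictionaries and their fields are illustrative assumptions.

def sprint_velocity(stories):
    """Sum the story points of stories fully delivered in the sprint.

    Partially delivered stories contribute nothing, just like a
    half-delivered pizza.
    """
    return sum(s["points"] for s in stories if s["done"])

stories = [
    {"points": 5, "done": True},
    {"points": 3, "done": True},
    {"points": 8, "done": False},  # partially delivered: does not count
]
print(sprint_velocity(stories))  # 8
```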
Forecasting Future
Release Content
While product backlogs are not usually
estimated for the full project scope, it is,
nonetheless, important to predict what
the broad scope will be for future
releases. This helps facilitate product
support training or marketing.
Technical leadership and technical
architects usually make a high-level
estimation of the future stories using a
rolling average of the team's velocity
taken from the last three sprints.
The boundary line can change frequently,
due to:
• Team velocity fluctuations
• Refinement of the epic backlog into
smaller stories
• Backlog re-ordering
A release burn-up chart is used to display
the likelihood of delivering all desired
features by the release date.
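The rolling-average forecast described above can be sketched as follows; the velocity history and remaining backlog size are illustrative numbers:

```python
import math

def rolling_velocity(velocities, window=3):
    """Rolling average of the last `window` sprint velocities."""
    recent = velocities[-window:]
    return sum(recent) / len(recent)

def sprints_to_release(remaining_points, velocities):
    """Rough forecast of sprints needed to burn the remaining backlog."""
    return math.ceil(remaining_points / rolling_velocity(velocities))

history = [18, 22, 20, 24]               # last three sprints average to 22
print(sprints_to_release(110, history))  # 5
```

Because the rolling average shifts each sprint, the forecast boundary line moves with it, which is one of the fluctuations noted above.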
Calculating Team
Productivity
When release forecasting, consider
changes in team capacity due to
holidays, sickness or training.
Busy periods such as summer or
compliance periods for mandatory
training can create velocity “noise” that
can mask underlying performance issues,
and so it is sometimes useful to create a
metric that filters these variances out.
Productivity = Sprint Velocity / Actual Sprint Effort
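A minimal sketch of this productivity ratio, with illustrative numbers, shows how it filters capacity noise out of raw velocity:

```python
def productivity(sprint_velocity, actual_sprint_effort):
    """Velocity normalised by the effort actually available in the sprint,
    filtering out capacity noise from holidays, sickness or training."""
    return sprint_velocity / actual_sprint_effort

# Two sprints with the same underlying productivity despite a holiday dip:
print(productivity(20, 10))  # 2.0
print(productivity(12, 6))   # 2.0: velocity fell only because capacity did
```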
Team Velocity
Team Velocity is perhaps the single
most important metric used in Agile
development.
It illustrates two key characteristics
of Agile projects:
1. Forecast of the content of future
releases
2. Variations in team productivity
over time
Caution should be applied to the
way team velocity metrics are
displayed and used across Agile
teams.
Velocity competition encourages
teams to find ways to artificially
increase their velocity by increasing
the size of stories or cutting corners
in development.
The Sprint Burn-Down Chart

[Chart: sprint burn-down, plotting effort remaining (0 to 50) against working days (1 to 10)]
Purpose: Track how much work remains in
the Sprint
Goal: reach the x-axis, i.e. complete all the
work assigned to the sprint
Things to watch out for:
• A sudden drop – indicating abandoned
stories or tasks
• Flat-lining – indicating stalled progress
• A spike – indicating a change in scope
during Sprint
The sprint burn-down chart is used to
demonstrate the overall progress and
health of the sprint.
Effort remaining is typically measured
using one of the following:
• Estimated Remaining Hours
• Often used – but developers can
feel they are estimating unknown
quantities, and the chart can be
distorted by optimism.
• Outstanding Tasks
• Reflects work actually done, but
makes the assumption that the
amount of work doesn’t change
during the sprint.
• Remaining Story Points
• A direct reflection of actual value
delivered, and identifies likely
shortfall in sprint delivery
Estimated Remaining Hours
When to use:
• More mature teams where effort remaining
is predictable on a more granular level
When to avoid:
• For less mature Agile teams who are less
sure of how much time is required to
complete tasks
Outstanding Tasks
When to use:
• If working on very tight deadlines
When to avoid:
• If tasks are frequently added during the
sprint
Remaining Story Points
When to use:
• Less mature teams where effort remaining is
harder to predict on a more granular level
When to avoid:
• For more mature Agile teams where effort
remaining is predictable on a more granular
level
Chart series: Actual Burndown and Ideal Burndown, where

Ideal Burndown (per day) = Remaining Effort / Total Days in Sprint
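The ideal burn-down line implied by that formula can be sketched like this; the sprint length and total effort are illustrative:

```python
def ideal_burndown(total_effort, total_days):
    """Effort remaining at the start of each day on the ideal straight line,
    dropping by (total_effort / total_days) per working day."""
    per_day = total_effort / total_days
    return [total_effort - per_day * day for day in range(total_days + 1)]

line = ideal_burndown(50, 10)
print(line[0], line[5], line[-1])  # 50.0 25.0 0.0
```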
Agile Project Metrics
If scaling Agile, metrics should be
tracked at a project level as well as
at a team level. Keeping track of
whether the Agile project is going to
successfully make the intended
release date at production ready
quality is critical information to any
business that is scaling Agile.
Metrics on a program or release
level can be divided into:
1. Progress metrics
2. Quality metrics
Progress metrics
On Agile projects that run across multiple
releases and teams it can be difficult to
get a view of the overall health and
progress of the Agile project.
On Agile projects it is critical to
understand the status of the overall
project and whether overall performance
is improving over time. This can be done
using:
• A Release burn-up chart
• An epic burn-up chart
Quality metrics
Rather than fixing defects, Agile methods
strive to prevent them. In Agile the quality
of the product should improve throughout
the delivery lifecycle as practices improve
and the team gels together. This can be
done using:
• Defect Rate
• Running Tested Features (RTF)
The Release
Burn-Up Chart
On Agile projects, progress towards
a milestone is usually displayed
graphically using either a burn-
down or burn-up chart:
• Burn-down charts work best for
sprints
• Burn-up charts work best for
releases or epics
In the spirit of Agile, a release burn-
up chart is a simple, visual way for
Agile projects to display progress
towards a release.
Using a burn-up chart, it is clear,
even on a paper printout, what the
effect will be of adding scope to a
sprint.
[Chart: release burn-up, plotting story points (0 to 140) against sprints (1 to 9)]
Purpose: Tracks how much work is complete
Goal: the work-completed line rises to meet the estimated forecast line, curving upwards
as the velocity of the team increases over time
Estimation: Technical leadership and technical architects usually make the estimation on
future stories to meet the release using a rolling average of the team’s velocity taken from
the last three sprints.
Things to watch out for:
• Estimation visibility – watch out that Scrum team members do not get visibility of
the estimates given for individual stories, as this could affect their own estimations
during Sprint Planning sessions
• Flat-lining – indicating that velocity is not increasing over time and that expected
progress toward the release is not being made
Chart legend: estimated forecast of work to be completed in order to meet the release;
work completed.
[Chart: epic burn-up, plotting story points (0 to 800) against sprints (1 to 10); series: Estimated Epic Budget, Actual Story Points, Cumulative Story Points]
The Epic Burn-Up
Chart
Purpose: Tracks how much work is complete per epic
Goal: the Cumulative Story Points line meets the Estimated Epic Budget once all work on
the epic has been completed
Estimation: technical leadership and technical architects usually estimate the future
stories needed to complete the epic using a rolling average of the team's velocity taken
from the last three sprints.
Things to watch out for:
• Expended effort – this report requires an Agile management tool to automate this
process to avoid spending excessive amounts of time and effort creating the report for
each epic
(Chart annotation: re-estimation due to a scope increase.)

On larger-scale Agile projects, and
on many smaller-scale Agile
projects too, epics will span
multiple teams and releases.
Tracking progress at epic level is a
useful way to:
1. Provide visibility into how much
effort is being expended per epic
across different teams, sprints and
projects
2. Make an informed assessment
into how much progress is being
made on the development of
different features to plan for future
releases
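As an illustration of the cross-team roll-up an epic burn-up relies on, the sketch below aggregates completed story points per epic against an estimated epic budget; all epic names, team labels and numbers are hypothetical:

```python
from collections import defaultdict

def epic_progress(completed_stories, epic_budgets):
    """Points completed per epic, across teams and sprints, vs. its budget."""
    done = defaultdict(int)
    for story in completed_stories:
        done[story["epic"]] += story["points"]
    return {
        epic: {"completed": done[epic], "budget": budget,
               "remaining": budget - done[epic]}
        for epic, budget in epic_budgets.items()
    }

completed = [
    {"epic": "payments", "points": 13, "team": "A"},
    {"epic": "payments", "points": 8,  "team": "B"},
    {"epic": "search",   "points": 5,  "team": "A"},
]
print(epic_progress(completed, {"payments": 40, "search": 20}))
```

In practice an Agile management tool performs this aggregation automatically, as the "expended effort" warning below notes.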
Defect Rate
Defect Rate is used to understand and
measure the quality associated with a
given engagement.
It is the ratio of the total number of
defects to the effort expended, a ratio
that can be improved upon and
compared with other similar projects
over time.
Defect Rate allows an Agile project
to put steps in place to continuously
improve the quality of what is
produced every Sprint and to reduce
the amount of effort spent fixing
defects over time.
Measurements required:
1. Number of Defects (by Sprint) = the
number of components not meeting
specifications that need to be repaired or
replaced each Sprint
2. Actual To Date (ATD) = the amount of
effort for completed or commenced work
on a project within a specified timeframe
(e.g. 250 hours per Sprint)
Defect Rate = Number of Defects / Actual To Date (ATD)

The overall target is <= 0.05, but the lower the better.
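A sketch of the calculation against the 0.05 target; the defect and effort figures are illustrative:

```python
DEFECT_RATE_TARGET = 0.05  # overall target from the text: <= 0.05

def defect_rate(num_defects, actual_to_date):
    """Defect Rate = defects found in the Sprint / effort expended (ATD)."""
    return num_defects / actual_to_date

rate = defect_rate(18, 250)  # e.g. 18 defects against 250 hours of effort
status = "red" if rate > DEFECT_RATE_TARGET else "green"
print(f"{rate:.3f} {status}")  # 0.072 red
```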
Potential actions to improve Defect
Rate:
• Scheduling regular Brown Bag
sessions to train team members on the
defect management process and the
defect tracking tool
• More stringent Product Owner sign-off
of User Stories
• More stringent code reviews between
Scrum team members
[Chart: Defect Rate (number of defects / actual to date) by sprint, on a 0 to 0.12 scale, with green and red bands around the 0.05 target. A defect rate in the red band means action must be taken to reduce it and improve quality.]
In Agile development, each Sprint
produces a potentially releasable
increment of 'done' product. However, as
new code is built each Sprint, existing
code is at risk of breaking, affecting the
quality of the product already built.
Running Tested Features (RTF) measures
the quality of the product built to date and
whether existing code remains ‘potentially
releasable’.
RTF can be calculated at Project level or
Release level.
Running Tested Features (RTF)

Calculating RTF:

RTF = (number of completed user stories that still pass all the acceptance tests / total number of completed user stories to date) × 100
Measurements required:
1. No. of completed user stories to
date
This is the number of user stories in the
Release backlog that are built, tested and
accepted by the Product Owner
2. No. of completed user stories that
still pass all the acceptance tests
Potential actions to improve RTF:
• Prioritise regression defect fixes over
new development to ensure the quality
of the product built incrementally
remains high and technical debt is not
accrued
Note: RTF is collected and reported
weekly. It should be recalculated every
time a regression test suite is run. In an
Agile project, tests should be automated
and run as frequently as possible.
[Chart: cumulative # of completed user stories vs. cumulative # of user stories that still pass all the acceptance tests, by sprint (1 to 10)]

Sprint                    1     2     3     4     5     6     7     8     9    10
Stories still passing     5    10    15    22    30    35    60    83    95   107
Stories completed         5    10    15    23    33    45    70    92   115   132
RTF                    100%  100%  100%   96%   91%   78%   86%   90%   83%   81%
For example, a sprint-6 RTF of 78% indicates
that 22% of 'done' user stories are not
working and cannot be deployed to production.
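The RTF percentage is a one-line calculation; the sketch below uses the sprint-6 figures from the worked example (35 of 45 completed stories still passing):

```python
def rtf(still_passing, completed_to_date):
    """Running Tested Features: percentage of completed user stories
    that still pass all their acceptance tests."""
    return still_passing / completed_to_date * 100

print(round(rtf(35, 45)))    # 78: so 22% of 'done' stories cannot ship
print(round(rtf(107, 132)))  # 81
```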