The Agile Revolution of IBM
Alan Kan
Technical Manager, IBM Software Delivery Solutions
alankan@nz1.ibm.com
alan@alankan.net
2
Agenda
• How Agile is IBM now?
• Why did IBM Move to Agile?
• Our Journey of Becoming Agile
  – Process
  – People
  – Tools
• Measuring the Impact
3
a global team
US 10,400 · Canada 3,354 · Latin America 303 · EMEA 4,713 · AP 8,153 · Japan 282 · Total 27,106*
Lab locations (map): Toronto, Ottawa, Montreal, Vancouver, Victoria; Edinburgh, London/Staines, Milton Keynes, Hursley, Warwick; Haifa/Rehovot; Beijing, Shanghai; Yamato; Taipei; La Gaude, Paris, Pornichet, Toulouse; Beaverton, Kirkland, Seattle, Almaden, Agoura Hills, Costa Mesa, El Segundo, Foster City, San Francisco, SVL/San Jose, Las Vegas, Rochester, Minneapolis, Boulder, Denver, Lenexa KS, Tucson, Phoenix, Austin, Dallas, Bedford MA, Bedford NH, Cambridge, Lexington, Littleton MA, Waltham MA, Westford (528); Cork, Dublin, Galway; Boeblingen; Bangalore, Gurgaon, Hyderabad, Mumbai, Pune; Cairo; Rome; Gold Coast, Sydney, Canberra, Perth; Fairfax, Raleigh, Charlotte, Lexington KY, Atlanta, Boca Raton, Tampa; Krakow, Warsaw; Sao Paulo; Malaysia; Delft (61); Stockholm, Malmo; New York NY, Pittsburgh, Piscataway, Poughkeepsie, Princeton, Somers, Southbury; Helsinki; El Salto; Hong Kong; Singapore
74 Acquisitions · 89 Labs · 1,198 products · 506 releases / year
92% growth since 2001 · 10,000 resources from acquisitions
# customers: 11,867 · Efficiencies: 7% YoY
Growth Market 8,878 (33%) · Major Market 18,227 (67%)
4
Practices we adopted: Iterative Development · API first · end game · retrospectives · always have a client · continuous integration · community involvement · new & noteworthy · adaptive planning · continuous testing · consume your own output · drive with open eyes · validate · reduce stress · learn · attract to latest · transparency · update · feature teams · show progress · enable · live betas · feedback · sign off · end-of-iteration demos/reviews · Ranked Product Backlog · Burndown · User Stories · Daily Standup · independent testing · exploratory testing · Definition of Done · PMC · TDD · Planning Poker
Rational's rewards
Note: Goals are either internal IBM statistics or industry benchmarks.
Metric                       | Goal     | 2006 Measurement | 2011 Measurement
Maintenance / Innovation     | 50/50    | 42% / 58%        | 31% / 69%
Customer Touches / Product   | 100      | ~10              | ~400
Customer Calls               | -5% YoY  | ~135,000         | ~100,000 (-19% since 2009)
Customer Defect Arrivals     | -5% YoY  | ~6,900           | ~2,200
On Time Delivery             | 65%      | 47%              | 94%
Defect Backlog               | 3 months | 9+ months        | 3 months
Enhancements Triaged         | 85%      | 3%               | 100%
Enhancements into Release    | 15%      | 1%               | 21%
Customer Sat Index           | 88%      | 83%              | 88%
Beta Defects Fixed Before GA | 50%      | 3%               | 95%
6
2006 IBM software group reality
7
Software Group Acquisition Milestones
8
cost of poor quality
[Chart labels: $25/defect · $100/defect · $16,000 per defect · $450/defect · $241,000 per defect · $158,000 per defect]
• Total Dollars Spent on Escapes
• Trend of Percentages in each Area over time
• Trend of Spend on L3 versus Technical Debt
• Trend of Spend vs Revenue
9
“A large UK bank initiated its APM effort to take a 90:10 ratio for run-the-bank / grow-the-bank down to
a more reasonable 40:60 ratio. Dell shifted its maintenance-to-innovation ratio from 80:20 to 50:50.”
The Application Portfolio Management Landscape — Combine Process And Tools To Tame The Beast, Phil Murphy, Forrester Research, Inc. April 15, 2011
Insufficient spend on strategic projects
10
we needed to change
• Organize differently
• Develop differently
• Deliver differently
• Measure differently
11
why move to agile?
• Respond to a fast-changing environment
• Reduce process overhead
• Better manage outsourcing / contractors
• Improve morale
• Enhance quality
12
three areas of Change
• People
• Process
• Tools
13
process
14
Initial Issues – Water Scrum Fall
15
Issues with Agility@Scale – the Disciplined Agile Delivery scaling factors, each a spectrum from the simple end to the hard end:
• Domain complexity: straightforward … intricate, emerging
• Compliance requirement: low risk … critical, audited
• Team size: under 10 developers … 1000's of developers
• Geographical distribution: co-located … global
• Enterprise discipline: project focus … enterprise focus
• Technical complexity: homogenous … heterogeneous, legacy
• Organization distribution (outsourcing, partnerships): collaborative … contractual
• Organizational complexity: flexible … rigid
16
auditable processes needed change
17
generic iteration definitions
[Release timeline: warm-up and initial release plan, then 4-week plan / develop / stabilize iterations (M1, M1a, …) each ending in sign-off, an endgame of fix / spit & polish / test cycles, then release, decompression and a closing retrospective.]
• 4 week iterations ⇒ end with an end of iteration demo
• 8 week milestones ⇒ announced with New & Noteworthy ⇒ retrospective at the end
(A small scheduling sketch of this cadence follows.)
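To make the cadence concrete, here is a minimal sketch assuming a fixed start date and two 4-week iterations per 8-week milestone; all labels and dates are illustrative and not IBM's actual planning tooling.

```python
# A minimal scheduling sketch of the cadence above: 4-week iterations grouped
# into 8-week milestones, each iteration ending in a demo, each milestone in a
# New & Noteworthy plus retrospective. Dates and labels are illustrative only.
from datetime import date, timedelta

def iteration_calendar(start: date, milestones: int):
    """Yield (label, start, end, closes_milestone) for two iterations per milestone."""
    cursor = start
    for m in range(1, milestones + 1):
        for half in ("a", "b"):
            end = cursor + timedelta(weeks=4) - timedelta(days=1)
            yield f"M{m}{half}", cursor, end, half == "b"
            cursor = end + timedelta(days=1)

for label, s, e, closes in iteration_calendar(date(2011, 1, 3), 3):
    note = "New & Noteworthy + retrospective" if closes else "end-of-iteration demo"
    print(f"{label}: {s} .. {e}  ({note})")
```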
18
What is in a practice?
• Key concepts
• Work products
• Tasks
• Guidance
• Measurements
• Tool mentors
19
agile planning onion (*)
Layers, outermost to innermost: Strategy > Portfolio > Product > Project > Iteration > Daily
• Agile teams plan at the innermost levels
• "Required" at all levels
(*) Based on Mike Cohn, Agile Estimating and Planning
20
people
21
lean training evolution
• Poppendieck collaboration
– Two day Disciplined Agile Workshop (9000+ trained)
• Additional focused workshops
– Leading Agile Teams & Project Management
• Deep dives on practices
– “show me how it's done right in SWG today”
• Lean Series
– Complements Agile curriculum
• Collaborative leadership workshop
– Focused on middle management and executives to enable collaboration over
isolation or coordination
22
people do what you inspect
[Chart: On Schedule Delivery – 47% in 2006, 100% in 2009]
23
Geographic allocation and mapping
[Table: work split across On-Site (San Jose), Near-Shore (Toronto) and Off-Shore (Bangalore) by discipline – Analysis, Design, Construction, Function & Performance Test, Component Test, Deployment, Project Management – with percentage allocations per discipline (100%, 80/20, 70/30, 60/40 and 20/20/60 splits).]
24
lessons for executives
• A completion date is not a point in time, it is a probability distribution
• Scope is not a requirements document, it is a continuous negotiation
• A plan is not a prescription, it is an evolving, moving target
[Diagram: plans/resource estimates, scope and product features/quality over months 0–6–12, showing the actual path and precision of scope/plan moving from the initial state and initial plan toward an uncertain stakeholder-satisfaction space.]
(A Monte Carlo sketch of the first lesson follows.)
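The first lesson – a completion date is a distribution, not a point – can be illustrated with a small Monte Carlo sketch. The backlog size and velocity range below are invented for the example and are not figures from the deck.

```python
# Illustrative Monte Carlo sketch: treat the completion date as a distribution
# rather than a point. Remaining-work and velocity figures are made up.
import random

def simulate_completion(remaining_points=200, trials=10_000):
    """Return a sorted list of simulated iterations-to-complete."""
    results = []
    for _ in range(trials):
        done, iterations = 0, 0
        while done < remaining_points:
            # Velocity varies iteration to iteration (triangular: min, max, mode).
            done += random.triangular(15, 45, 30)
            iterations += 1
        results.append(iterations)
    return sorted(results)

runs = simulate_completion()
for pct in (0.5, 0.8, 0.95):
    print(f"P{int(pct * 100)}: done within {runs[int(pct * len(runs)) - 1]} iterations")
```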
25
tools
26
keys across all disciplines
• Collaboration – optimizing how people work while minimizing face-to-face interactions
• Process Automation – increasing control by integrating workflows and “forcing” new habits
• Optimization – continuously improve through real-time measurement and constant steering
27
pair-wise story comparison
[Screenshot: number of comparisons; how important – left vs right]
28
fast voting and ranking
Legend:
• 007 Integrated process tailoring… – #1 for OSD directors, drops to #6 for Rational
• 044 Global Collaboration – #1 for IT Tiger Team, drops to #7 for Rational
• 027 Reporting – #1 for IT Accelerator team, remains on top for Rational
(A rough ranking-aggregation sketch follows.)
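The ranking shown here comes from Rational's tooling. As a rough sketch of the underlying idea – folding many pair-wise votes into one ranked list – a simple win-count aggregation (not IBM's actual algorithm) could look like this, reusing the story IDs from the legend as sample data.

```python
# Rough sketch of aggregating pair-wise story comparisons into a ranking.
# Simple win-count scheme, not the algorithm used by IBM's tool.
from collections import defaultdict

votes = [  # (preferred_story, other_story) - illustrative data
    ("007 Integrated process tailoring", "044 Global Collaboration"),
    ("027 Reporting", "007 Integrated process tailoring"),
    ("027 Reporting", "044 Global Collaboration"),
    ("044 Global Collaboration", "007 Integrated process tailoring"),
]

wins = defaultdict(int)
appearances = defaultdict(int)
for winner, loser in votes:
    wins[winner] += 1
    for story in (winner, loser):
        appearances[story] += 1

# Rank by share of comparisons won; stories most voters preferred float to the top.
ranking = sorted(appearances, key=lambda s: wins[s] / appearances[s], reverse=True)
for i, story in enumerate(ranking, 1):
    print(f"#{i}: {story} ({wins[story]}/{appearances[story]} comparisons won)")
```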
29
tools
view plan by business value
30
overall progress tracking
• End of Iteration Demos
• Definition of Done
• Done Criteria in Plan Items
• Risk Tracking in Plan Items
• Progress Reporting across Projects (planned)
31
tools
done criteria
32
continuous integration
• Multi-staged continuous integration
  • Developer (continuous)
  • Team (continuous)
  • Product (weekly)
  • Composite product (weekly)
(A sketch of the staged promotion follows below.)
A Team’s Build Dashboard
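As a rough sketch of the staged idea above – a build only moves to the next stage if the previous one passed – the outline below uses placeholder stage bodies; it is an assumption-level illustration, not IBM's actual build system.

```python
# Sketch of multi-staged continuous integration: each stage only promotes a
# build that passed the stage before it. Stage bodies are placeholders.
from typing import Callable

def run_stage(name: str, build: Callable[[], bool]) -> bool:
    ok = build()
    print(f"[{name}] {'PASS' if ok else 'FAIL'}")
    return ok

def developer_build() -> bool:  # on every commit
    return True   # compile + unit tests (placeholder)

def team_build() -> bool:       # continuous, per team stream
    return True   # integration tests against the team's components

def product_build() -> bool:    # weekly
    return True   # full product assembly + regression suite

def composite_build() -> bool:  # weekly, across products
    return True   # cross-product install and interoperability checks

stages = [("developer", developer_build), ("team", team_build),
          ("product", product_build), ("composite product", composite_build)]

for name, build in stages:
    if not run_stage(name, build):
        break  # do not promote a broken build to the next stage
```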
33
composite build
34
retrospectives
35
cross repository queries
36
development and test relationship
37
measuring Agile
38
executive measurement
Executive Dashboard quadrants: Development Health · Business Health · Development Quality · Perceived Quality
Metrics tracked across the quadrants: Defect Backlog · Test Escapes · Functional Test Trends · Critical Situations · System Test Trends · S-Curve Progress · Automation Percentage · Customer Test Cases · Consumability Scorecard · Defect Latency · Quality Plan Commitments · Test Coverage · Defect Density · Build Health · Project Velocity · Staffing Variance · Process Timeliness · Iteration/Milestone Status · Severity Analysis · Security Vulnerabilities · Static Code Analysis · Requirements Met · IPD Timeliness · Transactional Survey · PMR / Call Rates · Cost of Support · Installability · RFE SLAs · Usability · Consumability · Scalability · Integrations with other products · User Experience / Doc · Time to Resolution · APAR:PMR ratio · Post-GA metrics · Transparency · Sales Plays · Partner Enablement · Support Enablement · Technical Enablement · Sales Enablement · MCIF Index · Alt Packaging · OEMs · XL hits · Tactics · ROI · Pipeline / Multiplier · Revenue
Practices: Vulnerability Assessment · Concurrent Testing · Test Driven Development · Whole Team · Team Change Management · Evolutionary Architecture · Requirements Management
(A small roll-up sketch of two of these metrics follows.)
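As a hedged illustration of how two of the dashboard numbers named above might be rolled up – on-time delivery and defect-backlog age – the sketch below uses invented sample records rather than IBM's real data feeds.

```python
# Illustrative sketch of two dashboard roll-ups: on-time delivery and
# defect-backlog age. The records are invented sample data.
from datetime import date

releases = [  # (planned_ga, actual_ga) - sample data
    (date(2011, 3, 31), date(2011, 3, 28)),
    (date(2011, 6, 30), date(2011, 7, 15)),
    (date(2011, 9, 30), date(2011, 9, 30)),
]
open_defects = [date(2011, 8, 1), date(2011, 10, 20), date(2011, 11, 5)]  # opened dates
today = date(2011, 12, 1)

on_time = sum(actual <= planned for planned, actual in releases) / len(releases)
avg_age_months = sum((today - opened).days for opened in open_defects) / len(open_defects) / 30

print(f"On-time delivery: {on_time:.0%}")          # goal in the deck: 65%+
print(f"Average defect backlog age: {avg_age_months:.1f} months (goal: 3 months)")
```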
39
tools
improving Bottom-line Growth
[Chart: SW Revenue per DE HC ($M) and E/R as a %, 2004–2009]
40
tools
doing More with Less
[Chart: Capacity – HC / Product GA vs SWG Revenue in $, 2003–2009*]
41
Rational's rewards
Note: Goals are either internal IBM statistics or industry benchmarks.
Metric                       | Goal     | 2006 Measurement | 2011 Measurement
Maintenance / Innovation     | 50/50    | 42% / 58%        | 31% / 69%
Customer Touches / Product   | 100      | ~10              | ~400
Customer Calls               | -5% YoY  | ~135,000         | ~100,000 (-19% since 2009)
Customer Defect Arrivals     | -5% YoY  | ~6,900           | ~2,200
On Time Delivery             | 65%      | 47%              | 94%
Defect Backlog               | 3 months | 9+ months        | 3 months
Enhancements Triaged         | 85%      | 3%               | 100%
Enhancements into Release    | 15%      | 1%               | 21%
Customer Sat Index           | 88%      | 83%              | 88%
Beta Defects Fixed Before GA | 50%      | 3%               | 95%
42
© Copyright IBM Corporation 2012. All rights reserved. The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind,
express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have
the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM
software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates. Product release dates and/or capabilities
referenced in these materials may change at any time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature
availability in any way. IBM, the IBM logo, Rational, the Rational logo, Telelogic, the Telelogic logo, and other IBM products and services are trademarks of the International Business Machines
Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others.
Alan Kan
Technical Manager, IBM Software Delivery Solutions
alan@alankan.net
www.linkedin.com/in/zenmaster/
43
Thank you to our sponsors
Editor's Notes
1. Self introduction – I help my customers leverage IBM technology to deliver high-quality software faster. I've been at IBM for 5 years and have met some great people such as Scott Ambler, Mike O’Rouke, Scott Rich, etc. The information in this presentation comes primarily from talking to them.
2. IBM has 3 primary lines of business - hardware, software and services. We are talking about the software group here. The size is big, but the learnings are applicable to any organisation.
3. In 2006 we were predominantly using waterfall, RUP, and the Eclipse Way. We set up an Agile CoE in 2006, and management invested $5m/year over 2006 to 2008. The primary differentiators between Agile and iterative are the use of User Stories, a Ranked Product Backlog, reprioritisation at each iteration, and the Daily Standup. We also adapted other common agile practices: iterative, reflect, adapt, incremental, feedback. The practices were inspired by agile practices, Scrum, XP, and some custom ones that work for us.
4. Rewards have been significant: on-time delivery; enhancements triaged and enhancements into release; maintenance/innovation.
5. IBM has acquired a lot of companies for their talent. We have a problem with geographically distributed teams being less effective, but there is no point forcing people to relocate because they will leave. So we have to work with it.
6. GDD and disintegrated teams become quality problems. IBM figures are on the slide.
7. The quality problem means we spend a lot of resource on maintenance (bug fixing) that could have gone into innovative projects (new products). The implication is that if we are behind our competitors on innovation, they will have a new product before we do. Eventually they have a few products and features that we don't, and we can't catch up. A big problem that has senior management's attention – losing our competitive advantage!
8. We decided we needed to change in 2006.
  9. Why Agile? Cost of change is lower – getting into Agile is easier, getting out is easier.
10. It is good to have appropriate tools, trained-up scrum masters, and people who know what they are doing. But also important are the right process, enforcement, nurturing, and trust from management to create a successful agile environment.
11. 50%+ of teams failed because of water-scrum-fall -> we recognised that we need to do more than just change dev teams.
12. Dr. Dobb's (Scott Ambler) found that effective agile teams are on the left of the scale. Mike O’Rouke found we are on the right on every count. Process-wise, use DAD. IBM had no way to deal with the situation using current tools -> we need some new tools.
13. The IDT process is the process followed for all hardware, software and services product creation. Funding, scope and release date are determined upfront; if they change, you ask for forgiveness (more $). Over 50% of product development teams come back for forgiveness -> obviously the model is not working too well. This convinced management to change. We changed to unlock scope to 70/30 and keep the others constant.
14. Generic iteration definitions across teams in one product. Not generic across IBM.
  15. Document our best practice to share with other teams
16. Agile = need to move fast. The PMO idea and using Excel to do status reports after iterations don't work too well. Most PMs can't make it to agile. We need real-time planning in the tool; things like integration points, upgrades and migrations still sit with PMs.
17. Poppendieck – train the trainer with DAD. Agile leadership and PM training for existing PMs. Deep dives such as forced check-in, continuous build, TDD, and coding for testers.
18. From 2006 to 2008 we added on-time delivery as a KPI for people, impacting performance reviews and bonuses. 2009 = 100% on time, but quality went down. So we added quality as another KPI. 2011 = 94% on time, but with better quality.
19. The CFO mindset of offshoring -> we need to convince them that the productivity level changes and therefore the success rate of the project changes dramatically. Don't outsource for cost; hire the best talent from different locations.
20. You can't just fund and walk away; they need to be there with us. Customers are happy with the 70/30 arrangement.
21. 3 key success factors: Collaboration – traceable conversation to context (story/task) to help distributed teams. Automation – forcing habits, putting it in the process. Optimisation – real-time reporting to help managers remove impediments quickly.
22. Let's say the team is largely based in Europe. The NZ guys need to wake up at 2am to do planning poker. It is not fun, and gets lower-quality input from members. This system helps geographically distributed teams do planning poker by pairwise comparison + comment.
23. The system aggregates the voting and ranks user stories. That is not the end of it, but it is a much better baseline to talk from. Most of the time everyone agrees.
  24. One source of truth
25. Capture done criteria with the user story.
26. The key is that it is auditable (you know when it was suggested) and actionable.
27. If the independent test team needs to connect to our user stories and create test cases from there to ensure traceability, they can. Based on the Jazz platform.
28. If they use Rational tools, the built-in integration is tried and tested.
29. One way that we are looking at organizing metrics is around business and operational objectives, with direct exec-level input. There were too many; we ended up with only 10.
30. These are what senior management looks at. Revenue per Distinguished Engineer – up a bit. Employee/Revenue – down heaps = they need fewer people to make a dollar of revenue.
  31. HC/Product GA = we only need half as many people now to create a product compared to 2006.
32. Defect Backlog; Beta Defects Fixed before GA.
  33. Author Note: Mandatory Rational closing slide (includes appropriate legal disclaimer). Graphic is available in English only.