- 1. Slow Down to Speed Up – Leveraging Quality to Enable Productivity and Speed
Fran O’Hara – Inspire Quality Services
Fran.ohara@inspireqs.ie
www.inspireqs.ie
© 2014 Inspire Quality Services
- 2. Agenda
• Quality and Speed/Productivity
• Lessons learnt/challenges
- 3. Flipping the Iron Triangle
Diagram: the iron triangle, flipped. In plan-driven development, Scope/Requirements are fixed while Resources and Schedule are estimated; in value-driven development, Resources and Schedule are fixed while Scope/Requirements are estimated. In both models, Quality is left as a question mark.
- 4. Quality <-> Speed
• Quick and dirty is faster (short term)
• Bad quality slows you down (long term)
• Going faster gives better quality (short & long term)
- 5. Economics of Product Development
Cycle Time – how long it takes to get through the value stream
• Economies of speed / Cost of Delay (worked example below)
• Fast feedback & learning (empirical)
• Waste/cost reduction & agility

Development Expense – development project costs
• Cost of engineering team
• Dev tools: SCM, CI, AutoTest
• Team management & facilities
• Shared services (HR, Finance, etc.)

Unit/Post-Dev Cost – cost of deploying, configuring, supporting, and using each ‘instance’
• Ease of use, robustness
• Cost of configuration/administration
• Browser/platform/OS support

Product Value – profit from a software product; savings from an internal IT project
• Sales revenue (volume × price)
• Cost savings
• Strategic value

Adapted from Don Reinertsen, 2009
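To make Cost of Delay concrete, a quick worked example (numbers invented for illustration): if a release is expected to generate €20,000 of profit per week once shipped, every week it waits forgoes that amount, so a four-week delay costs €20,000 × 4 = €80,000 – often far more than whatever the delay was meant to save.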
- 7. Technical Debt
Symptoms of technical debt
• Bugs found in production
• Incomprehensible, unmaintainable code
• Insufficient or unmaintainable automated tests
• Lack of CI
• Poor internal quality
• Etc.
- 8. Quality & Test
• Quality is not equal to test. Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other.
• Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved.
From ‘How Google Tests Software’, James Whittaker et al.
- 9. Lessons Learnt / Challenges
• Test automation
• Line management
• Definition of Done
• Test competency
• Test strategy & risk
• Requirements (e.g. story size, non-functional)
• Techniques (e.g. exploratory), planning for quality, documentation, …
- 10. Basic Testing within a Sprint
• Automated acceptance/story-based tests – represent executable requirements
• Automated unit tests – represent executable design specifications (see the sketch below)
• Manual exploratory tests – provide supplementary feedback
• All three are guided by test strategy & risk
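As an illustration of unit tests as executable design specifications, a minimal sketch assuming JUnit 5; the PriceCalculator class and its bulk-discount rule are invented for the example:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Hypothetical production class; the 10% bulk-discount rule is invented.
    class PriceCalculator {
        double total(int quantity, double unitPrice) {
            double gross = quantity * unitPrice;
            return quantity >= 100 ? gross * 0.90 : gross;
        }
    }

    // The test states the intended design rule in executable form:
    // orders of 100+ units receive a 10% discount.
    class PriceCalculatorTest {
        @Test
        void bulkOrdersReceiveTenPercentDiscount() {
            assertEquals(180.00, new PriceCalculator().total(100, 2.00), 0.001);
        }
    }

Written this way, the design decision fails loudly the moment someone changes it – exactly the fast feedback the economics slide argues for.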
- 12. Definition of ‘Done’
• An agreement between the PO and the Team
– Evolving over time to increase quality and ‘done-ness’
• Used to guide the team in estimating and doing
• Used by the PO to increase predictability and accept Done PBIs
• ‘Done’ may apply to a PBI and to an Increment
• A single DoD may apply across an organisation or a product
– Multiple teams on a product share the DoD
- 13. DoD example
Story level
• Unit tests passed
• Unit tests achieve 80% decision coverage
• Integration tests passed
• Acceptance tests passed, with traceability to story acceptance criteria (see the sketch below)
• Code and unit tests reviewed
• Static analysis shows no important warnings
• Coding standard compliant
• Published to Dev server

Sprint level
• Reviewed and accepted by PO
• End-to-end functional and feature tests passed
• All regression tests passing
• Exploratory testing completed
• Performance profiling complete
• Bugs committed in the sprint resolved
• Deployment/release docs updated and reviewed
• User manual updated

Release level
• Released to Stage server
• Deployment tests passed
• Deployment/release docs delivered
• Large-scale integration performance/stress testing passed
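One way to make the traceability item concrete, sketched with JUnit 5; the story ID PBI-142, the criterion label AC-1, and the toy Account class are all invented for illustration. A test report filtered by tag then shows which acceptance criteria each story has covered:

    import static org.junit.jupiter.api.Assertions.assertFalse;
    import org.junit.jupiter.api.DisplayName;
    import org.junit.jupiter.api.Tag;
    import org.junit.jupiter.api.Test;

    // Toy domain class, included only so the sketch is self-contained.
    class Account {
        private double balance;
        Account(double opening) { balance = opening; }
        boolean withdraw(double amount) { // returns false when rejected
            if (amount > balance) return false;
            balance -= amount;
            return true;
        }
    }

    // The class-level tag ties every test in it back to its story;
    // the display name records which acceptance criterion is verified.
    @Tag("PBI-142")
    class WithdrawalAcceptanceTest {
        @Test
        @DisplayName("AC-1: a withdrawal larger than the balance is rejected")
        void rejectsOverdraw() {
            assertFalse(new Account(50.00).withdraw(80.00));
        }
    }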
- 14. The Automation Pyramid
From bottom to top (based on Mike Cohn):
• Unit/component layer – developer tests (e.g. JUnit); automate at design level
• API/service layer – acceptance tests (e.g. FitNesse, Cucumber); automate at story level
• GUI layer – e.g. Selenium; automate at feature/workflow level (sketch below)
• Manual tests – e.g. exploratory, at the tip of the pyramid
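For the GUI layer, a minimal Selenium WebDriver sketch in Java; the URL and element ids are invented, and it assumes the selenium-java library and a local ChromeDriver are available. Note how coarse the check is – the pyramid pushes detailed rules down to the cheaper layers below:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // One happy-path journey through the UI; everything finer-grained is
    // covered faster and more reliably at the unit and acceptance layers.
    public class LoginSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://example.com/login"); // invented URL
                driver.findElement(By.id("username")).sendKeys("demo");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("login-button")).click();
                if (!driver.getTitle().contains("Dashboard")) {
                    throw new AssertionError("login journey did not reach the dashboard");
                }
            } finally {
                driver.quit();
            }
        }
    }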
- 15. Development Team
(Analysts, programmers, testers, architect, DBA, UI/UX, etc.)
Diagram: roles drawn as specialised sub-teams – an Architect and Team Lead over Developers, a QA Lead over Testers, a BA Lead over BAs – captioned ‘No Specialised Sub-Teams’. The whole development team creates each increment of ‘Done’ product; the open question (‘?’) is where test competency lives in such a team.
- 16. Is testing fully integrated?
Diagram: three patterns of coding and testing across two sprints.
A. Testing trails a sprint behind: Sprint 1 is all coding; Sprint 2 mixes new coding and bug-fixing with the testing of Sprint 1’s work.
B. Mini-waterfall within each sprint: code first, then test and bug-fix towards the end of the same sprint.
C. Fully integrated: coding (with bug-fixing) and testing run concurrently throughout each sprint.
- 17. Examples of how to evolve quality/test practices…
• See Google’s ‘Test Certified’ levels
• Paddy Power’s review of teams’ practices – using a scale of 0–5 for items such as
– Code Reviews,
– Pair Programming,
– Code Analysis,
– Unit Tests,
– Continuous Integration,
– Automated Acceptance Tests,
– Data Generation,
– Performance Analysis,
– TDD, etc.
(from Graham Abel, Softtest Test Automation Conference 2013)
• Communities of Practice, Tribes, Mentors, etc.
- 18. Fran O’Hara
InspireQS
www.inspireqs.ie
fran.ohara@inspireqs.ie
Thank You!