Testing in the Wild
Handling Non-Ideal Situations
Dawn Cannan
@dckismet
http://passionatetester.com
dcannan@gmail.com
George Dinwiddie
@gdinwiddie
iDIA Computing, LLC
gdinwiddie@idiacomputing.com
Agile Development Practices West 2011
Agile Testing Workshop
Tuesday, June 7, 2011
Check http://idiacomputing.com/publications.html for an updated
copy of these slides
Are we there yet?
 Change doesn’t happen overnight
 “Agile” isn’t a place you just get to and then stop
 Continuous Improvement
 Do, inspect, adapt
Legacy Systems
Battling monolithic monsters
Problem: Little Test Automation
 Developers aren’t writing sufficient unit tests
 None or very few integration tests
 Testers perform all testing manually
 Regression testing is done by humans
 Bugs pile up and “minor” ones don’t get fixed
Solution: Baby steps
 Don’t try to tackle new test automation and backfilling
regression tests all at once
 Start by automating common tasks with scripts, like setting up
environments or data that are needed to perform tests (see the
sketch after this slide)
 Use a record-and-playback tool to create a broad and shallow
set of regression tests
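
For illustration, here is a minimal sketch of the kind of setup script this slide suggests: a short Python script that seeds a known record into a local test database before a testing session. The sqlite3 file and the customers table are hypothetical placeholders, not part of the original talk.

import sqlite3

def seed_test_data(db_path="test_env.db"):
    """Create a known customer record that manual tests can rely on."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers "
        "(id INTEGER PRIMARY KEY, name TEXT, status TEXT)"
    )
    # Insert (or refresh) the specific record the tests expect to find.
    conn.execute(
        "INSERT OR REPLACE INTO customers (id, name, status) VALUES (?, ?, ?)",
        (1001, "Test Customer", "active"),
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    seed_test_data()

Running a script like this before each testing session gives every tester the same starting data without hand-editing the database.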
Solution: Slow Down
 Reduce the number of bugs being injected each cycle
 Try root cause analysis to learn what issues are causing large
numbers of bugs
 Try having a retrospective targeted toward bugs
 Why are so many new bugs getting in?
 Encourage a “whole team approach”
 What can developers do differently?
 What can business-facing people do differently?
Problem: Test Data
 Test data may not be under the tester's control
 Tests may manipulate data, such that the tests are not
repeatable
 May not be clear how to set up a specific condition
Solution: Take Control
 Don’t write your automated tests to be intelligent and seek out
data
 Have the test create the data it needs right before it runs (and
possibly remove it after), as in the sketch after this slide
 Mock the response of external data systems
 Work with other organizational departments to get access
 Set up small, local instances for test environments
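
A minimal sketch of what "taking control" can look like in test code, assuming Python's standard unittest and an in-memory sqlite3 database; the orders table and the external rate service are hypothetical stand-ins. Each test creates exactly the data it needs, cleans up afterward, and mocks the external system instead of depending on shared data.

import sqlite3
import unittest
from unittest import mock

class OrderLookupTest(unittest.TestCase):
    def setUp(self):
        # Create exactly the data this test needs, right before it runs.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)"
        )
        self.conn.execute("INSERT INTO orders VALUES (42, 99.50)")

    def tearDown(self):
        # Throw the data away afterward so the test stays repeatable.
        self.conn.close()

    def test_order_total(self):
        total = self.conn.execute(
            "SELECT total FROM orders WHERE id = 42"
        ).fetchone()[0]
        self.assertEqual(total, 99.50)

    def test_external_rate_service_is_mocked(self):
        # Stand in for an external data system instead of calling it for real.
        rate_service = mock.Mock()
        rate_service.get_rate.return_value = 1.25
        self.assertEqual(rate_service.get_rate("EUR"), 1.25)

if __name__ == "__main__":
    unittest.main()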
Problem: Legacy Reporting
 Reports are generated with complicated, but pretty, information
 High-level execs tend to want to see counts that represent
progress and state
 Some reports assume “testing silos”
 e.g., count of bugs found by testers
Solution: Target important information
 Who is the target role for the information?
 What questions do they want the report to answer?
 How are the reports being used?
 Most of the time, individual numbers are not as important as
trends over time (see the sketch after this slide)
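
As a small, hypothetical illustration of reporting a trend rather than a single number (the Python below and its bug counts are made up, not from the talk):

# Show how the open-bug count moves across iterations, not just today's total.
open_bugs_per_iteration = [34, 31, 27, 22, 18]

for i, count in enumerate(open_bugs_per_iteration, start=1):
    change = "" if i == 1 else f" ({count - open_bugs_per_iteration[i - 2]:+d})"
    print(f"Iteration {i}: {count} open bugs{change}")

The direction of the numbers, iteration over iteration, answers "are we getting better?" more directly than any single count.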
Team Issues
When the teams aren’t as described in the Agile books
Problem: Geographic Distribution
 Parts of the team are in another location
 Time differences cause communications difficulties
 Can’t use index cards on a wall
 Language barriers may exist
Solution: Use available technology
 Online story tracking systems
 Pivotal Tracker, Jira, etc.
 Online communication methods
 Instant messengers, Skype, IRC channels
 Agree on one 15-minute block of time to touch base
Solution: Distribute Smartly
 Mimic environments at each location (such as a wall of index
cards)
 Try to have representatives for each role at each location
 e.g., have somebody who can proxy for the Product Owner in each
location
 Try to have face-to-face visits
 When kicking off a project
 Once a year after that
Problem: Programmers vs. Testers
 Testers still feel separated from programmers
 Testable increments are still “thrown over the wall”
 Not much pairing of programmer/tester, if at all
Solution: Win Them Over
 Sit in the same room, if possible
 Look for those who are most willing to collaborate
 Ask for help
 Find a way that you can help them
 Find topics of interest to them, start a conversation
 Offer food and games (Seriously!)
Summary
 Find “low hanging fruit” and tackle those first
 Start small, get little pieces in place and build from there
 Develop relationships
Questions?
What types of situations have you
experienced?