Acceptance Test Driven Development (ATDD) is an agile software development practice where acceptance tests are automated to guide implementation of requirements. The key aspects of ATDD include:
1) Collaboratively writing examples and acceptance tests to define requirements.
2) Automating acceptance tests to verify requirements are met.
3) Using acceptance tests to facilitate discussions on future changes.
4) Driving implementation through the acceptance test suite.
1. Acceptance Test Driven Development in practice. Steven Mak 麥天志, [email_address]
2. What are we up to now?
- Lost in translation
- Requirements do not explain why
- Gaps discovered only after coding has started
- Cumulative effects of small misunderstandings
- Inadequate and essentially flawed requirements and specifications
3. Failing to meet actual needs
- Are "obvious" things really obvious?
- Fulfilling specifications does not guarantee success
- Imperative requirements
4. Meeting the needs with Acceptance TDD
- Drive implementation of a requirement through a set of automated, executable acceptance tests
- The cycle: Requirement → Acceptance Test → Implementation → Feedback
5. ATDD in a nutshell
- Use real-world examples to build a shared understanding of the domain
- Select a set of these examples to be a specification and an acceptance test suite
- Automate the verification of the acceptance tests
- Focus the software development effort on the acceptance tests
- Use the set of acceptance tests to facilitate discussion about future change requests
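The cycle above can be sketched in miniature. This is a hypothetical illustration, not from the original deck: a concrete example agreed with the customer ("a transfer above the 500.00 daily limit is rejected", an invented rule) is captured as an executable acceptance test, and a minimal implementation is driven from it.

```python
# Hypothetical ATDD round-trip. The daily-limit rule and the function
# names are invented for illustration.

DAILY_LIMIT = 500.00

def transfer(amount: float) -> str:
    """Minimal implementation, written only to make the acceptance test pass."""
    return "rejected" if amount > DAILY_LIMIT else "accepted"

def acceptance_test_transfer_limit():
    # Each assertion mirrors one row of the agreed example table.
    assert transfer(499.99) == "accepted"
    assert transfer(500.00) == "accepted"   # the limit itself is still allowed
    assert transfer(500.01) == "rejected"

acceptance_test_transfer_limit()
```

The point is the direction of travel: the examples exist before the code, and the code is complete exactly when the examples pass.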
6. The ATDD cycle
- [Diagram: an A-TDD workshop turns customer documentation into examples and tests; coding, testing, architecture, and other activities carry a feature through to "done"]
- Participants: Developers, Testers, Product Owner, Architect, Technical writers
7. Benefits of ATDD
- Comprehensible examples over complex formulas
- Close collaboration
- Definition of Done
- Co-operative work
- Trust and commitment
- Testing at the system level
8. Variations: an escape?
- Behaviour-driven development
- Example-driven development
- Executable specifications
- Names do not matter; the underlying practices do
- Worth trying if your business people do not like the word "testing"
9. The ideal customer to work with
- Shared interest in success
- Authority to make decisions
- Ability to understand implications
- Ability to explain the domain
10. Specification by examples
- Use realistic examples to demonstrate the different possibilities, instead of abstract requirements
- Write specifications down as tables
- Workflows: preconditions, processing steps, verifications
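One way to make "write specifications down as tables" concrete is to keep the table as plain data and check the implementation against every row. The discount rule, customer types, and numbers below are invented for illustration:

```python
# A specification-by-example table for a hypothetical discount rule.
# Each row: (customer type, order total, expected discount in percent).
SPEC_TABLE = [
    ("regular",  100,  0),
    ("regular", 1000,  5),
    ("gold",     100, 10),
    ("gold",    1000, 15),
]

def discount_percent(customer_type: str, order_total: int) -> int:
    """Implementation under test, written to satisfy the table above."""
    base = 10 if customer_type == "gold" else 0
    bonus = 5 if order_total >= 1000 else 0
    return base + bonus

# Verify every row: precondition -> processing step -> verification.
for customer_type, total, expected in SPEC_TABLE:
    assert discount_percent(customer_type, total) == expected, (customer_type, total)
```

The table stays readable by the customer, while the loop underneath turns it into an executable check.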
12. Specification workshop
- Ask the domain experts
- Developers and testers should suggest examples of edge cases or important issues for discussion
- Use a ubiquitous language
- Organise feedback to ensure shared understanding
- Use a facilitator to stay focused, if needed
13. Acceptance criteria
- Write tests collaboratively
- Keep examples in a form close to what your automation tool can understand
- Keep tests in a form that is human-readable
- Specification over scripting: describe WHAT, not HOW
- Acceptance tests exist to prevent defects, not to discover them
- You do not necessarily have to automate anything
- Acceptance tests only work when we can discuss them
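The "specification over scripting" point can be illustrated by writing the same hypothetical login check in both styles. Everything here (the `FakeApp` driver, field names, the demo/mode credentials) is invented for illustration:

```python
class FakeApp:
    """Tiny stand-in for an application driver, invented for illustration."""
    VALID = {"demo": "mode"}

    def __init__(self):
        self.page = "/login"
        self.fields = {}

    # Low-level, HOW-oriented operations
    def open_page(self, url): self.page = url
    def type_into(self, field, value): self.fields[field] = value
    def click(self, button):
        if button == "submit_button":
            user = self.fields.get("username_field")
            if self.VALID.get(user) == self.fields.get("password_field"):
                self.page = "/welcome"

    # Domain-level, WHAT-oriented operation
    def login_as(self, user, password):
        return "logged in" if self.VALID.get(user) == password else "denied"

# Scripted style: describes HOW, coupled to UI mechanics and field names.
app = FakeApp()
app.open_page("/login")
app.type_into("username_field", "demo")
app.type_into("password_field", "mode")
app.click("submit_button")
assert app.page == "/welcome"

# Specification style: describes WHAT, in domain language.
assert FakeApp().login_as("demo", "mode") == "logged in"
```

The scripted version breaks whenever a field is renamed; the specification version only changes when the business rule changes, which is what keeps it discussable.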
14. Some considerations
- User interface: easy? fragile? performance issues?
- Boundary of stubs: sufficiently close to the real thing? simulators?
- Business logic: not from the developer's perspective
15. Acceptance test smells
- Long tests
- Calculation-test parameters that always have the same value
- Similar tests with only minor differences
- Tests that reflect the way the code was written
- Tests that fail intermittently even though you didn't change any code
- Interdependent tests, e.g. one test acting as setup for others
16. Change
- Use existing acceptance tests to discuss future changes
- When a test fails, seek advice from the customer to determine whether it specifies obsolete functionality
- Automate periodic execution of regression tests with CI
- Keep tests in the same version control as the code
18. FIT
- FIT stands for "Framework for Integrated Tests"
- The most popular framework in use
- Table-based
- Supports languages such as Java, C#, Python, and Ruby
19. FIT in practice
- The customer writes a test document containing examples
- Technical staff enhance the tables in the document
- Technical staff implement the fixture classes
- Flow: test doc with tables → test doc with sanitised tables → test doc plus backing code (e.g. Java) → executable test
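The fixture idea can be sketched without the real FIT API. This is a simplified, self-contained Python illustration of how a fixture class maps rows of a customer's table onto the system under test; the real FIT fixture base classes and conventions differ by language:

```python
# Simplified sketch of a FIT-style column fixture (NOT the real FIT API):
# the customer's table has input columns and an expected-output column,
# and a fixture class backs each row.
class DivisionFixture:
    """Backs a table with columns: numerator, denominator, quotient()."""
    def __init__(self, numerator, denominator):
        self.numerator = numerator
        self.denominator = denominator

    def quotient(self):
        return self.numerator / self.denominator

# Rows as they might appear in the customer's test document.
TABLE = [
    # numerator, denominator, expected quotient
    (10, 2, 5.0),
    (9,  3, 3.0),
    (1,  4, 0.25),
]

# The framework's job: feed each row to the fixture and mark it right/wrong.
results = ["right" if DivisionFixture(n, d).quotient() == expected else "wrong"
           for n, d, expected in TABLE]
assert results == ["right", "right", "right"]
```

In real FIT the framework colours each table cell green or red; here the `results` list plays that role.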
20. Robot Framework
- Python-based, keyword-driven test automation framework
- Test libraries can be implemented in either Python or Java
- Test cases are written in tabular format and saved in HTML or TSV files
- Syntax similar to natural language
- Users can create new keywords from existing ones and contribute them to the project
21. Preparing test cases
- Test case name
- Test procedure using keywords
- Keyword arguments

| Test Case   | Action             | Argument |
| Valid Login | Open Login Page    |          |
|             | Input Name         | demo     |
|             | Input Password     | mode     |
|             | Submit Credentials |          |
22. Data-driven test cases
- Define a keyword that takes the input data, then prepare a table of test cases that call it with different data
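The data-driven idea can be sketched in plain Python rather than Robot syntax: one reusable "keyword" does the work, and each row of a data table becomes a test case that calls it. The login keyword, messages, and system-under-test hook below are all invented for illustration:

```python
# Hypothetical system-under-test hook, invented for illustration.
def attempt_login(username, password):
    if not username:
        return "Username missing"
    if not password:
        return "Password missing"
    return "Invalid credentials"

# Reusable "keyword": run one login attempt and check its error message.
def login_should_fail_with(username, password, expected_message):
    assert attempt_login(username, password) == expected_message

# The data table: each row is one generated test case.
INVALID_LOGIN_CASES = [
    ("",     "secret", "Username missing"),
    ("demo", "",       "Password missing"),
    ("demo", "wrong",  "Invalid credentials"),
]

for username, password, expected in INVALID_LOGIN_CASES:
    login_should_fail_with(username, password, expected)
```

Adding a new case means adding a row, not writing a new procedure, which is exactly what the tabular format buys you.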
23. Test case organisation
- Simple way: a single HTML file containing all test cases
- Test case tagging
24. Execution
- Gather test cases, read and set variables
- Execute all actions for every test case
- Provide global statistics
- Write the output in XML format
- Generate a report and log in HTML format
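The middle steps of that pipeline (execute cases, collect statistics, serialise results as XML) can be sketched in a few lines. This is a toy illustration, not Robot Framework's real output.xml schema:

```python
# Toy test runner: executes cases, collects pass/fail statistics,
# and writes the results as XML (schema invented for illustration).
import xml.etree.ElementTree as ET

def run_suite(cases):
    """cases: list of (name, callable). Returns the results as an XML string."""
    root = ET.Element("suite")
    passed = failed = 0
    for name, test in cases:
        outcome = ET.SubElement(root, "test", name=name)
        try:
            test()
            outcome.set("status", "PASS")
            passed += 1
        except AssertionError:
            outcome.set("status", "FAIL")
            failed += 1
    ET.SubElement(root, "statistics", passed=str(passed), failed=str(failed))
    return ET.tostring(root, encoding="unicode")

def passing_case():
    assert 1 + 1 == 2

def failing_case():
    assert 1 + 1 == 3

xml_report = run_suite([("passing case", passing_case),
                        ("failing case", failing_case)])
```

A separate step would then transform the XML into the human-readable HTML report and log, mirroring the split between execution and reporting.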
31. References
- Bridging the Communication Gap, Gojko Adzic
- Practical TDD and ATDD for Java Developers, Lasse Koskela
- Agile Testing, Lisa Crispin and Janet Gregory