AUTOMATED TEST TOOLS
EVALUATION CRITERIA




        Terry Horwath
    Version 1.02 (1/18/07)




                                                    Table of Contents

1.    INTRODUCTION
1.1   Author’s Background
1.2   Allocate Reasonable Resources and Talent
1.3   Establish Reasonable Expectations
2.    RECOMMENDED EVALUATION CRITERIA
2.1   GUI Object Recognition
2.2   Platform Support
2.3   Recording Browser Objects
2.4   Cross-browser Playback
2.5   Recording Java Objects
2.6   Java Playback
2.7   Visual Testcase Recording
2.8   Scripting Language
2.9   Recovery System
2.10  Custom Objects
2.11  Technical Support
2.12  Internationalization Support
2.13  Reports
2.14  Training & Hiring Issues
2.15  Multiple Test Suite Execution
2.16  Testcase Management
2.17  Debugging Support
2.18  User Audience








1. INTRODUCTION
This document provides a list of evaluation criteria that have proven useful to me over the last
several years when evaluating automated test tools such as Mercury Interactive’s QuickTest
Professional and WinRunner and Segue’s Silk for a variety of clients. I hope some readers will
find this information useful and that it reduces their own evaluation effort.
The specific criteria used for each project differ based on the client’s:
•   testing environment,
•   test engineers’ programming backgrounds and skill sets,
•   type of software being tested [especially the software development tool, such as Visual Basic,
    PowerBuilder, Java, browser-based applications, etc.], and
•   application(s) testing requirements.

The remainder of this chapter provides a variety of miscellaneous thoughts I have on automating
the testing process, while Chapter 2 contains my list of potential evaluation criteria. Note that
some of the Chapter 2 evaluation criteria are oriented toward Java and web application testing.
Substitute your own application development tool—for example Visual Basic or PowerBuilder—in
the Java-related evaluation criteria items.


1.1 Author’s Background
I have designed custom frameworks as well as hundreds of test cases using Silk/QaPartner from
1994 (version 1.0) through 2004 (version 5.5), with WinRunner (version 5) and Test Director in
1999 and 2000, and with QuickTest Professional since 2006 (versions 8 and 9).


1.2 Allocate Reasonable Resources and Talent
Most software testing projects do not fail because of the selected test tools—virtually all of the top
automated testing tools on the market can be used to do an adequate job, even when the test tool
is not well matched with the software development environment. Rather, I believe that most
failures are due to a combination of the following reasons:
1. Test engineers fail to treat the development of a large number of complex test cases and test
   suites as the large software development project it is. It is crucial to apply good software
   development methodology to produce a test product: defining requirements, developing a
   schedule, implementing each test suite using a shared custom framework of well-known
   libraries and guidelines, and using a software version control system.
2. Sufficient manpower and time are not allocated early enough in the application development
   cycle. Along with incomplete testing, this also leads to the phenomenon of test automation
   targeted for Release N actually being delivered and used with Release N+1.
3. Test technicians with improper skills are assigned to use these automated test tools. Users of
   these tools must have strong test mentalities and, in all but a few situations, they must also
   possess solid programming skills with the automation tool’s scripting language.








1.3 Establish Reasonable Expectations
Through their promotional literature, automated test tool vendors often establish unrealistic
expectations in one or more of the following areas:
•   What application features and functions can truly be tested with the tool.
•   The skill level required to effectively use the tool.
•   How useful the tool’s automatic recording capabilities are.
•   How quickly effective testcases can be produced.

This is unfortunate because, in the hands of test engineers possessing the proper skill set, all of the
top automated test tools can be used to test significant portions of virtually any GUI-centric
application. Use the following assumptions when reviewing this document and planning your
evaluation effort:
1. Even when a test tool is well matched with a software development tool, the test tool will still
   only be able to recognize a subset of the application’s objects—windows, buttons, controls,
   etc.—without taking special programming actions. This subset will be large when the
   development engineers create window objects using the development tool’s standard class
   libraries. The related issue of cross-browser playback also rears its head when testing web
   applications.
2. If the test engineer wants to unleash the full power of the test tool they will need to have, or
   develop, solid programming skills with the tool’s scripting language.
3. With few exceptions, recording utilities—those tools which capture user interaction and insert
   validation functions—are only effective at roughing out a testcase. Thereafter, captured
   sequences will most often need to be cleaned up and/or generalized using the scripting
   language (see the sketch at the end of this list).
4. If an application has functionality which can’t be tested through the GUI you will need to:
   (a) use the tool’s ability to interface to DLLs—for Windows based applications;
   (b) use its SDK (software developer kit) or API if it supports one of these mechanisms;
   (c) use optional tools—at an additional cost—offered by the test tool vendor;
   (d) use other 3rd party non-GUI test tools more appropriate to the testing task.
5. If you are currently testing the application manually, you will initially need to increase the
   size of the test team by a minimum of one or two test engineers who possess good
   programming backgrounds. After a significant portion of the testcases have been written and
   debugged, you can start removing some of the manual test engineers. Payback comes at the
   end of the automation effort, not during the initial implementation.
6. If the test team does not contain at least one member previously involved with automating the
   test process, coming up to speed is no small task—no matter which tool is selected. Budget
   dollars and time for training classes and consulting offered by the tool vendor to get your test
   team up and running.
7. Budget 80 hours to do a detailed evaluation of each vendor’s automated test tool against your
   selected evaluation criteria, using one of your own applications. While you might initially
   recoil from this significant investment in time, keep in mind that the selected tool will likely
   be part of your department’s testing effort for many years—selecting the wrong tool will cost
   you many times those 80 hours in lost productivity.
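
To illustrate item 3, the sketch below contrasts a literally recorded sequence with a cleaned-up,
generalized version. It is a Python sketch against a hypothetical "app" automation object (none of
the tools discussed here expose this exact API); the point is the structural change, not the
specific calls.

    # Hypothetical automation API, used purely for illustration.

    # --- As recorded: hard-coded window title, field values, and ordering ---
    def login_recorded(app):
        app.window("Login - Acme Payroll 3.2").set_text("UserName", "jsmith")
        app.window("Login - Acme Payroll 3.2").set_text("Password", "secret01")
        app.window("Login - Acme Payroll 3.2").click("OK")

    # --- After cleanup: parameterized, reusable, and verified ---
    def login(app, user, password, window="Login"):
        """Log in as any user and fail loudly if the main window never appears."""
        dlg = app.window(window)      # logical name resolved by the tool's object map
        dlg.set_text("UserName", user)
        dlg.set_text("Password", password)
        dlg.click("OK")
        if not app.window("Main").exists(timeout=10):
            raise AssertionError(f"Login failed for user {user!r}")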








2. RECOMMENDED EVALUATION CRITERIA

2.1 GUI Object Recognition
Does the tool:
(a) Provide the ability to record each object in a window—or on a browser page—such that the
    logical object identifier used in the script is definable independently of the operating-system-
    dependent property [or properties] used by the tool to access that object at runtime?
(b) Provide the ability to associate (i.e. map) a logical object identifier with more than one
    operating-system-dependent property? And does the tool offer a property-definition
    mechanism that supports internationalization [if language localization is a testing
    requirement]?
(c) Provide the ability to record—and deal effectively with—dynamically generated objects
    [often encountered when testing web applications]?
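
As a concrete illustration of points (a) and (b), the sketch below maps one logical identifier to
several physical recognition properties, falling back from the most stable to the least. The Python
code and the find_element callable are hypothetical stand-ins, not the object-map format of any
particular tool; note that keying on a developer-assigned id rather than a visible label is also what
keeps recognition working in localized builds.

    # Minimal sketch of a logical object map (all names are illustrative only).
    OBJECT_MAP = {
        # logical name      ordered list of physical recognition properties
        "LoginDialog.OK": [
            {"automation_id": "btnOK"},               # preferred: stable developer-assigned id
            {"class_name": "Button", "label": "OK"},  # fallback: class plus visible label
            {"class_name": "Button", "index": 2},     # last resort: positional index
        ],
    }

    def resolve(find_element, logical_name):
        """Try each recorded property set in turn until one matches."""
        for props in OBJECT_MAP[logical_name]:
            element = find_element(**props)
            if element is not None:
                return element
        raise LookupError(f"No match for logical object {logical_name!r}")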


2.2 Platform Support
Are all of the required platforms [e.g. Windows NT 4.0, Windows XP, Windows Vista] supported for:
(a) testcase playback?
(b) testcase recording?
(c) testcase development [programming without recording support]?


2.3 Recording Browser Objects
Does the tool provide the ability to record against web applications under test, correctly
recognizing all browser page HTML objects, using the following browsers:
(a) IE7?
(b) IE6?
(c) Firefox?


2.4 Cross-browser Playback
Does the tool provide the ability to reliably and repeatedly play back test scripts against
browsers that were not used during object capture and testcase creation, with little or no:
(a) Changes to the GUI map (WinRunner), GUI declarations (Silk) or the equivalent in other
    tools?
(b) Changes to testcase code?
(c) Does the tool also provide some type of generic capability [without using sleep-like commands
    in the code] to deal with “browser not ready” conditions and correctly synchronize code
    execution—for example, when accessing a web page over a slow internet connection?
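
The usual alternative to scattering fixed sleeps through testcase code is to express a readiness
condition and poll it, so the testcase states what it is waiting for rather than how long to wait. The
helper below is a minimal Python sketch of that idea; the browser object and its ready_state()
method in the usage comment are invented names, and real tools expose their own built-in
synchronization calls.

    import time

    def wait_until(condition, timeout=30.0, poll=0.5):
        """Poll an arbitrary readiness condition instead of sleeping a fixed time."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            if condition():                  # zero-argument callable, True when ready
                return True
            time.sleep(poll)                 # short poll interval, hidden from testcases
        raise TimeoutError(f"Condition not met within {timeout} seconds")

    # Example usage with the assumed browser object:
    # wait_until(lambda: browser.ready_state() == "complete", timeout=60)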








2.5 Recording Java Objects
Does the tool:
(a) Provide the ability to record objects against, and see, all standard Swing, AWT and JFC 1.1.8
    and 1.2 objects when running the Java application under test?
(b) Provide the ability to record objects against [and interact with] non-standard Java classes
    required by the Java application under test (for example, the KLGroup’s 3rd party controls,
    when the application under test uses that 3rd party toolset)?
(c) Require that the platform’s static classpath environment variable be set with tool-specific
    classes, or can this be set within the tool on a test-suite-by-test-suite basis?


2.6 Java Playback
Does the tool:
(a) Reliably and repeatedly play back the evaluation testcases?
(b) Provide some type of generic capability [without using sleep-like commands in the code]
    to deal with “application not ready” conditions and correctly synchronize code execution?
    [This may or may not be an issue, depending on the application being tested.]


2.7 Visual Testcase Recording
Does the tool:
(a) Provide the ability to visually record testcases by interacting with the application under test as
    a real user would?
(b) Provide the ability, while visually recording a testcase, to interactively insert—without
    resorting to programming—validation statements?
(c) Provide the ability, while interactively inserting a validation statement, to
    visually/interactively select the validation properties (e.g. contents of a text field, focus on a
    control, control enabled)?
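
For reference, the kind of statement a recorder typically inserts looks roughly like the sketch
below; the function and the commented call are invented for illustration only and do not reflect
any particular tool’s generated code.

    def verify_property(actual, expected, description):
        """Minimal stand-in for the validation call a recorder would insert."""
        if actual != expected:
            raise AssertionError(f"{description}: expected {expected!r}, got {actual!r}")

    # A recorder would typically generate something like (hypothetical API):
    # verify_property(app.window("Invoice").get_text("Total"), "100.00", "Invoice total")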


2.8 Scripting Language
Is the test tool’s underlying scripting language:
(a) Object-oriented?
(b) Proprietary?


2.9 Recovery System
Does the tool support some type of built-in recovery system, which the programmer can
control/define, that drives the application under test back to a known state, especially in the case
where modal dialogs were left open when a testcase failure occurred?
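
The sketch below shows the general shape of such a recovery system in Python. Every name on
the app and log objects (close_all_modal_dialogs, navigate_to_main_window, passed, failed) is
hypothetical, since each tool wires its own recovery hooks differently.

    def run_with_recovery(testcase, app, log):
        """Run one testcase; on failure, drive the application back to a known state."""
        try:
            testcase(app)
            log.passed(testcase.__name__)
        except Exception as error:
            log.failed(testcase.__name__, error)
            # Recovery: dismiss anything the failed testcase may have left open,
            # then return to the application's base state before the next testcase.
            app.close_all_modal_dialogs()
            app.navigate_to_main_window()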






2.10 Custom Objects
What capabilities does the tool provide to deal with unrecognized objects in a window or on a
browser page? [Spend a fair amount of time evaluating this capability, as it is quite important].
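
Typical workarounds, when the tool supports them, include mapping a custom control class to a
standard class the tool already understands, or falling back to raw keyboard and coordinate input.
The sketch below is purely illustrative; the class names and the element API are hypothetical.

    # Hypothetical class-mapping table: treat a vendor's custom grid control as a
    # standard table so that recording and playback work against it.
    CUSTOM_CLASS_MAP = {
        "AcmeGridCtrl32": "Table",
        "AcmeFancyEdit":  "TextField",
    }

    def click_unrecognized(element, x_offset, y_offset):
        """Last-resort interaction: click at a pixel offset inside the control.

        Coordinate-based actions are fragile (they break when layout or resolution
        changes), which is why native recognition of custom objects matters.
        """
        left, top = element.screen_position()   # hypothetical element API
        element.click_at(left + x_offset, top + y_offset)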


2.11 Technical Support
What was the quality and timeliness of technical support received during product evaluation?
[Remember—it won’t get any better after you purchase the product, but it might get worse].


2.12 Internationalization Support
Evaluate the support for internationalization [also referred to as language localization] in the
following areas [if this is a testing requirement]:
(a) Object recognition
(b) Object content (such as text fields, text labels, etc.).
(c) Evaluate and highlight any built-in or add-on multi-language support offered by the vendor.


2.13 Reports
What type of reporting and logging capabilities does the tool provide?


2.14 Training & Hiring Issues
(a) What is your [not the vendor’s] estimated learning curve to become competent (i.e. able to
    write useful test scripts which may need to be rewritten later)?
(b) What is your estimated learning curve to become skilled (i.e. able to write test scripts which
    rarely need to be rewritten)?
(c) What is your estimated learning curve to become an expert (i.e. able to design frameworks)?
(d) What is the estimated availability of (i) potential employees and (ii) expert consultants
    skilled with this tool in your geographic area?


2.15 Multiple Test Suite Execution
(a) Can multiple test suites be driven completely from the tool [or from a command-line
    interface], thereby allowing any number of unrelated suites/projects to be executed under a
    cron-like job or shell, for true unattended operation?
(b) …including the ability to save the results log, as text, prior to or during termination/exit?
(c) …including the ability to return a reliable pass/fail status on termination/exit?
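
The sketch below shows what such unattended, cron-driven execution can look like. It assumes
the tool ships a command-line runner; the testrunner name and its flags are invented for
illustration, and the important part is the reliable numeric exit status.

    import subprocess
    import sys

    def run_suite(suite_path, log_path):
        """Run one suite via an assumed command-line runner and report pass/fail."""
        result = subprocess.run(["testrunner", "--suite", suite_path, "--log", log_path])
        # Exit code 0 means every testcase passed; non-zero means at least one failure.
        return result.returncode == 0

    if __name__ == "__main__":
        # Run every suite (no short-circuiting), then report an overall status.
        results = [run_suite(suite, suite + ".log") for suite in sys.argv[1:]]
        sys.exit(0 if all(results) else 1)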








2.16 Testcase Management
Does the tool support some type of test case management facility (either built-in or as an add-on)
that allows each test engineer to execute any combination of tests out of the full test suite for a
given project? How difficult is it to integrate manual testing results with automated test results?


2.17 Debugging Support
What type of debugging capabilities does the tool support to help isolate scripting and/or runtime
errors?


2.18 User Audience
Which of the following groups of users does the tool primarily target?
•   Test technicians possess good test mentalities, but often lack much, if any, background in
    programming or software development methodologies. They are the backbone of many test
    groups and have often spent years developing and executing manual testcases.
•   Test developers possess all of the test technician’s skill set, plus they have had some formal
    training in programming and limited experience working on a software development project
    and/or automated testcases.
•   Test architects possess all of the test developer’s skill set, plus they have many years of
    experience developing and maintaining automated test cases, as well as experience defining
    and implementing the test framework under which multiple automated test suites are
    developed. They are recognized experts with at least one automation tool.




                                                 6                                     Terry Horwath

Recommended for you

Test Automation
Test AutomationTest Automation
Test Automation

This document summarizes a presentation on test automation. It discusses why test automation is needed such as manual testing taking too long and being error prone. It covers barriers to test automation like lack of experience and programmer attitudes. An automation strategy is proposed, including categories of tests to automate and not automate. Best practices are provided such as having an automation engineer and following software development practices. Specific tools are also mentioned. Good practices and lessons learned are shared such as prioritizing tests and starting better practices with new development.

agiletest automation
QSpiders - Automation using Selenium
QSpiders - Automation using SeleniumQSpiders - Automation using Selenium
QSpiders - Automation using Selenium

The document discusses automation testing using Selenium. It provides an overview of Selenium, including what it is, its components like Selenium IDE, Selenium RC, Selenium Grid, and Selenium WebDriver. It explains the features and advantages of each component. Selenium is an open source tool that allows automated testing of web applications across different browsers and platforms. It supports recording and playback of tests and can help reduce testing time and costs through automation.

automationseleniumqspiders
Test automation methodologies
Test automation methodologiesTest automation methodologies
Test automation methodologies

it is about test automation methodologies which you should know before starting to automate your test cases.

testmethodologiesautomation

More Related Content

What's hot

Test Automation
Test AutomationTest Automation
Test Automation
rockoder
 
Test Automation and Selenium
Test Automation and SeleniumTest Automation and Selenium
Test Automation and Selenium
Karapet Sarkisyan
 
QA process Presentation
QA process PresentationQA process Presentation
QA process Presentation
Nadeeshani Aththanagoda
 
Test automation
Test automationTest automation
Test automation
Xavier Yin
 
Introduction to Test Automation
Introduction to Test AutomationIntroduction to Test Automation
Introduction to Test Automation
Pekka Klärck
 
Automation testing
Automation testingAutomation testing
Automation testing
Mona M. Abd El-Rahman
 
Test Automation
Test AutomationTest Automation
Test Automation
nikos batsios
 
QSpiders - Automation using Selenium
QSpiders - Automation using SeleniumQSpiders - Automation using Selenium
QSpiders - Automation using Selenium
Qspiders - Software Testing Training Institute
 
Test automation methodologies
Test automation methodologiesTest automation methodologies
Test automation methodologies
Mesut Günes
 
Performance testing : An Overview
Performance testing : An OverviewPerformance testing : An Overview
Performance testing : An Overview
sharadkjain
 
Automation Testing
Automation TestingAutomation Testing
Automation Testing
Sun Technlogies
 
Selenium Automation Framework
Selenium Automation  FrameworkSelenium Automation  Framework
Selenium Automation Framework
Mindfire Solutions
 
How to select the right automated testing tool
How to select the right automated testing toolHow to select the right automated testing tool
How to select the right automated testing tool
Katalon Studio
 
How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy
Impetus Technologies
 
Test Automation Strategies For Agile
Test Automation Strategies For AgileTest Automation Strategies For Agile
Test Automation Strategies For Agile
Naresh Jain
 
Katalon Studio - Successful Test Automation for both Testers and Developers
Katalon Studio - Successful Test Automation for both Testers and DevelopersKatalon Studio - Successful Test Automation for both Testers and Developers
Katalon Studio - Successful Test Automation for both Testers and Developers
Katalon Studio
 
Test automation framework
Test automation frameworkTest automation framework
Test automation framework
QACampus
 
Top ten software testing tools
Top ten software testing toolsTop ten software testing tools
Top ten software testing tools
JanBask Training
 
Top 20 best automation testing tools
Top 20 best automation testing toolsTop 20 best automation testing tools
Top 20 best automation testing tools
QACraft
 
Introduction to Automation Testing
Introduction to Automation TestingIntroduction to Automation Testing
Introduction to Automation Testing
Archana Krushnan
 

What's hot (20)

Test Automation
Test AutomationTest Automation
Test Automation
 
Test Automation and Selenium
Test Automation and SeleniumTest Automation and Selenium
Test Automation and Selenium
 
QA process Presentation
QA process PresentationQA process Presentation
QA process Presentation
 
Test automation
Test automationTest automation
Test automation
 
Introduction to Test Automation
Introduction to Test AutomationIntroduction to Test Automation
Introduction to Test Automation
 
Automation testing
Automation testingAutomation testing
Automation testing
 
Test Automation
Test AutomationTest Automation
Test Automation
 
QSpiders - Automation using Selenium
QSpiders - Automation using SeleniumQSpiders - Automation using Selenium
QSpiders - Automation using Selenium
 
Test automation methodologies
Test automation methodologiesTest automation methodologies
Test automation methodologies
 
Performance testing : An Overview
Performance testing : An OverviewPerformance testing : An Overview
Performance testing : An Overview
 
Automation Testing
Automation TestingAutomation Testing
Automation Testing
 
Selenium Automation Framework
Selenium Automation  FrameworkSelenium Automation  Framework
Selenium Automation Framework
 
How to select the right automated testing tool
How to select the right automated testing toolHow to select the right automated testing tool
How to select the right automated testing tool
 
How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy How to Design a Successful Test Automation Strategy
How to Design a Successful Test Automation Strategy
 
Test Automation Strategies For Agile
Test Automation Strategies For AgileTest Automation Strategies For Agile
Test Automation Strategies For Agile
 
Katalon Studio - Successful Test Automation for both Testers and Developers
Katalon Studio - Successful Test Automation for both Testers and DevelopersKatalon Studio - Successful Test Automation for both Testers and Developers
Katalon Studio - Successful Test Automation for both Testers and Developers
 
Test automation framework
Test automation frameworkTest automation framework
Test automation framework
 
Top ten software testing tools
Top ten software testing toolsTop ten software testing tools
Top ten software testing tools
 
Top 20 best automation testing tools
Top 20 best automation testing toolsTop 20 best automation testing tools
Top 20 best automation testing tools
 
Introduction to Automation Testing
Introduction to Automation TestingIntroduction to Automation Testing
Introduction to Automation Testing
 

Viewers also liked

Testing a Test: Evaluating Our Assessment Tools
Testing a Test: Evaluating Our Assessment ToolsTesting a Test: Evaluating Our Assessment Tools
Testing a Test: Evaluating Our Assessment Tools
Eddy White, Ph.D.
 
Types of test tools
Types of test toolsTypes of test tools
Types of test tools
Vaibhav Dash
 
Software testing tools
Software testing toolsSoftware testing tools
Software testing tools
Gaurav Paliwal
 
Software testing tools (free and open source)
Software testing tools (free and open source)Software testing tools (free and open source)
Software testing tools (free and open source)
Wael Mansour
 
Internet browers comparison
Internet browers comparisonInternet browers comparison
Internet browers comparison
ferristic
 
Browsers comparison
Browsers comparisonBrowsers comparison
Browsers comparison
Svetlana Puchkova
 
Software testing tools and its taxonomy
Software testing tools and its taxonomySoftware testing tools and its taxonomy
Software testing tools and its taxonomy
Himanshu
 
Automation Testing with Sikuli
Automation Testing with SikuliAutomation Testing with Sikuli
Automation Testing with Sikuli
lionpeal
 
Sikuli
SikuliSikuli
Practical Software Testing Tools
Practical Software Testing ToolsPractical Software Testing Tools
Practical Software Testing Tools
Dr Ganesh Iyer
 
Lecture 6
Lecture 6 Lecture 6
Lecture 6
Salina Saharudin
 
FITT Toolbox: Evaluation Criteria
FITT Toolbox: Evaluation CriteriaFITT Toolbox: Evaluation Criteria
FITT Toolbox: Evaluation Criteria
FITT
 
Object oriented testing
Object oriented testingObject oriented testing
Object oriented testing
Haris Jamil
 
Types of testing and their classification
Types of testing and their classificationTypes of testing and their classification
Types of testing and their classification
Return on Intelligence
 
ppt norm reference and criteration test
ppt norm reference and criteration testppt norm reference and criteration test
ppt norm reference and criteration test
Nur Arif S
 
Criteria for evaluation
Criteria for evaluationCriteria for evaluation
Criteria for evaluation
Juliet Cabiles
 
Selecting the Right Automated Testing tool
Selecting the Right Automated Testing tool Selecting the Right Automated Testing tool
Selecting the Right Automated Testing tool
Ho Chi Minh City Software Testing Club
 
Practical Sikuli: using screenshots for GUI automation and testing
Practical Sikuli: using screenshots for GUI automation and testingPractical Sikuli: using screenshots for GUI automation and testing
Practical Sikuli: using screenshots for GUI automation and testing
vgod
 
Chapter 7 Test and Measurement in Sports
Chapter 7 Test and Measurement in SportsChapter 7 Test and Measurement in Sports
Chapter 7 Test and Measurement in Sports
Vibha Choudhary
 
Norm-referenced & Criterion-referenced Tests
Norm-referenced & Criterion-referenced TestsNorm-referenced & Criterion-referenced Tests
Norm-referenced & Criterion-referenced Tests
Fariba Chamani
 

Viewers also liked (20)

Testing a Test: Evaluating Our Assessment Tools
Testing a Test: Evaluating Our Assessment ToolsTesting a Test: Evaluating Our Assessment Tools
Testing a Test: Evaluating Our Assessment Tools
 
Types of test tools
Types of test toolsTypes of test tools
Types of test tools
 
Software testing tools
Software testing toolsSoftware testing tools
Software testing tools
 
Software testing tools (free and open source)
Software testing tools (free and open source)Software testing tools (free and open source)
Software testing tools (free and open source)
 
Internet browers comparison
Internet browers comparisonInternet browers comparison
Internet browers comparison
 
Browsers comparison
Browsers comparisonBrowsers comparison
Browsers comparison
 
Software testing tools and its taxonomy
Software testing tools and its taxonomySoftware testing tools and its taxonomy
Software testing tools and its taxonomy
 
Automation Testing with Sikuli
Automation Testing with SikuliAutomation Testing with Sikuli
Automation Testing with Sikuli
 
Sikuli
SikuliSikuli
Sikuli
 
Practical Software Testing Tools
Practical Software Testing ToolsPractical Software Testing Tools
Practical Software Testing Tools
 
Lecture 6
Lecture 6 Lecture 6
Lecture 6
 
FITT Toolbox: Evaluation Criteria
FITT Toolbox: Evaluation CriteriaFITT Toolbox: Evaluation Criteria
FITT Toolbox: Evaluation Criteria
 
Object oriented testing
Object oriented testingObject oriented testing
Object oriented testing
 
Types of testing and their classification
Types of testing and their classificationTypes of testing and their classification
Types of testing and their classification
 
ppt norm reference and criteration test
ppt norm reference and criteration testppt norm reference and criteration test
ppt norm reference and criteration test
 
Criteria for evaluation
Criteria for evaluationCriteria for evaluation
Criteria for evaluation
 
Selecting the Right Automated Testing tool
Selecting the Right Automated Testing tool Selecting the Right Automated Testing tool
Selecting the Right Automated Testing tool
 
Practical Sikuli: using screenshots for GUI automation and testing
Practical Sikuli: using screenshots for GUI automation and testingPractical Sikuli: using screenshots for GUI automation and testing
Practical Sikuli: using screenshots for GUI automation and testing
 
Chapter 7 Test and Measurement in Sports
Chapter 7 Test and Measurement in SportsChapter 7 Test and Measurement in Sports
Chapter 7 Test and Measurement in Sports
 
Norm-referenced & Criterion-referenced Tests
Norm-referenced & Criterion-referenced TestsNorm-referenced & Criterion-referenced Tests
Norm-referenced & Criterion-referenced Tests
 

Similar to Testing Tool Evaluation Criteria

6. Testing Guidelines
6. Testing Guidelines6. Testing Guidelines
6. Testing Guidelines
Mohammad Nasir Uddin
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing Tools
Varuna Harshana
 
Software testing interview Q&A – Part 2
Software testing interview Q&A – Part 2Software testing interview Q&A – Part 2
Software testing interview Q&A – Part 2
Khoa Bui
 
Istqb Agile-tester Extension
Istqb Agile-tester ExtensionIstqb Agile-tester Extension
Istqb Agile-tester Extension
Girish Goutam
 
Hrishikesh_iitg_internship_report
Hrishikesh_iitg_internship_reportHrishikesh_iitg_internship_report
Hrishikesh_iitg_internship_report
Hrishikesh Malakar
 
Test automation - Building effective solutions
Test automation - Building effective solutionsTest automation - Building effective solutions
Test automation - Building effective solutions
Artem Nagornyi
 
Chapter 5 - Tools
Chapter 5 - ToolsChapter 5 - Tools
Chapter 5 - Tools
Neeraj Kumar Singh
 
Qa case study
Qa case studyQa case study
Qa case study
hopperdev
 
SOFTWARE TESTING
SOFTWARE TESTINGSOFTWARE TESTING
SOFTWARE TESTING
Aurobindo Nayak
 
Learn software testing with tech partnerz 3
Learn software testing with tech partnerz 3Learn software testing with tech partnerz 3
Learn software testing with tech partnerz 3
Techpartnerz
 
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
Impetus Technologies
 
Unit 5 st ppt
Unit 5 st pptUnit 5 st ppt
Unit 5 st ppt
Poonkodi Jayakumar
 
Txet Document
Txet DocumentTxet Document
Txet Document
Jayaprakash Perumalla
 
MTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshopMTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshop
Clemens Reijnen
 
Software Testing basics
Software Testing basicsSoftware Testing basics
Software Testing basics
Olia Khlystun
 
stlc
stlcstlc
Choosing a performance testing tool
Choosing a performance testing toolChoosing a performance testing tool
Choosing a performance testing tool
SebastinCastaoM
 
The Case for Agile testing
The Case for Agile testingThe Case for Agile testing
The Case for Agile testing
Cognizant
 
VAL-210-Computer-Validati-Plan-sample.pdf
VAL-210-Computer-Validati-Plan-sample.pdfVAL-210-Computer-Validati-Plan-sample.pdf
VAL-210-Computer-Validati-Plan-sample.pdf
SamehMostafa33
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2
Atul Pant
 

Similar to Testing Tool Evaluation Criteria (20)

6. Testing Guidelines
6. Testing Guidelines6. Testing Guidelines
6. Testing Guidelines
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing Tools
 
Software testing interview Q&A – Part 2
Software testing interview Q&A – Part 2Software testing interview Q&A – Part 2
Software testing interview Q&A – Part 2
 
Istqb Agile-tester Extension
Istqb Agile-tester ExtensionIstqb Agile-tester Extension
Istqb Agile-tester Extension
 
Hrishikesh_iitg_internship_report
Hrishikesh_iitg_internship_reportHrishikesh_iitg_internship_report
Hrishikesh_iitg_internship_report
 
Test automation - Building effective solutions
Test automation - Building effective solutionsTest automation - Building effective solutions
Test automation - Building effective solutions
 
Chapter 5 - Tools
Chapter 5 - ToolsChapter 5 - Tools
Chapter 5 - Tools
 
Qa case study
Qa case studyQa case study
Qa case study
 
SOFTWARE TESTING
SOFTWARE TESTINGSOFTWARE TESTING
SOFTWARE TESTING
 
Learn software testing with tech partnerz 3
Learn software testing with tech partnerz 3Learn software testing with tech partnerz 3
Learn software testing with tech partnerz 3
 
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
Identifying Software Performance Bottlenecks Using Diagnostic Tools- Impetus ...
 
Unit 5 st ppt
Unit 5 st pptUnit 5 st ppt
Unit 5 st ppt
 
Txet Document
Txet DocumentTxet Document
Txet Document
 
MTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshopMTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshop
 
Software Testing basics
Software Testing basicsSoftware Testing basics
Software Testing basics
 
stlc
stlcstlc
stlc
 
Choosing a performance testing tool
Choosing a performance testing toolChoosing a performance testing tool
Choosing a performance testing tool
 
The Case for Agile testing
The Case for Agile testingThe Case for Agile testing
The Case for Agile testing
 
VAL-210-Computer-Validati-Plan-sample.pdf
VAL-210-Computer-Validati-Plan-sample.pdfVAL-210-Computer-Validati-Plan-sample.pdf
VAL-210-Computer-Validati-Plan-sample.pdf
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2
 

Recently uploaded

Research Directions for Cross Reality Interfaces
Research Directions for Cross Reality InterfacesResearch Directions for Cross Reality Interfaces
Research Directions for Cross Reality Interfaces
Mark Billinghurst
 
Observability For You and Me with OpenTelemetry
Observability For You and Me with OpenTelemetryObservability For You and Me with OpenTelemetry
Observability For You and Me with OpenTelemetry
Eric D. Schabell
 
UiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs ConferenceUiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs Conference
UiPathCommunity
 
find out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challengesfind out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challenges
huseindihon
 
Manual | Product | Research Presentation
Manual | Product | Research PresentationManual | Product | Research Presentation
Manual | Product | Research Presentation
welrejdoall
 
20240702 QFM021 Machine Intelligence Reading List June 2024
20240702 QFM021 Machine Intelligence Reading List June 202420240702 QFM021 Machine Intelligence Reading List June 2024
20240702 QFM021 Machine Intelligence Reading List June 2024
Matthew Sinclair
 
20240704 QFM023 Engineering Leadership Reading List June 2024
20240704 QFM023 Engineering Leadership Reading List June 202420240704 QFM023 Engineering Leadership Reading List June 2024
20240704 QFM023 Engineering Leadership Reading List June 2024
Matthew Sinclair
 
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Chris Swan
 
20240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 202420240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 2024
Matthew Sinclair
 
What's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptxWhat's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptx
Stephanie Beckett
 
Coordinate Systems in FME 101 - Webinar Slides
Coordinate Systems in FME 101 - Webinar SlidesCoordinate Systems in FME 101 - Webinar Slides
Coordinate Systems in FME 101 - Webinar Slides
Safe Software
 
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdfPigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions
 
Details of description part II: Describing images in practice - Tech Forum 2024
Details of description part II: Describing images in practice - Tech Forum 2024Details of description part II: Describing images in practice - Tech Forum 2024
Details of description part II: Describing images in practice - Tech Forum 2024
BookNet Canada
 
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Bert Blevins
 
WPRiders Company Presentation Slide Deck
WPRiders Company Presentation Slide DeckWPRiders Company Presentation Slide Deck
WPRiders Company Presentation Slide Deck
Lidia A.
 
The Increasing Use of the National Research Platform by the CSU Campuses
The Increasing Use of the National Research Platform by the CSU CampusesThe Increasing Use of the National Research Platform by the CSU Campuses
The Increasing Use of the National Research Platform by the CSU Campuses
Larry Smarr
 
7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf
Enterprise Wired
 
How RPA Help in the Transportation and Logistics Industry.pptx
How RPA Help in the Transportation and Logistics Industry.pptxHow RPA Help in the Transportation and Logistics Industry.pptx
How RPA Help in the Transportation and Logistics Industry.pptx
SynapseIndia
 
What’s New in Teams Calling, Meetings and Devices May 2024
What’s New in Teams Calling, Meetings and Devices May 2024What’s New in Teams Calling, Meetings and Devices May 2024
What’s New in Teams Calling, Meetings and Devices May 2024
Stephanie Beckett
 
How Social Media Hackers Help You to See Your Wife's Message.pdf
How Social Media Hackers Help You to See Your Wife's Message.pdfHow Social Media Hackers Help You to See Your Wife's Message.pdf
How Social Media Hackers Help You to See Your Wife's Message.pdf
HackersList
 

Recently uploaded (20)

Research Directions for Cross Reality Interfaces
Research Directions for Cross Reality InterfacesResearch Directions for Cross Reality Interfaces
Research Directions for Cross Reality Interfaces
 
Observability For You and Me with OpenTelemetry
Observability For You and Me with OpenTelemetryObservability For You and Me with OpenTelemetry
Observability For You and Me with OpenTelemetry
 
UiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs ConferenceUiPath Community Day Kraków: Devs4Devs Conference
UiPath Community Day Kraków: Devs4Devs Conference
 
find out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challengesfind out more about the role of autonomous vehicles in facing global challenges
find out more about the role of autonomous vehicles in facing global challenges
 
Manual | Product | Research Presentation
Manual | Product | Research PresentationManual | Product | Research Presentation
Manual | Product | Research Presentation
 
20240702 QFM021 Machine Intelligence Reading List June 2024
20240702 QFM021 Machine Intelligence Reading List June 202420240702 QFM021 Machine Intelligence Reading List June 2024
20240702 QFM021 Machine Intelligence Reading List June 2024
 
20240704 QFM023 Engineering Leadership Reading List June 2024
20240704 QFM023 Engineering Leadership Reading List June 202420240704 QFM023 Engineering Leadership Reading List June 2024
20240704 QFM023 Engineering Leadership Reading List June 2024
 
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo...
 
20240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 202420240705 QFM024 Irresponsible AI Reading List June 2024
20240705 QFM024 Irresponsible AI Reading List June 2024
 
What's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptxWhat's New in Copilot for Microsoft365 May 2024.pptx
What's New in Copilot for Microsoft365 May 2024.pptx
 
Coordinate Systems in FME 101 - Webinar Slides
Coordinate Systems in FME 101 - Webinar SlidesCoordinate Systems in FME 101 - Webinar Slides
Coordinate Systems in FME 101 - Webinar Slides
 
Pigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdfPigging Solutions Sustainability brochure.pdf
Pigging Solutions Sustainability brochure.pdf
 
Details of description part II: Describing images in practice - Tech Forum 2024
Details of description part II: Describing images in practice - Tech Forum 2024Details of description part II: Describing images in practice - Tech Forum 2024
Details of description part II: Describing images in practice - Tech Forum 2024
 
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
Understanding Insider Security Threats: Types, Examples, Effects, and Mitigat...
 
WPRiders Company Presentation Slide Deck
WPRiders Company Presentation Slide DeckWPRiders Company Presentation Slide Deck
WPRiders Company Presentation Slide Deck
 
The Increasing Use of the National Research Platform by the CSU Campuses
The Increasing Use of the National Research Platform by the CSU CampusesThe Increasing Use of the National Research Platform by the CSU Campuses
The Increasing Use of the National Research Platform by the CSU Campuses
 
7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf7 Most Powerful Solar Storms in the History of Earth.pdf
7 Most Powerful Solar Storms in the History of Earth.pdf
 
How RPA Help in the Transportation and Logistics Industry.pptx
How RPA Help in the Transportation and Logistics Industry.pptxHow RPA Help in the Transportation and Logistics Industry.pptx
How RPA Help in the Transportation and Logistics Industry.pptx
 
What’s New in Teams Calling, Meetings and Devices May 2024
What’s New in Teams Calling, Meetings and Devices May 2024What’s New in Teams Calling, Meetings and Devices May 2024
What’s New in Teams Calling, Meetings and Devices May 2024
 
How Social Media Hackers Help You to See Your Wife's Message.pdf
How Social Media Hackers Help You to See Your Wife's Message.pdfHow Social Media Hackers Help You to See Your Wife's Message.pdf
How Social Media Hackers Help You to See Your Wife's Message.pdf
 

Testing Tool Evaluation Criteria

  • 1. AUTOMATED TEST TOOLS EVALUATION CRITERIA Terry Horwath Version 1.02 (1/18/07)
  • 2. Automated Test Tools Evaluation Criteria Version 1.02 (1/18/07) Table of Contents 1. INTRODUCTION 1 1.1 Author’s Background ...........................................................................................................1 1.2 Allocate Reasonable Resources and Talent..........................................................................1 1.3 Establish Reasonable Expectations ......................................................................................2 2. RECOMMENDED EVALUATION CRITERIA 3 2.1 GUI Object Recognition.......................................................................................................3 2.2 Platform Support ..................................................................................................................3 2.3 Recording Browser Objects..................................................................................................3 2.4 Cross-browser Playback .......................................................................................................3 2.5 Recording Java Objects ........................................................................................................4 2.6 Java Playback .......................................................................................................................4 2.7 Visual Testcase Recording ...................................................................................................4 2.8 Scripting Language...............................................................................................................4 2.9 Recovery System ..................................................................................................................4 2.10 Custom Objects ....................................................................................................................5 2.11 Technical Support.................................................................................................................5 2.12 Internationalization Support .................................................................................................5 2.13 Reports..................................................................................................................................5 2.14 Training & Hiring Issues ......................................................................................................5 2.15 Multiple Test Suite Execution ..............................................................................................5 2.16 Testcase Management...........................................................................................................6 2.17 Debugging Support...............................................................................................................6 2.18 User Audience ......................................................................................................................6 ii Terry Horwath
  • 3. Automated Test Tools Evaluation Criteria Version 1.02 (1/18/07) 1. INTRODUCTION This document provides a list of evaluation criteria which has proven useful to me when evaluating automated test tools like Mercury Interactive’s QuickTest Professional, WinRunner and Segue’s Silk over the last several years for a variety of clients. Hopefully some readers will find this information useful, such that it reduces your evaluation effort. The specific criteria used for each project differs based on the client’s: • testing environment, and • test engineers’ programming backgrounds and skill sets, and • type of software being tested [especially the software develop tool, such as Visual Basic, PowerBuilder, Java, browser based applications, etc.], and • application(s) testing requirements. The remainder of this chapter provides a variety of miscellaneous thoughts I have on automating the testing process, while Chapter 2 contains my list of potential evaluation criteria. Note that some of the Chapter 2 evaluation criteria is Java and web application testing oriented. Substitute your application development tool—for example Visual Basic or PowerBuilder—in the Java related evaluation criteria items. 1.1 Author’s Background I have designed custom frameworks as well as hundreds of test cases using Silk/QaPartner from 1994 (version 1.0) through 2004 (version 5.5). with WinRunner (version 5) and Test Director in 1999 and 2000 and with QuickTest Professional since 2006 (versions 8 and 9). 1.2 Allocate Reasonable Resources and Talent Most software testing projects do not fail because of the selected test tools—virtually all of top automated testing tools on the market can be used to do an adequate job, even when the test tool is not well matched with the software development environment. Rather I believe that most failures are due to a combination of the following reasons: 1. Test engineers fail to treat the effort to develop a large number of complex test cases and test suites as a large software development project—it is crucial to apply good software development methodology to produce a test product, which includes defining requirements, developing a schedule, implementing each test suite using a shared custom framework of well known libraries and guidelines, as well as using a software version control system. 2. Sufficient manpower and time are not allocated early enough in the application development cycle. Along with incomplete testing this also leads to the phenomenon of test automation targeted for use with Release N actually being delivered and used with Release N+1. 3. Test technicians with improper skills are assigned to use these automated test tools. Users of these tools must have strong test mentalities and in all but a few situations they must also possess solid programming skills with the automation tool’s scripting language. 1 Terry Horwath
  • 4. Automated Test Tools Evaluation Criteria Version 1.02 (1/18/07) 1.3 Establish Reasonable Expectations Through their promotional literature automated test tool vendors often establish unrealistic expectations in one or more of the following areas: • What application features and functions can truly be tested with the tool. • The skill level required to effectively use the tool. • How useful the tool’s automatic recording capabilities are. • How quickly effective testcases can be produced. This is unfortunate because in the hands of test engineers possessing the proper skill set all of the top automated test tools can be used to test significant portions of virtually any GUI-centric application. Use the following assumptions when reviewing this document and planning your evaluation effort: 1. Even when a test tool is well matched with a software development tool, the test tool will still only be able to recognize a subset of the application’s objects—windows, buttons, controls, etc.—without taking special programming actions. This subset will be large when the development engineers create window objects using the development tool’s standard class libraries. The related issue of cross-browser playback also rears it head when testing web applications. 2. If the test engineer wants to unleash the full power of the test tool they will need to have, or develop, solid programming skills with the tool’s scripting language. 3. With few exceptions recording utilities—those tools which capture user interaction and insert validation functions—are only effective in roughing out a testcase. Thereafter captured sequences will most often need be cleaned up and/or generalized using the scripting language. 4. If an application has functionality which can’t be tested through the GUI you will need to: (a) use the tool’s ability to interface to DLLs—for Windows based applications; (b) use its SDK (software developer kit) or API if it supports one of these mechanisms; (c) use optional tools—at an additional cost—offered by the test tool vendor; (d) use other 3rd party non-GUI test tools more appropriate to the testing task. 5. If you are currently manually testing the application to be automated you will need to initially increase the size of the test team by a minimum of 1 or 2 test engineers—who possess good programming backgrounds. After a significant portion of testcases have been written and debugged you can start removing some of the manual test engineers. Pay back comes at the end of the automation effort, not during the initial implementation. 6. If the test team does not contain at least one member previously involved with automating the test process, coming up to speed is no small task—no matter which tool is selected. Budget dollars and time for training classes and consulting offered by the tool vendor to get your test team up and running. 7. Budget 80 hours of time to do a detailed evaluation of each vendor’s automated test tool against your selected evaluation criteria, using one of your applications. While you might initially recoil from this significant investment in time, keep in mind that the selected tool will likely be part of your department’s testing effort for many years—selecting the wrong tool will reduce productivity many times over 80 hours. 2 Terry Horwath
2. RECOMMENDED EVALUATION CRITERIA

2.1 GUI Object Recognition

Does the tool:
(a) Provide the ability to record each object in a window—or on a browser page—such that a logical object identifier, used in the script, is definable independent of the operating-system-dependent property [or properties] used by the tool to access that object at runtime?
(b) Provide the ability to associate (i.e. map) the logical object identifier with more than one operating-system-dependent property? And does the tool offer a property definition technique which supports internationalization [if language localization is a testing requirement]?
(c) Provide the ability to record—and deal effectively with—dynamically generated objects [often encountered when testing web applications]?

2.2 Platform Support

Are all of the required platforms [i.e. NT 4.0, Windows XP, Windows Vista, etc.] supported for:
(a) testcase playback?
(b) testcase recording?
(c) testcase development [programming without recording support]?

2.3 Recording Browser Objects

Does the tool provide the ability to record against web applications under test, correctly recognizing all browser page HTML objects, using the following browsers:
(a) IE7?
(b) IE6?
(c) Firefox?

2.4 Cross-browser Playback

Does the tool provide the ability to reliably and repeatedly play back test scripts against browsers which were not used during object capture and testcase creation, with little or no:
(a) changes to the GUI map (WinRunner), GUI declarations (Silk), or the equivalent in other tools?
(b) changes to testcase code?
(c) Additionally, does the tool provide some type of generic capability [without using sleep-like commands in the code] to handle a "browser not ready" condition, such as access to a web page over a slow internet connection, so that code execution remains correctly synchronized? (See the sketch following this section.)
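The "browser not ready" capability in 2.4(c) usually boils down to polling a readiness condition with a timeout instead of sprinkling fixed sleep() delays through testcases. The minimal Python sketch below shows the idea; what the evaluation criterion asks is whether the tool provides this kind of synchronization built in, so testcase authors do not have to hand-roll it. The is_ready callable and the commented usage line are hypothetical.

    import time

    # A minimal sketch of sleep()-free synchronization: poll a readiness
    # condition with a timeout rather than guessing a fixed page-load delay.
    # "is_ready" is a hypothetical callable supplied by the caller.
    def wait_until(is_ready, timeout_s=30.0, poll_interval_s=0.5):
        """Poll until is_ready() returns True, or raise after timeout_s seconds."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if is_ready():
                return
            time.sleep(poll_interval_s)   # short poll interval, not a fixed guess
        raise TimeoutError(f"page/application not ready within {timeout_s:.1f} seconds")

    # Hypothetical usage against a page object:
    #   wait_until(lambda: results_page.table("Orders").exists(), timeout_s=60)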
2.5 Recording Java Objects

Does the tool:
(a) Provide the ability to record objects against, and see, all standard Swing, AWT and JFC 1.1.8 and 1.2 objects when running the Java application under test?
(b) Provide the ability to record objects against [and interact with] non-standard Java classes required by the Java application under test (for example, KLGroup's 3rd party controls, when the application under test uses that 3rd party toolset)?
(c) Require that the platform's static CLASSPATH environment variable be set with tool-specific classes, or can this be set within the tool on a test-suite-by-test-suite basis?

2.6 Java Playback

Does the tool:
(a) Reliably and repeatedly play back the evaluation testcases?
(b) Provide some type of generic capability [without using sleep-like commands in the code] to handle an "application not ready" condition so that code execution remains correctly synchronized? [This may or may not be an issue, depending on the application being tested.]

2.7 Visual Testcase Recording

Does the tool:
(a) Provide the ability to visually record testcases by interacting with the application under test as a real user would?
(b) Provide the ability, while visually recording a testcase, to interactively insert—without resorting to programming—validation statements?
(c) Provide the ability, while interactively inserting a validation statement, to visually/interactively select validation properties (i.e. contents of a text field, focus on a control, control enabled, etc.)?

2.8 Scripting Language

Is the test tool's underlying scripting language:
(a) object-oriented?
(b) proprietary?

2.9 Recovery System

Does the tool support some type of built-in recovery system, which the programmer can control/define, that drives the application under test back to a known state, especially in the case where modal dialogs were left open when a testcase failure occurred? (A sketch of the idea follows this section.)
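As an illustration of what the recovery system in 2.9 is expected to do, the minimal Python sketch below restores a known base state after each testcase, whether it passed or failed. The app object, its active_modal_dialog() query, and the window names are assumptions for illustration only; a tool with a built-in recovery system lets you declare equivalent behavior rather than code it by hand in every suite.

    # A minimal sketch (assumed framework code, not any vendor's built-in
    # feature) of what a recovery system does: close stray modal dialogs left
    # open by a failed testcase and drive the application back to a base state.
    def recover_to_base_state(app, max_dialogs=10):
        for _ in range(max_dialogs):
            dialog = app.active_modal_dialog()   # hypothetical query
            if dialog is None:
                break
            dialog.close()
        app.window("Main").activate()            # hypothetical "known state"

    def run_testcase(app, testcase):
        try:
            testcase(app)
            return "PASS"
        except Exception as error:
            print(f"{testcase.__name__} failed: {error}")  # real tools write to a results log
            return "FAIL"
        finally:
            recover_to_base_state(app)           # always restore the known state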
2.10 Custom Objects

What capabilities does the tool provide to deal with unrecognized objects in a window or on a browser page? [Spend a fair amount of time evaluating this capability, as it is quite important.]

2.11 Technical Support

What was the quality and timeliness of the technical support received during product evaluation? [Remember—it won't get any better after you purchase the product, but it might get worse.]

2.12 Internationalization Support

Evaluate the support for internationalization [also referred to as language localization] in the following areas [if this is a testing requirement]:
(a) Object recognition.
(b) Object content (such as text fields, text labels, etc.).
(c) Any built-in or add-on multi-language support offered by the vendor; evaluate and highlight it.

2.13 Reports

What type of reporting and logging capabilities does the tool provide?

2.14 Training & Hiring Issues

(a) What is your [not the vendor's] estimated learning curve to become competent (i.e. able to write useful test scripts, which may need to be rewritten later)?
(b) What is your estimated learning curve to become skilled (i.e. able to write test scripts which rarely need to be rewritten)?
(c) What is your estimated learning curve to become an expert (i.e. able to design frameworks)?
(d) What is the estimated availability of potential (i) employees and (ii) expert consultants skilled with this tool in your geographic area?

2.15 Multiple Test Suite Execution

(a) Can multiple test suites be driven completely from the tool [or from a command line interface], thereby allowing any number of unrelated suites/projects to be executed under a cron-like job or shell? (For true unattended operation; see the sketch following this section.)
(b) …including the ability to save the results log, as text, prior to or during termination/exit?
(c) …including the ability to return a reliable pass/fail status on termination/exit?
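The unattended operation asked about in 2.15 typically looks like the minimal Python sketch below when driven from a cron-like job: each suite is launched from the command line, its results log is saved as text, and a reliable exit status is returned. The "testtool-cli" executable, its --suite and --log flags, and the suite names are purely hypothetical stand-ins; substitute whatever command-line interface the tool under evaluation actually provides.

    import subprocess
    import sys

    # Example suite names; these are placeholders, not real projects.
    SUITES = ["smoke_suite", "login_suite", "orders_suite"]

    def run_all(suites):
        failures = 0
        for suite in suites:
            # "testtool-cli" and its flags are hypothetical; a real tool's
            # command-line runner will have its own name and options.
            result = subprocess.run(
                ["testtool-cli", "--suite", suite, "--log", f"{suite}.log"],
                capture_output=True, text=True)
            print(f"{suite}: {'PASS' if result.returncode == 0 else 'FAIL'}")
            failures += (result.returncode != 0)
        return failures

    if __name__ == "__main__":
        # Return a single pass/fail status so cron or a CI shell can react.
        sys.exit(1 if run_all(SUITES) else 0)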
2.16 Testcase Management

Does the tool support some type of testcase management facility (either built-in or as an add-on) that allows each test engineer to execute any combination of tests out of the full test suite for a given project? How difficult is it to integrate manual testing results with automated test results?

2.17 Debugging Support

What type of debugging capabilities does the tool support to help isolate scripting and/or runtime errors?

2.18 User Audience

Which of the following groups of users does the tool primarily target?
•  Test technicians possess good test mentalities, but often lack much, if any, background in programming or software development methodologies. They are the backbone of many test groups and have often spent years developing and executing manual testcases.
•  Test developers possess all of the test technician's skill set; in addition, they have had some formal training in programming and limited experience working on a software development project and/or automated testcases.
•  Test architects possess all of the test developer's skill set; in addition, they have had many years of experience developing and maintaining automated testcases, as well as experience defining and implementing the test framework under which multiple automated test suites are developed. They are recognized experts with at least one automated tool.