The document discusses best practices for performance testing. It provides an overview of the typical performance testing process, including defining goals, planning tests, scripting tests, executing tests, analyzing results, and delivering findings. It also discusses considerations for choosing testing tools and resources as well as common pitfalls to avoid, such as not testing, poor planning, relying on customers to find issues, using the wrong tools, and failing to properly isolate variables.
Application Lifecycle Transformation: A DevOps Discussion - by David Miller and Melissa Luongo
This document discusses DevOps and its key practices for transforming an organization's application lifecycle. It provides the following key points:
1) DevOps addresses all aspects of the software delivery lifecycle by promoting continuous delivery, collaboration between development and operations, and incorporating customer feedback at every step.
2) There are 12 key practices that organizations leverage for DevOps transformation, including agile development, continuous integration, automated testing and deployment, and cross-functional teams.
3) A DevOps application delivery pipeline diagram shows the continuous flow from development to production with feedback at each stage.
Automated acceptance testing is an important part of the deployment pipeline. It tests that the application meets business requirements and provides value to users. Creating maintainable acceptance test suites involves deriving tests from acceptance criteria, layering the tests, and avoiding direct coupling to the GUI. Non-functional requirements like performance and capacity also need to be tested. The deployment process should be automated and standardized across environments using techniques like blue-green deployment and canary releases to allow rolling back changes if needed.
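The blue-green idea described above reduces to a small decision rule: keep two identical environments and move the "live" pointer only when the idle one passes its health checks, which also gives an instant rollback path. A minimal sketch, with a hypothetical `health_check()` and router structure not tied to any specific deployment tool:

```python
# Minimal blue-green cutover sketch. The router, environment dicts,
# and health_check() are illustrative assumptions, not a real tool's API.

def health_check(env: dict) -> bool:
    # Hypothetical check: every service in the idle environment
    # must report "up" before traffic is switched to it.
    return all(status == "up" for status in env["services"].values())

def cut_over(router: dict, blue: dict, green: dict) -> dict:
    """Point live traffic at the idle environment if it is healthy;
    otherwise keep the current one (the rollback is simply not switching)."""
    idle = green if router["live"] == "blue" else blue
    if health_check(idle):
        router["live"] = idle["name"]
    return router

router = {"live": "blue"}
blue = {"name": "blue", "services": {"web": "up", "db": "up"}}
green = {"name": "green", "services": {"web": "up", "db": "up"}}
print(cut_over(router, blue, green)["live"])  # → green
```

Canary releases follow the same shape, except the router sends a small percentage of traffic to the new environment instead of all of it.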
TaaS is a model of software testing whereby TRASYS undertakes the activity of testing applications and solutions for our customers as a service, with the following traits:
On-demand services
service delivery function & service governance
Well-defined & repeatable services (service catalogue, RACI, entry & exit criteria)
Outsourcing using a shared service centre
Pay per use (based on test effort estimation techniques)
Well defined test process & methodology
Professional / certified test resources
Supported by (cloud based and virtualised) test tools & test environments
Performance Engineering Case Study V1.0 - sambitgarnaik
This document discusses performance testing solutions and services offered by IonIdea. It provides an overview of IonIdea's performance testing tools for load testing, performance testing, and monitoring application and infrastructure performance. It also describes IonIdea's testing services such as performance testing, test automation consulting, and outsourced testing. Finally, it presents a case study example of how IonIdea used performance triage techniques including profiling and load testing to identify and address performance issues for an online banking application.
Oak Systems is an independent software testing company headquartered in Bangalore, India that provides quality assurance and testing services. It has over 100 testers with expertise in various domains and technologies. Oak Systems offers different engagement models for outsourced testing including onsite/offsite testing, managed testing services, and staff/team-based time and materials models. It aims to provide high quality testing services at competitive prices through mature processes and a positive testing culture.
Q Labs Webinar on Testcase Prioritization [Feb 20, 2009] - Vipul Gupta
This document discusses test case prioritization and introduces the qLabs model. It outlines challenges with traditional prioritization approaches and limitations. The qLabs model is a dynamic, two-step process that calculates a Test Case Adjusted Priority (TAP) based on usage, importance, impact and risk factors to identify critical test cases and areas for a given release. A case study demonstrates how qLabs prioritizes 3000 test cases for a product with new modules and a shortened test cycle.
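The TAP calculation itself is not spelled out above, but a weighted combination of the four named factors captures the spirit of it. A sketch with assumed weights and 1-10 ratings; this is an illustration, not the actual qLabs formula:

```python
# Illustrative weighted priority score over the four factors the qLabs
# model names (usage, importance, impact, risk). Weights, ratings, and
# test case names are made up for the sketch.

WEIGHTS = {"usage": 0.3, "importance": 0.3, "impact": 0.2, "risk": 0.2}

def adjusted_priority(factors: dict) -> float:
    """Combine 1-10 factor ratings into a single priority score."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

def prioritize(test_cases: dict) -> list:
    """Return test case IDs sorted from most to least critical."""
    return sorted(test_cases,
                  key=lambda tc: adjusted_priority(test_cases[tc]),
                  reverse=True)

cases = {
    "TC-login":  {"usage": 9, "importance": 9, "impact": 8, "risk": 7},
    "TC-report": {"usage": 3, "importance": 5, "impact": 4, "risk": 2},
}
print(prioritize(cases))  # → ['TC-login', 'TC-report']
```

With scores in hand, a shortened test cycle simply takes the top of the sorted list until the time budget is spent.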
Manual testing of SAP systems is challenging due to the complexity of SAP and heavy reliance on data. Test automation is limited due to data dependencies and frequent process changes. As a result, manual testing plays a key role but is underserved by tools. Deciding what to test involves balancing cost and risk, while creating test scenarios is time-consuming without standardization. Exploratory testing by business users provides flexibility but limits documentation for developers. Overall, better tools are needed to optimize the manual testing process for SAP systems.
Are you new to performance testing? These slides are for those of you who want to explore and learn where and how to start testing application performance. During this web event, our performance testing experts reveal the key pieces and parts of performance testing, including the phases of a test and how HP LoadRunner supports each phase.
- The document discusses the challenges of traditional web performance testing and introduces trusted cloud web performance testing as an effective alternative using real-time monitoring, sophisticated analytics, and affordable load and performance testing capabilities.
- A case study is presented of a tax filing website that was performance tested using cloud-based testing and was able to detect 27 critical issues while ramping up to 300,000 concurrent users, achieving results 75 times better than traditional testing.
- An accreditation process is described that uses cloud-based performance testing to validate a website's performance over 25 hours against key metrics and benchmarks.
The document provides a summary of Chitra Stroup's professional experience as a QA Lead. She has over 15 years of experience in testing software in both Agile and Waterfall environments. Currently she leads testing teams of up to 6 people and mentors junior testers. Her skills include test automation, defect tracking, and experience testing various platforms and languages.
StarWest 2011 Manoj Narayanan presentation 1.0 - manoj7698
Crowdsourced testing, also known as crowd testing, leverages a global community to accomplish business goals such as product testing. It has become relevant for testing due to the need to test large numbers of scenarios. Crowd testing can be utilized for web, mobile, and game applications. Organizations are exploring various options to integrate crowd testing into their overall testing strategy including using it as an add-on, for specialized testing scenarios, based on risk and priority, or fully integrating it. As the crowd testing market matures, players are expanding their services, forming partnerships, and differentiating their offerings.
DevOps Evolution - The Next Generation? - Marc Hornbeek
Where is DevOps in its maturity? Is DevOps life near its beginning, middle, mature, near end-of-life or near extinction? What does the next generation look like? This presentation posits the next generation will be a new level of process optimization driven by coupling analytics with DevOps pipeline tools and associated role shifts.
At SQA Solution, the objectives of SAP system testing are to verify that the installed system, which includes the SAP software, custom code, and manual procedures, performs per business requirements:
Executes as specified and without error.
Validates with the users and management that the delivered system performs in accordance with the stated system requirements.
Ensures that the system works with other existing systems, including but not limited to interfaces, conversions, and reports.
- Mahesh Chintala is a software tester with over 4 years of experience in manual and automation testing. He has worked on insurance software projects for The Hartford and Direct Line Group.
- His skills include Selenium, Quality Center, Guidewire, and testing web and desktop applications. He is proficient in test design, execution, defect tracking, and reporting.
- He led a team of testers and was responsible for test planning, coordination, and client communication on two large projects involving policy and claims management software.
This document provides a summary of Ott Calfee's qualifications, including over 5 years of experience in quality assurance and automation testing using tools like Selenium and JUnit. They have expertise in software development lifecycles like Waterfall and Agile, as well as testing methodologies. Recent experience includes quality assurance roles for projects at Safeway involving a point-of-sale system and in-store ordering. Previous roles involved testing training software, middleware systems at AT&T, and content verification.
Aligning Software Testing With Modern Age Development Practices - Aspire Systems
This document discusses aligning software testing practices with modern development approaches. It describes the evolution from waterfall to iterative to agile development models. Agile practices like scrum, lean, kanban, and DevOps are discussed as engineering methods. Current business trends around digital transformation, mobility, and customization are also covered. The challenges of testing in these modern contexts include shorter release cycles, evolving requirements, and multi-channel delivery. The document proposes a shift left testing approach, greater automation, and a transition from quality assurance to quality engineering. Specific practices discussed include behavior driven development, test-driven development, and balancing manual and automated testing. Key technical areas like cloud, security, and tools are also summarized.
The document provides information on types of software testing, test strategy and planning, and test estimation techniques. It describes various types of testing including functional, system, end-to-end, load, security, and others. It also discusses test strategy, test planning, and creating test plans. Finally, it outlines several techniques for estimating testing efforts such as best guess, analogies, work breakdown structure, three-point estimation, and function point analysis.
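Of the estimation techniques listed, three-point (PERT) estimation is the most mechanical: the estimate is E = (O + 4M + P) / 6, with a standard deviation of (P - O) / 6. A small sketch with made-up hour figures:

```python
# Three-point (PERT) test-effort estimation as named above:
# E = (O + 4M + P) / 6, std dev = (P - O) / 6.
# The hour figures below are invented for illustration.

def three_point(optimistic: float, most_likely: float, pessimistic: float):
    """Return (estimate, standard deviation) in the same units as the inputs."""
    estimate = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return estimate, std_dev

e, sd = three_point(optimistic=8, most_likely=12, pessimistic=22)
print(f"estimate={e:.1f}h, std_dev={sd:.2f}h")  # → estimate=13.0h, std_dev=2.33h
```

Summing the per-task estimates (and combining their variances) gives the overall test-effort figure that feeds the test plan.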
The document discusses software testing in cloud platforms. It outlines the cloud testing model which includes Testing as a Service (TaaS), Testing Support as a Service (TSaaS), and Testing inside Cloud. It reviews related work on modeling cloud-based applications for testing and proposing automated testing platforms as a service. The document also highlights potential risks in cloud testing and presents commercial tools and open research issues in cloud testing.
Software testing as a service (STaaS) is an outsourcing model where testing activities are outsourced to a third party testing specialist. STaaS providers simulate real world testing environments and provide on-demand software testing services. STaaS allows organizations to access automation tools and skilled testers flexibly without large investments. Popular STaaS services include automated regression testing, performance testing, security testing, and testing of cloud applications. STaaS can reduce costs for organizations and help deliver high quality software faster.
This document provides an overview of OpenSTA, an open source load testing tool. It discusses OpenSTA's history and capabilities, key performance testing terminology, and the basic components and process for using OpenSTA. OpenSTA can be used to create scripts with the modeler, run load tests with the commander, and analyze results in Excel. It supports HTTP/HTTPS protocols and allows modeling realistic loads through scripting and variable management.
Know More About Rational Performance - Snehamoy K - Roopa Nadkarni
Rational Performance Tester (RPT) is a tool for performance testing web applications. It can simulate thousands of virtual users to test an application's performance and scalability. RPT works with many web application frameworks and protocols. It combines access to protocol data with the ability to insert custom Java code, enabling advanced test scenarios. RPT uses a distributed architecture where test agents inject load from separate machines while the Eclipse workbench is used for test creation and analysis. Proper configuration of workbench and agent machines is important for optimizing test performance.
Load Testing SAP Applications with IBM Rational Performance Tester - Bill Duncan
This technical solution briefly describes how the SAP CoE / Value Prototyping successfully leveraged IBM Rational Performance Tester 8.0 to test an ABAP Web Dynpro application before it went into production. The paper shows how IBM testing tools can be used to simulate user load on any SAP system and measure the system’s behavior under load. The solution described in this paper was used in an SAP internal project to measure a new SAP application before it was implemented internally.
This document describes testing a website called Plants by WebSphere using IBM Rational Performance Tester. It provides details on recording a test, creating datapools, data correlation, generating different reports on test results, organizing the project using folders, and creating performance test schedules. Testing goals include measuring server response times and identifying potential bottlenecks when the site is subjected to high transaction volumes.
Take a load off! Load testing your Oracle APEX or JDeveloper web applications - Sage Computing Services
Geeeez, after demanding you unit test, system test, black box test, white box test, test-test-test everything, your manager is now demanding you load test your brand spanking new Oracle web application. How on earth can you do this?
This technical presentation will explain the concepts behind preparing for load testing and the HTTP protocol's request/response model, with live demonstrations using Oracle's HTTP Analyzer and Apache's JMeter to stress test your Oracle web application.
The presentation is suitable for anybody, be it DBAs or developers, who is concerned about the performance of any web-based application, possibly an APEX, JDeveloper, or third-party web application. Knowledge of APEX or JDeveloper is not mandatory for this presentation, and they will not be covered in any depth.
This document provides an overview of performance testing and the Rational Performance Tester tool. It discusses why performance testing is important, different types of performance testing, performance engineering methodology, performance objectives and metrics. It also provides an overview of the Rational Performance Tester tool, describing its test creation, editing, workload scheduling, execution and results evaluation capabilities.
This document provides a basic introduction to load and stress testing with Apache JMeter. It explains the objectives of performance, load, and stress testing. It then guides the user through downloading and installing JMeter, creating a simple test plan with a thread group and HTTP request sampler, adding a listener to view results, running the test, and interpreting basic results from the summary report. While this only scratches the surface of JMeter's capabilities, it is intended to demonstrate the basic functionality and get users started with creating and running their first simple test plan.
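What a JMeter thread group plus summary-report listener does can be approximated in a few lines: several worker threads repeatedly invoke a sampler, and the timings are aggregated into min/avg/max and throughput. A toy sketch where `send_request()` is a stand-in for a real HTTP sampler:

```python
# Toy analogue of a JMeter thread group + summary report. This is a
# conceptual sketch: send_request() fakes a 10-30 ms request instead
# of making a real HTTP call.

import random
import statistics
import threading
import time

def send_request() -> float:
    """Hypothetical sampler: pretend a request takes 10-30 ms."""
    elapsed = random.uniform(0.010, 0.030)
    time.sleep(elapsed)
    return elapsed

def run_load(threads: int = 5, loops: int = 10) -> dict:
    """Run `threads` concurrent workers, `loops` samples each,
    and aggregate the results the way a summary report does."""
    samples, lock = [], threading.Lock()

    def worker():
        for _ in range(loops):
            t = send_request()
            with lock:
                samples.append(t)

    start = time.time()
    pool = [threading.Thread(target=worker) for _ in range(threads)]
    for th in pool:
        th.start()
    for th in pool:
        th.join()
    wall = time.time() - start
    return {"samples": len(samples),
            "min": min(samples),
            "avg": statistics.mean(samples),
            "max": max(samples),
            "throughput": len(samples) / wall}

print(run_load())  # e.g. 50 samples with an average around 20 ms
```

In JMeter itself, the thread count and loop count correspond to the Thread Group settings, and the aggregation is done by a listener rather than by your own code.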
This document discusses new features in IBM Rational Performance Tester Version 8.1. It describes enhancements that help with performance testing in agile environments, for performance test specialists, and for performance analysts. Key updates include improved definition and reporting of performance requirements, enhanced test case development features, and new analysis capabilities for monitoring resources and viewing run-time data.
Load testing simulates multiple users accessing an application simultaneously to evaluate performance under different load scenarios. There are three main types of load testing:
1. Performance testing gradually increases load to determine the maximum number of users/requests per second an application can handle.
2. Stress testing pushes load beyond normal limits to identify the breaking point and ensure error handling.
3. Soak testing subjects an application to high load over an extended period to check for resource allocation problems, memory leaks, and server overloading.
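The three load shapes above differ only in how the virtual-user count evolves over time: a gradual ramp, a ramp past normal limits, or a long plateau. A sketch of the three profiles, with arbitrary example figures (the 200-user ceiling and durations are invented):

```python
# Sketch of the three load shapes described above, expressed as the
# target number of virtual users at each minute of a run. All figures
# are illustrative examples, not recommendations.

def performance_ramp(minutes: int, peak_users: int) -> list:
    """Gradually increase load toward the expected maximum."""
    return [round(peak_users * (t + 1) / minutes) for t in range(minutes)]

def stress_profile(minutes: int, peak_users: int, overload: float = 1.5) -> list:
    """Push load beyond normal limits to find the breaking point."""
    return performance_ramp(minutes, round(peak_users * overload))

def soak_profile(minutes: int, peak_users: int) -> list:
    """Hold a high, constant load for an extended period."""
    return [peak_users] * minutes

print(performance_ramp(5, 200))  # → [40, 80, 120, 160, 200]
print(stress_profile(5, 200))    # → [60, 120, 180, 240, 300]
print(soak_profile(3, 200))      # → [200, 200, 200]
```

Tools like JMeter express the same idea through ramp-up time, thread count, and test duration settings rather than an explicit per-minute schedule.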
The tool JMeter is commonly used for load testing and allows simulating many users and transactions. It can test HTTP, databases, and other components. Plugins extend its functionality, and distributed testing increases the load it can generate.
Rational Performance Tester is a tool that identifies system performance bottlenecks. It simplifies test creation, load generation, and data collection to help ensure applications can accommodate required user loads. Scripting involves recording user actions and inserting transaction points. Tests are then executed according to schedules that can run scripts across multiple remote machines in parallel to simulate different user loads.
This document outlines a performance test plan for Sakai 2.5.0. It describes the objectives, approach, test types, metrics, goals, tools, and data preparation. The objectives are to validate that Sakai meets minimum performance standards and to test any new or changed tools. Tests include capacity, consistent-load, and single-function stress tests. Metrics such as response time, CPU utilization, and errors will be measured. Goals include an average response time under 2.5s, a maximum under 30s, CPU utilization under 75%, and support for 500 concurrent users. Silk Performer will be used to run tests against a Sakai/Tomcat/Oracle environment. Data for over 92,000 students and 1,557 instructors will be preloaded.
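Pass/fail goals like the ones in this plan reduce to simple threshold checks once the metrics are collected. A sketch using the thresholds quoted above and invented measurements:

```python
# Evaluate measured results against the plan's stated goals
# (avg response < 2.5 s, max response < 30 s, CPU < 75%).
# The measured numbers below are invented for illustration.

GOALS = {"avg_response_s": 2.5, "max_response_s": 30.0, "cpu_pct": 75.0}

def evaluate(measured: dict) -> dict:
    """Return a per-metric verdict: True means the goal was met."""
    return {metric: measured[metric] < limit
            for metric, limit in GOALS.items()}

measured = {"avg_response_s": 1.8, "max_response_s": 12.4, "cpu_pct": 81.0}
verdict = evaluate(measured)
print(verdict)  # cpu_pct fails its goal; both response-time goals pass
```

Real plans usually also break these checks out per transaction and per load level, but the verdict logic is the same.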
This document provides an overview and agenda for a presentation on automation testing using IBM Rational Functional Tester. It discusses what automation testing is, why it is useful, and when it should be implemented. It also addresses common myths about automation testing and provides best practices for setting up a successful automation framework. Finally, it gives an introduction to the features and capabilities of IBM Rational Functional Tester, including the recording and playback process for automated tests.
This performance test plan outlines objectives to compare the responsiveness and resource utilization of a current production system and a new proposed production system. It defines the scope, dependencies, and risks. Tools like JMeter and PerfMon will be used to execute load tests on the systems and analyze results. Performance testing activities include installing tools, implementing tests, executing tests at typical loads, monitoring results, and delivering a test plan, results, and metrics.
Continuous Testing through Service Virtualization - TechWell
The demand to accelerate software delivery and for teams to continuously test and release high quality software sooner has never been greater. However, whether your release strategy is based on schedule or quality, the entire delivery process hits the wall when agility stops at testing. When software or services that are part of the delivered system, or required environments are unavailable for testing, the entire team suffers. Al Wagner explains how to remove these testing interruptions, decrease project risk, and release higher quality software sooner. Using a real-life example, learn how service virtualization can be applied across the lifecycle to shift integration, functional, and performance testing to the left. Gain an understanding of how service virtualization can be incorporated into your automated build and deployment process, making continuous testing a reality for your organization. Learn what service virtualization can do for you and your stakeholders. The ROI is worth it!
Ness provides performance testing and environment comparison services to help companies maintain stability during transitions and releases. Their performance on demand solution offers load testing, ongoing monitoring, results analysis, and testing infrastructure. This holistic approach aims to improve testing and monitoring while reducing costs and increasing confidence in application launches.
Kingshuk Dasgupta leads pLab, which helps institutionalize performance engineering across the enterprise. pLab aims to promote performance awareness, educate teams, benchmark technologies, and build a shared testing environment. Performance is defined as a system's ability to meet objectives for response time, stability, scalability, and efficiency. Issues caused by poor performance include increased costs and lost income/competitiveness. pLab monitors key metrics like response times and outages to ensure service level targets are met.
The document provides an overview of performance testing, including:
- Defining performance testing and comparing it to functional testing
- Explaining why performance testing is critical to evaluate a system's scalability, stability, and ability to meet user expectations
- Describing common types of performance testing like load, stress, scalability, and endurance testing
- Identifying key performance metrics and factors that affect software performance
- Outlining the performance testing process from planning to scripting, testing, and result analysis
- Introducing common performance testing tools and methodologies
- Providing examples of performance test scenarios and best practices for performance testing
New Creation Information Technologies provides independent software testing and verification services. They offer functional and non-functional testing capabilities across the entire testing lifecycle. Their services include test planning, test case development, test automation, defect tracking, and reporting to help clients reduce costs and improve quality.
The document discusses the need for enhanced software quality training. It notes that current education lacks depth and real-world experience. A new approach to training is needed that focuses on building a strong conceptual foundation and practical skills through hands-on learning of techniques, tools, and best practices. This should involve real-world projects, continuous learning, and training that is interactive and never fully ends.
This presentation briefly describes the solutions provided by Impetus's Testing Center of Excellence, qLabs. Please send your comments to qLabs@impetus.co.in
http://www.impetus.com/qLabs
Rational Quality Manager provides test management capabilities including test planning, execution, and analysis. It allows for collaboration between team members and provides dashboards and reports. RQM is part of the IBM Rational Software Delivery Platform and integrates with other Rational tools through the Jazz technology platform. RQM provides centralized management of test assets, environments, and utilization metrics to improve efficiency of the testing process.
The document provides information about LMS's software testing services. It outlines their capabilities including expertise across various domains, industries and technologies. It details their comprehensive testing methodologies, tools, templates and frameworks. It also describes their skilled workforce, alliances with tool vendors, delivery model, certifications and more. Their goal is to provide scalable, high-quality testing services to suit clients' needs.
The document introduces performance testing basics and methodology using Oracle Application Testing Suite. It covers types of performance testing like load testing, stress testing, and volume testing. It emphasizes the importance of setting up realistic user scenarios and test scripts. The testing environment should replicate production and use dedicated agent machines to generate load. Performance testing helps identify bottlenecks and determine scalability.
Automated performance testing simulates real users to determine an application's speed, scalability, and stability under load before deployment. It helps detect bottlenecks, ensures the system can handle peak usage, and provides confidence that the application will work as expected on launch day. The process involves evaluating user needs, drafting test scripts, executing different types of load tests, and monitoring servers and applications to identify performance issues or degradation over time.
Automated performance testing simulates real users to determine an application's speed, scalability, and stability under load before deployment. It helps detect bottlenecks, ensures the system can handle peak load, and provides confidence that the application will work as expected on launch day. The process involves evaluating user expectations and system limits, creating test scripts, executing load, stress, and duration tests while monitoring servers, and analyzing results to identify areas for improvement.
The document discusses factors to consider when choosing a test automation tool and framework. It describes how manual testing is time-consuming and prone to errors, while automation testing addresses these issues. The key steps in selecting a tool are to analyze requirements, skill sets, costs, and evaluate tools based on parameters like ease of use, support, and integration. Implementing a hybrid framework combines the benefits of modular, data-driven and keyword-driven approaches. Proof of concept testing potential tools helps confirm the right selection. Choosing tools and frameworks requires effort but pays off in project success.
1) Traditional load testing is limited in its ability to accurately measure end-user experience and identify issues with third-party components.
2) Load testing 2.0 uses real user testing from geographically distributed locations to more realistically drive large volumes of load and uncover regional response time discrepancies and external errors.
3) An online retailer used load testing 2.0 to identify that a third-party component was insufficient under load, affecting the performance of their overall application.
Are You Ready For More Visitors? Cognizant Gomez Jan20 - Compuware APM
1) Traditional load testing is limited in its ability to accurately measure end-user experience and identify issues with third-party components.
2) Load testing 2.0 uses real user testing from geographically distributed locations to better understand regional response times and external factors that impact performance.
3) A case study showed that load testing 2.0 uncovered poor response times for key revenue regions that traditional load testing failed to detect.
Neev uses a scrum-based Agile development methodology and a proven Extended Delivery Center model of engagement, all designed to ensure high-quality, timely deliverables.
This document discusses how test automation can enhance product quality and accelerate time-to-revenue. It outlines Aspire's test automation services including consulting, development, execution and maintenance of automated test scripts. The document promotes Aspire's test automation framework called PropelQ and tools that integrate with development workflows to deliver quality code and reduce testing costs.
AfterTest Madrid March 2016 - DevOps and Testing Introduction - Peter Marshall
This document discusses continuous testing in the context of DevOps. It defines continuous testing as including automated testing, managing production and non-production environments, application monitoring, and evaluating business objectives. Continuous testing relies on DevOps activities like automating builds, infrastructure, deployments, and monitoring. It advocates for smaller, more frequent deliveries through practices like test automation, infrastructure as code, and treating testing as integral to the software delivery process. The conclusion emphasizes automation, configuration management, and obtaining frequent feedback to enable continuous testing.
1. The document provides an introduction to service virtualization training, covering topics such as an overview of SV capabilities, a demo of the main components, and what's new in the upcoming SV 5.0 release.
2. It discusses how digital transformation, agile/DevOps adoption, and emerging technologies are disrupting testing and increasing the need for test automation.
3. Case studies show how service virtualization can provide significant benefits such as reduced wait times, improved service availability, shortened software cycles, and increased test coverage.
Similar to Best Practices In Load And Stress Testing Cmg Seminar[1] (20)
Best Practices In Load And Stress Testing Cmg Seminar[1]
1. Best Practices in Performance Testing Jennifer Turnquist Storage Service Line Director Lionbridge Technologies
2. Lionbridge Profile Public Company (Nasdaq: LIOX) Nearly $400M in revenues Profitable Deep expertise across the application life cycle Application Development & Maintenance Testing (Independent V&V) Content Development, Conversion & Enhancement Globalization Worldwide scale and capability Over 4000 employees operating in 25 countries (Scale) SEI CMM Level 5 certified process model (Quality) 8 of the world’s 10 most valuable companies are Lionbridge customers BusinessWeek Global 1000, July 2004
3. Services Designed around our Client’s Need Lionbridge A Trusted Partner Around the World Global Development & Testing Solutions Global Language & Content Solutions Interps Software Development Lifecycle Application Development Testing & Certification Maintenance & Support Full Content Lifecycle Localization/Translation Technical Publications eLearning Courseware Off-shore platforms leverage staff in China, India, and Eastern Europe Global footprint enables local interaction and facilitates worldwide release and support Trusted, US-based public company protects against IP loss Localization services spanning more than 80 languages Proprietary web-architected TM and terminology solution accelerates production and improves consistency Authoring and eLearning development services integrate seamlessly with localization to address global demand
4. VeriTest: Setting the Standard in Testing Since 1987 World’s largest independent testing company Over 400 test architects, engineers, and analysts in 11 labs across US, Europe, Asia Rapid expansion in VeriTest India From PDAs and PCs to 32-way servers Data center class storage lab Industry leader Exclusive provider and architect of industry- leading certification programs Developer of PC Magazine benchmarks Test and publish industry standard ISP benchmarks Operate globally-networked onsite to offshore model
5. The Lionbridge Team Local Connections, Global Efficiency 4,000+ Worldwide Staff Experience and Efficiency
6. Today’s Agenda Why Test Performance? Different Types of Performance Testing Performance Testing Roadmap Choosing the Right Testing Tools Top 10 performance testing pitfalls
7. "The standard philosophy of 'test to destruction'... will probably give you an idea of roughly how many users your site can handle at once, but it won't always tell you why the site fails to function properly. And without knowing why, you're not likely to be able to do much about it..." -- Extreme Tech
8. Why Test Performance?
- The internet and IT infrastructure are crucial to business
- Users (employees, business partners, customers) rely on portals, applications, and data to do their jobs
- The cost of failure can be devastating
- Performance testing in the enterprise is intermittent and cyclical, often prompted by upgrades
- Testing is highly specialized
9. The high cost of not conducting performance testing
- Performance testing is often overlooked until disaster strikes
- Lost and abandoned sales are the most visible result of poor performance, but the efficiency of mission-critical systems also directly impacts business productivity
- Preventing problems (lost productivity, lost business, lost reputation, and even injury or death) is a major incentive
- Knowing the vital performance metrics gives IT departments ammunition when planning and justifying purchasing decisions
- Provides the ability to demonstrate to investors and other critical stakeholders that the company’s infrastructure is adequate
10. Events that trigger performance testing
- Build vs. buy decisions
- Evolving requirements
- Technology due diligence
- Consolidating servers
- Deploying a SAN
- Deploying or upgrading an enterprise application
- Migrating to a new platform
- Adding features
- Responding to public critique
- Enhancements driven by buying trends
- Acquiring or merging a business
- Launching a new product
- Enhancing a web application
- Promoting an offering
- Doing any of the above globally
12. Conduct the right test to get the right results
- Load testing: determines the response time and throughput during typical user load
- Stress testing: determines the peak user load the system can handle
- Volume testing: determines the problems that occur during long-term user activity
- Component testing: determines the performance and behavior of a specific component
- Benchmark testing: measures the performance of a system or component relative to a standard
- Transaction cost analysis: determines the system resources consumed by a single transaction
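The two outputs of a load test named above, response time and throughput, can be made concrete with a small sketch. The Python below is illustrative only and not from the deck; `do_request` is a hypothetical stand-in for whatever transaction the system under test exposes:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def do_request():
    # Hypothetical transaction; replace with a real HTTP call or query.
    time.sleep(0.01)

def load_test(users=10, requests_per_user=20):
    """Run concurrent simulated users; report throughput and p95 latency."""
    latencies = []

    def one_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            do_request()
            latencies.append(time.perf_counter() - start)

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(one_user)
    elapsed = time.perf_counter() - start

    latencies.sort()
    return {
        "throughput_rps": (users * requests_per_user) / elapsed,
        "p95_latency_s": latencies[int(0.95 * len(latencies))],
    }

print(load_test())
```

Raising `users` until throughput stops climbing (or latency degrades) turns the same harness into the stress test described above.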
13. Performance Testing Roadmap
Define: identify stakeholders; agree on goals of testing; determine budget; determine schedule constraints; agree to promotion strategy
Plan: outline resources available; determine staffing plan; engage test lab (if needed); verify basic functionality; generate use cases; capture user activity logging information; analyze user activity profile; model user activity; choose the tool(s); identify re-usable script components; assign resources needed for scripting and testing; create test environment
Script: design scripts; create scripts; validate scripts; build script library
Test: execute tests; collect data; analyze test results; run possible iterations; troubleshoot bottlenecks; tune system; retest; log non-performance failures
Communicate: outline context; draft results; provide feedback to stakeholders; deliver action items; finalize report(s); promote results
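"Model user activity" in the roadmap typically means weighting scripted transactions by their observed frequency in the activity logs, so the generated load mirrors the real usage mix. A minimal sketch, with transaction names and weights invented purely for illustration:

```python
import random

# Hypothetical activity profile derived from production logs:
# 70% of actions browse, 20% search, 10% check out.
ACTIVITY_PROFILE = {"browse": 0.70, "search": 0.20, "checkout": 0.10}

def next_transaction(rng=random):
    """Pick the next scripted transaction according to the profile."""
    actions = list(ACTIVITY_PROFILE)
    weights = [ACTIVITY_PROFILE[a] for a in actions]
    return rng.choices(actions, weights=weights, k=1)[0]

# Each virtual user would call next_transaction() in a loop, so over a
# long run the transaction mix converges on the modeled profile.
sample = [next_transaction() for _ in range(1000)]
print(sample.count("browse") / len(sample))  # roughly 0.70
```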
14. An overview of the performance testing process After the test is initiated, the load generator systems begin accessing the system under test using the designed usage patterns. Depending on whether the test uses a global, local, or isolated configuration, the load generators may be located worldwide or contained entirely within a test lab. The one critical configuration requirement for the load-generating systems is that they have enough network bandwidth to access the system under test in a realistic manner, without becoming a constraint themselves. If bandwidth does become a constraint, adding machines to the pool of load generators typically fixes the problem. For global and local tests, the Internet is an important factor in the configuration; for an isolated configuration, it is not.
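The bandwidth-adequacy check described above can be done with back-of-the-envelope arithmetic before the test runs. A sketch, with every figure invented for illustration:

```python
def generators_needed(virtual_users, avg_response_bytes,
                      requests_per_user_per_sec, generator_capacity_mbps):
    """Estimate how many load generators are needed so the client side
    is not the bandwidth bottleneck."""
    # Total offered load in megabits per second.
    total_mbps = (virtual_users * requests_per_user_per_sec
                  * avg_response_bytes * 8) / 1_000_000
    # Keep each generator below ~70% of its link to leave headroom.
    usable_mbps = 0.7 * generator_capacity_mbps
    count = max(1, -(-total_mbps // usable_mbps))  # ceiling division
    return int(count), total_mbps

# 2,000 users each fetching a 50 KB response twice a second, driven
# from generators with 100 Mbps links.
print(generators_needed(2000, 50_000, 2, 100))  # → (23, 1600.0)
```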
15. An overview of the performance testing process Once a performance test is initiated, it can run for anywhere from several minutes to several days, depending on the test goal. During the run, the test tool monitors and collects performance data from all of the components within the system under test, such as the Web server, application server, or database server. This monitor data is analyzed along with the performance data collected at the load-generating clients to determine the overall performance as well as the potential system bottlenecks. In a typical performance test cycle, the performance bottlenecks are located, fixed, and iteratively retested to ensure that the fixes work as designed.
16. High Level Picture of the Process
Overcome resource limitations:
- Replace testers with “Virtual Users”
- Run many Virtual Users on few machines
- A controller manages the Virtual Users
- Run repeatable tests with scripted actions
- Get meaningful results with analysis tools
(Diagram: load generation driving the system under test.)
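The controller/virtual-user split pictured above can be sketched in a few lines of Python. This is a hedged illustration, not any vendor's implementation; `scripted_action` is a hypothetical placeholder for one recorded script step:

```python
import threading
import time

def scripted_action(results):
    # Placeholder for one recorded, repeatable script step.
    start = time.perf_counter()
    time.sleep(0.005)  # stand-in for hitting the system under test
    results.append(time.perf_counter() - start)

class Controller:
    """Starts virtual users, runs them for a fixed duration, gathers results."""

    def __init__(self, virtual_users, duration_s):
        self.virtual_users = virtual_users
        self.duration_s = duration_s
        self.results = []

    def _virtual_user(self, stop_at):
        # Each virtual user loops its scripted action until time is up.
        while time.perf_counter() < stop_at:
            scripted_action(self.results)

    def run(self):
        stop_at = time.perf_counter() + self.duration_s
        threads = [threading.Thread(target=self._virtual_user, args=(stop_at,))
                   for _ in range(self.virtual_users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return len(self.results)  # completed transactions

completed = Controller(virtual_users=5, duration_s=0.2).run()
print(completed)
```

One thread per virtual user is enough for a sketch; real tools multiplex many virtual users per process to run hundreds on a single machine, as the slide notes.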
17. You don’t have to go it alone
- Build and train internal resources
- Hire contractors
- Utilize service offerings from test tool vendors
- Rely on the application provider
- Engage a consulting firm or systems integrator
- Partner with an independent testing company
18. Important considerations for choosing the resources
- Deadlines
- Testing skills and experience
- Technology and/or application expertise
- Frequency and scale of testing requirements
- Infrastructure requirements
- Risk assessment
- Market factors
19. The vast number of performance testing tools can be overwhelming
20. Important considerations for choosing the right tool
- Do you already own the license?
- Do you have the internal resources to script and execute?
- Will it meet the test objectives?
- Is it compatible with your technology objectives?
- Does it fit within your budget constraints?
- Do you have the training and expertise to analyze the results?
- Does it match the frequency of your testing needs?
21. Leading performance tools
- Spirent Avalanche/Reflector. Pros: uses SST TracePlus to provide a record and playback feature; no additional hardware required beyond the appliance. Cons: expensive hardware to purchase.
- Mercury Interactive LoadRunner. Pros: compatible with numerous protocols; excellent data analysis tools; WAN emulation; Web transaction breakdown monitor. Cons: expensive license; requires a unique license for each protocol type.
- Segue SilkPerformer. Pros: “Lite” version offered at a reduced price; excellent data analysis tools; root cause analysis tools included. Cons: Windows only; expensive license.
- RadView WebLoad. Pros: inexpensive license; good data analysis tools. Cons: covers few protocols; primarily Web-based automation.
22. Popular Benchmark Tools (tool: workload simulated)
VeriTest tools:
- VeriTest NetBench: CIFS file / network traffic
- VeriTest WebBench: HTTP / Web traffic
Microsoft tools:
- WCAT: HTTP(S) traffic
- WMLS: streaming media
- LOADSIM: Exchange email traffic
- DBHammer: SQL database traffic
SPEC benchmarks:
- SPECjbb (SPEC Java Business Benchmark): exercises the CPU
- SPECsfs: NFS file / network traffic
Other industry standard tools:
- IOMeter: block level data transfer, OLTP (database) traffic
- IOZone: block level data transfer
- TSScaling: terminal services traffic
23. Manual testing may be your best tool
- Frequency of testing requirements
- Rate of change
- Limitations of available tools
25. Top 10 performance testing pitfalls
1. Not testing.
2. Lack of clearly defined test objectives and poor planning.
3. Relying exclusively on beta customers to find performance issues.
4. Using the wrong tool for the job.
5. Introducing too many variables simultaneously into a test.
6. Failing to test how your product or system is actually used.
7. Conducting load testing in a vacuum.
8. Treating performance testing like a one-time event.
9. Assuming that the scripting effort will be short and simple.
10. Finding functional bugs during performance runs.
26. Conclusion
- Companies rely on systems to conduct business efficiently and effectively
- Performance testing ensures that your users get reliable and timely access to the resources they need
- Performance testing mitigates the risk of time and money lost to poor performance
- A fully integrated performance testing program is the preventative medicine that keeps your system from becoming an inaccessible and costly resource
- Though it may seem counterintuitive at first to slow your deployment for performance test planning and execution, the payoff in time, money, and quality will be substantial and will come soon