Here are a few potential questions from the document:
- What is the true value of ISTQB certifications beyond just checking a box for management? How can the knowledge be applied practically?
- How can metrics be designed and used effectively to assess quality and test coverage in an agile environment? What are some examples of valid and invalid metrics?
- What artifacts or information are useful to include in a test plan, even for agile teams using tools like JIRA? How can a test plan provide value beyond additional paperwork?
- What techniques can be used to estimate defect severity effectively when multiple testers with different perspectives are involved? How can consistency be achieved?
- How can root cause analysis be applied
The document discusses how test managers are often seen as "black sheep" who raise issues without solutions and cause delays. It argues that test managers need to shift from a reactive to proactive role by getting involved early in projects, changing attitudes, and applying a test management dashboard to provide transparency and value. The dashboard would use KPIs and metrics to track testing progress, quality, risks, and deliver early warnings so test managers are seen as project victors rather than victims.
This presentation looks at how to make any improvement project a success, using some fun case-study examples of how Geoff has seen things go wrong; from the depths of despair we will look at how things can be made successful. View webinar recording - https://testhuddle.com/resource/test-process-improvement-hard-can/
1) The document discusses root cause analysis techniques such as the 5 Whys approach, used to identify underlying causes of problems that are not obvious from symptoms. It provides an example of using 5 Whys to determine why a customer was unhappy.
2) Lessons-learned processes are described, including organizing workshops at the end of projects to identify positive and negative technical and management lessons, derive solutions, and prevent repeat mistakes.
3) Applying lessons learned from previous similar projects is advised: select applicable lessons upfront and define specific actions to address them over the course of new projects.
EuroSTAR Software Testing Conference 2011 presentation on Quality In Use by Isabel Evans. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Paul Gerrard presented a new test process and tool called Cervaya that combines elements of structured and exploratory testing. The process involves testers surveying features using Cervaya to iteratively build system models and test plans. This shifts testing earlier in the development process. Cervaya logs tester activity, supports real-time collaboration, and could generate documentation. The goal is to make testing more aligned with agile and continuous delivery approaches. Gerrard invited collaboration on further developing Cervaya.
The document describes a project defect model used at Ericsson to control project quality and performance. It tracks defects inserted during phases like specification and design, and detects defects during testing phases. This provides early signals about quality risks. On a pilot project, the model helped clarify requirements, enforce design rules, and focus testing. It accurately predicted defects at release, allowing better planning. The model has now been adopted for several projects to improve quality tracking, risk management, and decision making.
Going live is a huge accomplishment, but the journey has only just begun. Now that you’re operating in your new system, what are your plans to get the most out of it and minimize disruptions? Hindsight is 20-20. Gain strategies from someone who lives in the post-implementation world. She not only knows the common pitfalls and setbacks, but will also share tools and techniques to resolve and prevent them! Join Julie, NAV Support Lead, to gain strategies, tools and techniques to succeed at making your vision a reality after go-live. Julie Bilodeau BDO Canada LLP BDO Connections 2016 | Knowledge Pod
EuroSTAR Software Testing Conference 2009 presentation on Test Process Improvement on a Shoestring by Ruud Teunissen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
In this webinar, Dave Haeffner (Elemental Selenium, USA) discusses how to:
- Build an integrated feedback loop to automate test runs and find issues fast
- Set up your own infrastructure or connect to a cloud provider
- Dramatically improve test times with parallelization
https://huddle.eurostarsoftwaretesting.com/resource/webinar/use-selenium-successfully/
Wouldn’t it be nice if you could have more insight into the quality of a product while it is being developed, and not afterwards? Would you like to be able to estimate how many defects are inserted into the product in a certain phase, and how effective a (test) phase is in capturing those defects? To optimize the focus and effort of your test phases in relation to how many defects they will find? This presentation will show a simple but very effective model that makes this possible: the Project Defect Model.

The aim of the Project Defect Model is to track product quality, take corrective actions, and reduce quality risks. To get more insight into the quality of the product during development, the software development processes must be measured from two views: introduction and detection of defects. Introduction happens during the specification, design and coding phases; defects are introduced either into documents or into the actual product. Detection of defects is done via inspections and tests during all phases of the project.

A tool was developed using a spreadsheet. Its purpose was to estimate the number of defects per phase, and to track all defects discovered in inspections and tests against these estimates. The tool supported analysis of the data with both calculated values and graphs comparing actuals to estimates, in terms of current status and trends over time.

The Project Defect Model has been beneficial to projects. It has helped with estimating, planning, and tracking quality during the project, including an estimate of the number of defects left in the released product. The quality data has been used in the project, together with time and cost data, to make better decisions on test, review and inspection, and design. It has also identified quality risks at an early stage, helping the project take corrective actions and make decisions on product release and maintenance capacity planning.
Finally, the model provided insight into the effectiveness of the verification activities, supporting effective process improvement.

Paper: The presentation covers a defect planning/tracking tool and approach. Focus will be upon:
• Goals: What was the purpose of the model, why was it developed, and what did we want to achieve?
• How: The definition of the model, its implementation and its application.
• Tools: The tool that was developed to implement the model, how it works, and its strengths.
• Results: How did the model and tool help the project? Did it live up to its purpose?
• Success factors: What were the key issues that we dealt with successfully?
• Future: How is this model used in future projects, and what could further increase its benefits?
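The core arithmetic of a model like this is simple: defects estimated as inserted per phase flow through a chain of verification activities, each of which is expected to catch some fraction of whatever is still present. A minimal sketch of that idea in Python follows; the phase names, insertion estimates, and effectiveness figures are illustrative assumptions, not figures from the Ericsson projects described above.

```python
# Illustrative sketch of a Project Defect Model calculation.
# All numbers below are assumed for the example, not real project data.

# Estimated defects inserted per development phase.
inserted_est = {"specification": 40, "design": 60, "coding": 100}

# Assumed detection effectiveness of each verification activity:
# the fraction of defects still present that the activity finds.
detect_eff = {"inspection": 0.6, "unit_test": 0.5, "system_test": 0.7}

def estimate_residual(inserted, effectiveness):
    """Run the inserted defects through the detection activities in order.
    Returns (expected defects found per activity, defects left at release)."""
    remaining = sum(inserted.values())
    found = {}
    for activity, eff in effectiveness.items():
        found[activity] = remaining * eff
        remaining -= found[activity]
    return found, remaining

found, residual = estimate_residual(inserted_est, detect_eff)
for activity, n in found.items():
    print(f"{activity}: expected to find {n:.0f} defects")
print(f"estimated defects left at release: {residual:.0f}")
```

In practice the spreadsheet tool described above would also record the *actual* defects found by each activity; comparing actuals against these per-activity estimates is what gives the early warning signal (e.g. an inspection finding far fewer defects than expected suggests either better-than-estimated quality or an ineffective inspection).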
I believe that our existing models of testing are not fit for purpose – they are inconsistent, controversial, partial, proprietary and stuck in the past. They are not going to support us in the rapidly emerging technologies and approaches. The certification schemes that should represent the interests and integrity of our profession don’t, and we are left with schemes that are popular, but have low value, lower esteem and attract harsh criticism. My goal in proposing the New Model is to stimulate new thinking in this area. eurostarconferences.com testhuddle.com
This document discusses choosing software development processes and methods that fit an organization's needs. It argues that one size does not fit all and that the development leader should understand their work and mix of projects in order to choose appropriate methods. Projects can involve varying levels of innovation and uncertainty, and the document provides examples of analytics and practices that fit different types of projects, from routine low-innovation work to high-innovation projects with more uncertainty. It emphasizes the need to measure progress and continuously adapt methods based on outcomes.
Over the past three years, our company’s test team has grown from three lonesome testers to a community of nine – with more planned. Since we don’t see testers as “click monkeys”, but as valuable and integrated project members who bring a specific skill set to the table, it’s important for us to choose testers well and to train them in various areas so that they can contribute, grow and see their own career path within testing. To structure our internal tester training program, we have been developing role descriptions, education paths and career options for our testers, which I’d like to share with you in this webinar. View webinar - https://huddle.eurostarsoftwaretesting.com/resource/webinar/growing-company-test-community-roles-paths-testers/
Lean pilots provide an innovative framework for solving enterprise challenges through agile and lean methodologies. Case studies highlight successes in automating contracts, streamlining third party onboarding, improving the DUNS research process, and ensuring Privacy Shield compliance. Lessons learned include welcoming early failure, using cross-functional self-organizing teams, making decisions with data, and ensuring enterprise collaboration. The framework establishes a repeatable process for running lean experiments to minimize risk and waste.
EuroSTAR Software Testing Conference 2011 presentation on Practical Approaches to Motivating Testers by Dr. Tafline Murnane & Dr. Stuart Reid. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Running a single beta test is hard. Running multiple beta tests can make you want to tear your hair out! In this webinar, join our beta managers as they share their advice for handling the demands of multiple beta tests at once. Click this link to access the on-demand webinar: https://www.centercode.com/webinar/2016/july/
Presenter: Anne Hungate President, Daring Systems You’ve heard about Continuous Integration and Continuous Delivery but what’s common as code makes its way through those processes? Testing. With DevOps Testing (also known as Continuous Testing), testing tasks are engineered to be continuously completed end-to-end across the entire development to deployment pipeline. Developers, QA analysts, security professionals, IT Operations analysts…everyone is a tester in a DevOps environment. Join us to learn more about DevOps Testing and the emerging role of DevOps Test Engineer.
EuroSTAR Software Testing Conference 2013 presentation on "OMG What Have We Done" by Gerlof Hoekstra. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
The document discusses 10 signs that an organization's software testing may not be enough. These include having excessive production bugs, bugs found during user acceptance testing, growing bug counts over test cycles, not investing in testing compared to competitors, lacking clear criteria for what constitutes "enough" testing, testers advising against releasing software, weak prevention efforts like code reviews, lack of developer unit testing, frequently reduced testing periods causing deadline problems, and high tester turnover. The document advocates treating testing as risk management, increasing test reuse and automation, and addresses common challenges and questions around software testing.
Anton Muzhailo - Formal test process improvement: how to invest in test process improvement and win
This document provides an overview of a presentation on agile test planning. It discusses the challenges of agile requirements and how test strategies serve a purpose beyond a single sprint. It also examines how the agile manifesto relates to planning and the value of test plans in agile. The presentation outlines four testing phases in agile - requirements and design, story/feature verification, system verification, and acceptance. It provides examples of what should be included in a test plan for each phase such as scenarios, automation approach, dependencies, and acceptance criteria.