The document presents the flawed approach of "racing performance" without preparation. It argues that jumping straight into performance testing without checking components (tires, fuel) or verifying that the system is functioning properly leads to "fatal consequences" such as crashes. Reasons this approach persists include a weak performance culture, confusion between performance and load testing, and treating every problem as one requiring load automation. The document promotes better approaches: verifying components as they are produced, defining acceptance standards up front, stress testing parts individually and together before integration, using meters to monitor system functions, running lab tests, and test driving the system before attempting high-stakes races.
1. PERFORMANCE IS NOT A MYTH
PERFORMANCE ADVISORY COUNCIL
SANTORINI, GREECE
FEBRUARY 26 - 27, 2020
DON'T RACE PERFORMANCE
Leandro Melendez – Señor Performo
2. Who am I?
• I am Leandro Melendez
• Performance tester, scripter, engineer and idealist. Perf Manager @QualiTest
• Lots of experience on many performance projects over 10 years.
• Found lots of vices, weird ways of working and plain ignorance.
• Decided to get a secret identity and wear the spandex moustache to fight bad practices, process inertia and plain ignorance as Señor Performo.
• All this through a blog (www.srperf.com), social networks (@srperf), and hosting the Spanish version of the performance testing podcast, PerfBytes en Español.
• Last, but not least, through public speaking, spreading the word of performance, which we will do here today.
3. Why this?
Motive
4. There are problems in this world
• My journey is to help fix problems
5. There are problems in this world
• My journey is to help fix problems
• Many simple topics are still confused
6. There are problems in this world
• My journey is to help fix problems
• Many simple topics are still confused
• I will not talk complex tech stuff
7. There are problems in this world
• The mission is to defeat common evils
WRONG
8. There are problems in this world
• The mission is to defeat common evils
• Today we will try to defeat one
WRONG
9. There are problems in this world
• The mission is to defeat common evils
• Today we will try to defeat one
• Orating and public speaking
10. What do you mean, racing?
11. BUT FIRST
A question
12. This will require your cooperation…
13. Have you been asked to…?
14. Don't be shy
15. Have you been asked to…?
16.
17. THAT'S WHAT I MEANT BY RACING
18. Just racing without preparation
FATAL CONSEQUENCES!
19. FLAWED APPROACH
Why does just racing end in crashes?
20. Just jumping in…
• Many try it
• But it is like a stunt
21. Just jumping in…
• Many try it
• But it is like a stunt
22. Just jumping in…
• It is extremely difficult
• Requires lots of prep
• Generally bad outcomes
23. ISSUES on the approach
• Often they do not check tires
• Air pressure
• May even be flat
24. ISSUES on the approach
• They do it without fuel
• They did not even check before!
25. ISSUES on the approach
• They do it without fuel
• They did not even check before!
• MAY NOT EVEN HAVE A FUEL METER!
26. No idea of nothin'
• Often they do not know what is going on with the rest of the car either
27. Thinking it is safe
• Still, without knowing how the car is doing, let’s try to race it
28. Thinking it is safe
• Still, without knowing how the car is doing, let’s try to race it
29. Thinking it is the way
• By racing the car we will find out if it has gas, air in tires, oil, etc.
30. Why is this STILL happening?
31. There are some reasons
• Little or no performance culture
32. There are some reasons
• Many still think it has to happen at the end (Functional → UAT → Perf)
33. There are some reasons
• They think it is the only way to get response times
34. There are some reasons
• Confusion between performance and load
PeRf LoAd
35. There are some reasons
• All this time Performance Testing was only…
36. There are some reasons
• All this time Performance Testing was only…
• LOAD TESTING!!!! 😱
37. There are some reasons
• Man with a hammer fallacy
38. There are some reasons
"To a man with a hammer,
everything looks like a nail."
― attributed to Mark Twain
39. There are some reasons
• To a man with a load automation tool
everything must be load automated
40. BETTER WAYS
41. Don't just check it in
• The produced parts were just placed on the cars as they came out
42. Don't just check it in
• Verify each piece as soon as it is produced, and measure its performance against any relevant metrics
43. Don't just check it in
• You must define standards of what is acceptable before you build
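In software terms, defining standards before you build means agreeing on performance acceptance criteria up front and checking every measurement against them. A minimal sketch in Python; the threshold values and names here are illustrative, not from the talk:

```python
import statistics

# Hypothetical acceptance standards, agreed BEFORE anything is built.
STANDARDS = {
    "max_p95_ms": 200.0,   # 95th-percentile response time
    "max_mean_ms": 100.0,  # mean response time
}

def meets_standards(samples_ms, standards=STANDARDS):
    """Return True only if the measured samples satisfy every standard."""
    ordered = sorted(samples_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    mean = statistics.mean(ordered)
    return p95 <= standards["max_p95_ms"] and mean <= standards["max_mean_ms"]

print(meets_standards([80, 90, 95, 110, 120]))  # within the standard
print(meets_standards([80, 90, 95, 110, 500]))  # one slow outlier breaks it
```

With the standard written down first, "acceptable" stops being a debate after the fact.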
44. Stress per piece
• Even before mounting pieces, put them to small tests.
• Stress, use, specs, etc.
• Separated!
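Stressing each piece separately is essentially micro-benchmarking: hammer one component in isolation before it is mounted onto anything else. A sketch under that idea, where `parse_order` is a made-up stand-in for whatever piece you just produced:

```python
import time

def parse_order(line):
    """The 'piece' under test: a tiny parser, exercised in isolation."""
    item, qty = line.split(",")
    return item.strip(), int(qty)

def stress_piece(fn, arg, iterations=10_000):
    """Call one piece many times by itself; return mean microseconds per call."""
    start = time.perf_counter()
    for _ in range(iterations):
        fn(arg)
    elapsed = time.perf_counter() - start
    return elapsed / iterations * 1e6

print(f"parse_order: {stress_piece(parse_order, 'widget, 3'):.2f} µs per call")
```

If the piece is slow on its own, no amount of later load testing will make it fast.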
45. Put together
• As pieces are ready put them together and test them together.
• Not too much. Just shake ‘em.
• Do they fit together?
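Putting the pieces together and "just shaking them" is a light integration check: wire the components and run one pass end to end, verifying only that each output fits the next piece's input. A sketch with invented pieces:

```python
def fetch_order():
    """Piece 1: produces a raw order line (a stub here)."""
    return "widget, 3"

def parse_order(raw):
    """Piece 2: consumes piece 1's output."""
    item, qty = raw.split(",")
    return item.strip(), int(qty)

def price_order(item, qty, unit_price=5):
    """Piece 3: consumes piece 2's output."""
    return qty * unit_price

# Not a full test, just a shake: do the pieces fit together?
item, qty = parse_order(fetch_order())
total = price_order(item, qty)
print("pieces fit:", item, qty, total)
```

This catches mismatched interfaces long before any race-day load test would.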
46. Meters!
Speed meter
Distance meter
Fuel tank meter
Battery meter
Temperature meter
RPMs meter
Direction meter
Tire pressure meter
More stuff meter
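The car's meters correspond to instrumenting the system with always-available gauges and counters. A tiny registry sketch; a real project would use a metrics library (Prometheus clients, StatsD, etc.), which is an assumption beyond the slides:

```python
from collections import defaultdict

class Meters:
    """A tiny metrics registry: every 'meter' is readable at a glance."""
    def __init__(self):
        self._gauges = {}
        self._counters = defaultdict(int)

    def gauge(self, name, value):
        self._gauges[name] = value  # last observed value, like a dial

    def count(self, name, n=1):
        self._counters[name] += n   # ever-growing, like an odometer

    def dashboard(self):
        return {**self._gauges, **self._counters}

meters = Meters()
meters.gauge("response_time_ms", 42.0)  # the speed meter
meters.gauge("cpu_percent", 63)         # the temperature meter
meters.count("requests")                # the distance meter
print(meters.dashboard())
```

The point is the same as in the car: you should never have to race the system just to find out whether it has fuel.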
47. Lab test
Before a race, check the car in smaller, controlled environments.
48. Just drive it
You have meters.
You can see how it does.
Drive it yourself, or have the testers or
anyone else drive it before races.
49. Fix any issue found
As soon as issues with your car are detected… fix them!
50. You are ready!!!!
After you have applied all the previous steps, you can race!
51. Questions?
52. THANK YOU!