Common mistakes in data science projects include: 1) Not properly defining the business problem or focusing on optimizing the wrong process. 2) Not adequately preparing the data or understanding how it was generated. 3) Rushing the modeling process or implementation without proper testing. 4) Choosing complex methods or "AI" solutions when simpler approaches may work better. 5) Not involving experienced people or adequately educating the team. To avoid these mistakes, it is important to carefully analyze the business problem, data, modeling process, and make sure the right people are involved.
This document provides a summary of a presentation on Rapid Software Testing. The presentation was given by Michael Bolton of DevelopSense and covered the methodology and mindset of rapid software testing. It emphasizes testing software expertly under uncertainty and time pressure. The presentation defines rapid testing as testing more quickly and less expensively while still achieving excellent results. It compares rapid testing to other approaches like exhaustive, ponderous, and slapdash testing. The presentation also discusses principles of rapid testing, how to recognize problems quickly using heuristics, and testing rapidly to fulfill the mission of testing.
The document provides guidance on hiring developers based on Joel Spolsky's book "The Ultimate Developer Recruiting Guide". It summarizes the key steps in Spolsky's hiring process: conducting interviews that last at least an hour and involve both technical exercises and allowing candidates to interview the hiring manager; focusing on finding candidates that are both smart and able to get things done; and emphasizing the importance of treating candidates and employees well to attract and retain top talent.
Can we discover new possibilities and overcome our anchors by working with questions instead of answers?
In the startup world, speed to market is everything. This talk covers how to embed user insights into a rapid software development cycle by conducting usability studies that break the stereotype that "research takes too long." Justin Marx and Rebecca Destello illustrate how to plan, conduct, analyze, and inform development sprints in just one week with what became known as "Witness Wednesdays." Justin Marx, Product Designer, and Rebecca Destello, Manager, Research & Insights - both with Atlas Informatics.
Why is it that everyone knows the importance of frequent user testing, yet hardly anyone does it? Because user testing is often time-consuming, complex, and expensive. It probably doesn’t fit in your development process and thus feels like extra work. To feel reassured, you tell yourself you’ll test with users once you have something working, or at the very end of the process. This is strange, because everybody knows that changing your product late in the process increases costs exponentially. We created a way for user testing to save time, improve quality, and cost very little. Team-driven, pragmatic, and with no extra resources needed. The talk will show how, with only 2 hours every sprint, we focused on creating better products faster. We would love to share our learnings and simple DIY tools that let you start user testing with your current team tomorrow!
https://www.bigdataspain.org/2016/program/thu-managing-data-science.html https://www.youtube.com/watch?v=XolLvcdxP2c&t=48s&list=PL6O3g23-p8Tr5eqnIIPdBD_8eE5JBDBik&index=12
This document provides an overview and agenda for a training on problem solving and root cause analysis. It covers defining problems versus symptoms, using the Plan-Do-Check-Adjust problem solving model, developing problem statements, prioritizing problems, and practicing active listening skills to understand current conditions and gather information. The training aims to help participants reduce defects by addressing ongoing or critical issues.
The 7-Step Problem Solving Methodology outlines a standardized process for exploring problems, understanding root causes, and implementing effective solutions. The 7 steps include: 1) identifying the problem, 2) determining and ranking causes, 3) taking short-term action, 4) gathering data and designing tests, 5) conducting tests, analyzing data, and selecting a solution, 6) planning, implementing, and fail-safing the solution, and 7) measuring, evaluating, and recognizing the team. The methodology provides a disciplined approach for solving problems where the solution is not obvious.
The principal plays a key role in facilitating school improvement and professional learning for teachers. As an agent of change, the principal must intentionally address barriers to teacher learning, such as focusing too much on confirming existing ideas rather than challenging them. Some strategies for interrupting barriers include using protocols to structure discussion, making preconceptions explicit, and viewing mistakes as learning opportunities. The principal also ensures school goals are aligned to student needs based on data and provides resources to support teachers in achieving goals.
Money doesn’t grow on trees: developer teams are expensive and always need to deliver value. I’ll describe in a pragmatic way how we have adopted agile practices to deliver more value with the same team and to solve 3 pains: - estimation and deadlines - bug fixes and quality assurance - inefficient communication And without working overtime (or almost never).
When starting the development of a software product, one possible source of mistakes is underestimating the complexity that lies behind an idea, as well as the clutter created by the massive number of available technologies. This presentation explains a possible way to deal with these issues.
The document discusses problem solving skills and techniques. It describes the problem solving process as having five steps: 1) defining the problem, 2) finding possible solutions, 3) choosing the best solution, 4) implementing the solution, and 5) evaluating the solution. It also discusses common problem solving tools like brainstorming and the 5 Whys technique. Finally, it lists some reasons why people may fail to solve problems effectively, such as not being methodical or misinterpreting the problem.
Software development is not exactly the same as computer programming. When it comes to a career, developing for productization involves much more than simply coding. It is important to learn how to accomplish tasks, sharpen skills, develop a career, and enjoy it. And last but not least: how to start?
This document discusses the application of the Thiagi Four-Door model for rapid e-learning. It describes Sun Microsystems' use of the model to address problems with expensive, repetitive, and boring e-learning courses that lacked autonomy and had high attrition rates. The Four-Door model incorporates case studies, expert questions, tests, games, and a library to engage learners. Sun implemented a prototype in 2 months, piloted it in 2 more months, and fully deployed the first Four-Door course after 4 months with positive learner feedback and results. The document recommends obtaining business support, allowing design time, and paying attention to guidance for future Four-Door implementations.
The document provides an overview of problem solving skills and thinking differently, with the goal of helping unemployed professionals think in new ways to find jobs. It discusses critical vs creative thinking, systems thinking, statistical thinking, intuition, problem solving tools/methods, and lateral/intuitive thinking. Techniques for thinking differently include meditation, reconnecting with senses/intuition, analogies/metaphors, conversations/interviews, and learning something new. The document aims to get readers to open their minds to new ideas and think in ways outside their comfort zones.
This document provides an overview and agenda for Week 5 of the Data Scientist Enablement course. It includes discussions on data visualization and a quote by John Tukey. The learning plan recommends readings on data visualization and time series analysis. Activities involve practicing with visualization tools and datasets. The assignment is a comparative study of data visualization tools. Submissions are due by Saturday at 11:59pm via email. References and additional resources on data visualization are also provided.
This document discusses best practices for managing product releases and software engineering teams. It provides the following recommendations: 1) Establish clear processes for releases, including regular intervals, versioning, distribution, and metrics to measure success. Ensure everyone understands their role in the release cycle. 2) Use the Dreyfus model of skill acquisition to balance team skills and experience levels. Recruit for "smart and get things done" attitudes. Apply practices according to where the team stands. 3) Automate aspects like releases, reporting, and testing when possible, but also retain some manual processes to aid understanding of what to automate. Team learning takes time.
For the full video of this presentation, please visit: https://www.embedded-vision.com/platinum-members/embedded-vision-alliance/embedded-vision-training/videos/pages/may-2018-embedded-vision-summit-warden For more information about embedded vision, please visit: http://www.embedded-vision.com Pete Warden, Google research engineer and the tech lead of the TensorFlow Mobile and Embedded team, presents the "Solving Vision Tasks Using Deep Learning: An Introduction" tutorial at the May 2018 Embedded Vision Summit. This talk introduces deep learning for vision tasks. It provides an overview of deep learning, explores its weaknesses and strengths, and highlights best approaches to applying deep learning to solving vision problems. The audience will learn to think about vision problems from a different perspective, understand what questions to ask, and discover where to find the answers to these questions. The talk will conclude with insights on the challenges of deploying deep learning solutions on mobile devices.
The document discusses various topics related to developing a technology product, including hiring an engineering team, creating a product, technical development challenges, and setting up processes. It provides advice on tuning your setup by considering human resources, available technologies, tools, and processes. It discusses common pitfalls and emphasizes focusing on users and testing. Technical concepts discussed include infrastructure, programming languages, servers, APIs, storage, desktop development, and mobile development.
This document summarizes a presentation on data science consulting. It discusses: 1) The Agile Analytics group at ThoughtWorks which does data science consulting projects using probabilistic modeling, machine learning, and big data technologies. 2) Two case studies are described, including developing a machine learning model to improve matching of healthcare product data and using logistic regression for retail recommendation systems. 3) The origins and future of the field are discussed, noting that while not entirely new, data science has grown due to improvements in technology, programming languages, and libraries that have increased productivity and driven new career opportunities in the field.
This document summarizes a lecture about using the lean startup approach for product development. It discusses: - Using a minimum viable product (MVP) to test assumptions quickly without overbuilding. An example is Dropbox starting with a simple demo video. - The build-experiment-learn feedback loop, where you build an MVP, experiment to collect data, and learn how to improve. Key phases include identifying leap-of-faith assumptions to test, such as value and growth hypotheses. - The advantage of having more traction when raising funds after validating assumptions with an MVP, rather than prematurely seeking funding with only an idea and no customer feedback. Starting small allows funds to be used wisely.
This document discusses using Google's Attribute-Component-Capability (ACC) model approach to help balance test efforts. The key points are: 1) The ACC model involves listing a product's attributes, breaking it into technical components, and categorizing capabilities. This provides an overview of test needs across the entire product. 2) Complexity, frequency of use, and user impact are assigned scores to capabilities. This determines relative "testing needs". 3) The ACC items, scores, and needs are tracked in a tool like Excel linked to a tool like TFS. This provides instant visibility into where more testing is required based on risk.
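The scoring idea in the ACC approach above can be sketched in a few lines of code. This is a minimal illustration, not the talk's actual tooling: the capability names, score ranges, and the product-of-factors formula are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Capability:
    """One ACC capability with illustrative 1-5 risk scores."""
    name: str
    complexity: int    # 1 (simple) .. 5 (very complex)
    frequency: int     # 1 (rarely used) .. 5 (used constantly)
    user_impact: int   # 1 (cosmetic) .. 5 (blocks the user)

    @property
    def testing_need(self) -> int:
        # Combine the three risk factors; a higher score means
        # the capability deserves more testing attention.
        return self.complexity * self.frequency * self.user_impact

# Hypothetical capabilities for a product, scored by the team.
capabilities = [
    Capability("Login", complexity=2, frequency=5, user_impact=5),
    Capability("Report export", complexity=4, frequency=2, user_impact=3),
    Capability("Theme settings", complexity=1, frequency=1, user_impact=1),
]

# Rank capabilities by relative testing need, riskiest first.
for cap in sorted(capabilities, key=lambda c: c.testing_need, reverse=True):
    print(f"{cap.name}: {cap.testing_need}")
```

In practice the same numbers would live in a spreadsheet linked to the tracking tool, as the document describes; the point is only that a simple weighted score already gives instant visibility into where testing effort should go.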
Drawing on his various experiences leading engineering organizations (notably at Pixmania, Criteo, and Viadeo), Julien Simon discusses the risks that threaten fast-growing teams and platforms, offering along the way some pointers for anticipating and resolving them.
Presentation done at the "CTO Crunch" event by France Digitale, Paris, 24/02/2015. Based on his experience (VP Eng @ Digiplug, CTO @ Pixmania, VP Eng @ Criteo, CTO @ Aldebaran Robotics and now CTO @ Viadeo), Julien shares some hard-learned, bullshit-free lessons on what it means to be a CTO. Hiring, Tools, Methodology, Technology, Politics: welcome to Hell :)