AI Ethics - Bias in Data
Hello!
I am Lucy Craddock. I am a data science degree apprentice.
You can find me at @fudgycraddock
2
Driverless Cars
Can AI make life-or-death decisions?
3
[Figure: life-or-death trade-off scenarios: 50% x 3 lives, 88% x 2 lives, 99% x 4 lives]
7
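To see why the arithmetic alone cannot settle the question, here is a minimal expected-value sketch in Python; the pairing of each probability with a number of lives is read off the figure above, and the option names are hypothetical:

# Expected lives saved for each scenario in the figure.
# Assumption: each pair is (probability of a good outcome, lives at stake).
scenarios = {
    "option A": (0.50, 3),  # 50% chance of saving 3 lives
    "option B": (0.88, 2),  # 88% chance of saving 2 lives
    "option C": (0.99, 4),  # 99% chance of saving 4 lives
}

for name, (p, lives) in scenarios.items():
    print(f"{name}: expected lives saved = {p * lives:.2f}")

# Prints 1.50, 1.76 and 3.96: a pure expected-value rule picks option C,
# but this arithmetic hides the ethical question of who bears the risk.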
Female Dummies in Car Crash Tests
USA
● Female dummy introduced in 2011
● 4 feet 10 inches tall, 108 pounds
● Can double as a 12-year-old child
EU
● Female dummy used in 1 test
● … in the passenger seat
● Smaller version of a male dummy
8
http://www.diva-portal.org/smash/record.jsf?pid=diva2%3A1203713&dswid=-7449
https://www.washingtonpost.com/local/trafficandcommuting/female-dummy-makes-her-mark-on-male-dominated-crash-tests/2012/03/07/gIQANBLjaS_story.html
https://www.humaneticsatd.com/crash-test-dummies/frontal-impact/hiii-5f
“A female vehicle occupant’s odds of being injured in a frontal crash are 73% greater than the odds for a male occupant.”
- University of Virginia, 2019
9
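Note that “73% greater” refers to odds, not probabilities: odds are p / (1 - p). A short sketch of the conversion, using a hypothetical male injury probability purely for illustration:

# "73% greater odds" means odds_female = 1.73 * odds_male.
p_male = 0.10                      # hypothetical injury probability for a male occupant
odds_male = p_male / (1 - p_male)  # 0.111
odds_female = 1.73 * odds_male     # 73% greater odds
p_female = odds_female / (1 + odds_female)
print(f"female injury probability: {p_female:.3f}")  # ~0.161, not 0.173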
Heart Attacks
▪ The British Heart Foundation (BHF) found that women are 50% more likely than men to receive an incorrect diagnosis
▪ It estimates that over 10 years, 8,200 women have died needlessly after a heart attack
10
Office Temperature
▪ The standard office temperature formula was developed in the 1960s
▪ It overestimates female metabolic rates by as much as 35%
11
https://www.nature.com/articles/nclimate2741
Voice Recognition Software
▪ Google’s voice recognition software is 70% more accurate for male voices than for female voices
12
https://makingnoiseandhearingthings.com/2016/07/12/googles-speech-recognition-has-a-gender-bias/
Face Recognition Software
▪ Microsoft’s and IBM’s systems can identify white male faces with 99% accuracy
▪ Both had error rates of up to 35% for dark-skinned women
13
https://www.theregister.co.uk/2018/02/13/facial_recognition_software_is_better_at_white_men_than_black_women/
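Gaps like these only become visible when accuracy is disaggregated by subgroup rather than averaged over the whole test set. A minimal sketch of per-group evaluation; the records below are invented for illustration, not the Gender Shades data:

from collections import defaultdict

# Each record: (subgroup, whether the classifier was correct). Toy data only.
results = [
    ("lighter-skinned male", True), ("lighter-skinned male", True),
    ("darker-skinned female", True), ("darker-skinned female", False),
    ("darker-skinned female", False),
]

totals = defaultdict(lambda: [0, 0])  # subgroup -> [correct, total]
for group, correct in results:
    totals[group][0] += int(correct)
    totals[group][1] += 1

for group, (correct, total) in totals.items():
    print(f"{group}: error rate = {1 - correct / total:.0%}")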
Racial Bias
▪ COMPAS, designed by Northpointe, is a tool that scores offenders by their likelihood of reoffending; the scores are used to inform prison sentences and bail decisions
▪ It has been criticised as racially biased
16
https://medium.com/thoughts-and-reflections/racial-bias-and-gender-bias-examples-in-ai-systems-7211e4c166a1
https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/
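The criticism, detailed in the Angwin et al. figure on the next slide, centred on unequal error rates: among defendants who did not go on to reoffend, black defendants were far more likely to have been scored high-risk. A minimal sketch of that false-positive-rate comparison, on toy records rather than the real COMPAS data:

# Each record: (group, scored_high_risk, actually_reoffended). Toy data only.
records = [
    ("black", True, False), ("black", False, False), ("black", True, True),
    ("white", False, False), ("white", False, False), ("white", True, True),
]

def false_positive_rate(group):
    # FPR: share of non-reoffenders who were nevertheless flagged high-risk.
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(group):.0%}")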
17
[Figure: ‘COMPAS Software Results’, Julia Angwin et al. (2016)]
18
https://www.documentcloud.org/documents/2702103-Sample-Risk-Assessment-COMPAS-CORE.html
Human Classification Bias
▪ YouTube was sued by its creators over its demonetisation algorithm for discriminating against LGBTQ content
▪ Demonetisation is ‘confirmed’ by human classifiers
□ Allegedly outsourced to countries where homosexuality is illegal
19
https://www.theverge.com/2019/8/14/20805283/lgbtq-youtuber-lawsuit-discrimination-alleged-video-recommendations-demonetization
https://www.htxt.co.za/2019/09/30/youtubers-discover-why-lgtbqi-content-is-being-demonetised-on-youtube/
Conclusions
20
Thanks!
Any questions?
You can find me at
▪ @fudgycraddock
21
Credits
Special thanks to all the people who made and
released these awesome resources for free:
▪ Presentation template by SlidesCarnival
22
Further Reading
▪ For more cases on the gender data gap, please
see Invisible Women by Caroline Criado Perez
23