How New AI Technology Helps People With Disabilities but Can Introduce Discrimination

Giving job seekers a chance

  • AI is being used by a nonprofit in Thailand to help find work for people living with disabilities. 
  • A growing number of AI technologies are aimed at helping those with disabilities.
  • But some experts say using AI can lead to more discrimination.
a woman in a wheelchair working in an office and checking her phone

Francesco Carta fotografo / Getty Images

Artificial intelligence (AI) holds both promise and peril for people with disabilities as the world marks the International Day of Persons with Disabilities.

The Vulcan Coalition in Thailand hopes to use AI to help find employment for people with disabilities. The organization plans to hire workers capable of labeling the large amounts of data produced in the Thai language. It's one of many efforts to use AI to improve the lives of people with disabilities. 

"AI can be used to improve communication for people with speech impairments by helping them to interpret facial expressions and other body languages, or by providing translations for communication between cross-linguistic and cross-cultural speakers," Amey Dharwadker, a machine learning engineer at Facebook, told Lifewire in an email interview. "AI can also be used to create more accessible environments for people with disabilities."

AI to the Rescue

The Vulcan Coalition founders want to train people with disabilities to label data, which involves identifying raw data, like audio files or videos, and adding informative labels for context. The tags allow a machine learning AI model to learn from the data to power apps like chatbots and voice recognition. 
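To illustrate the idea, here is a minimal sketch of what data labeling produces: each piece of raw data is paired with a human-assigned tag that a machine learning model can later train on. The Thai snippets and label names below are hypothetical examples, not actual Vulcan Coalition data.

```python
# Raw, unlabeled samples — here, short Thai-language text snippets.
# (The snippets and labels are illustrative only.)
raw_samples = [
    "สวัสดีครับ ต้องการความช่วยเหลือ",  # "Hello, I need help"
    "ขอบคุณมากครับ",                    # "Thank you very much"
]

def label_sample(text: str, label: str) -> dict:
    """Attach a human-chosen label to a raw data sample,
    producing one training example."""
    return {"text": text, "label": label}

# A labeler reads each sample and assigns a category tag.
labeled_dataset = [
    label_sample(raw_samples[0], "request_for_help"),
    label_sample(raw_samples[1], "gratitude"),
]

for record in labeled_dataset:
    print(record["label"])
```

A dataset of thousands of such labeled pairs is what allows a model, such as a chatbot's intent classifier, to learn which category a new, unseen message belongs to.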

"When we first told people that we would like to employ them as data labelers, they were a little bit scared because it was out of their comfort zone," Vulcan co-founder Methawee Thatsanasateankit said in a post on Microsoft's blog. "People have told them that they couldn't do many types of work. We had to convince them to take the course. But now it's something they can be proud of because they can tell other people they do this high-value job."

Moran Leshem Bar, the chief growth officer of Dataloop, a platform for AI data management, said in an email interview that AI offers an increased ability to process unstructured data. "Unstructured data" is simply a term for the raw information humans process naturally, such as images, audio, and language.

"This, in turn, means that AI models can help disabled people to translate the world," she said.
"One example would be glasses that describe the world, allowing blind people to 'hear' the world around them," she added. "In the more advanced cases, AI combined with robotics can even proceed to replace limbs and allow mobility on top of the sensing."

AI can also be used to identify facial expressions and emotions, which may be helpful for people with autism spectrum disorder, Dharwadker said. A growing number of AI-powered chatbots can act as virtual assistants for people who have difficulty speaking or need help completing everyday tasks.

The Potential for Harm

While AI is helping those with disabilities, it could also be a source of discrimination. Dharwadker said that if a company used AI to screen job applicants, the system could be biased against people with disabilities, assuming they aren't as capable as other candidates.

AI could also harm job candidates with disabilities by making it difficult for them to access services and accommodations. 

"For example, if an AI system were used to evaluate whether someone was eligible for disability benefits, it might be more difficult for people with disabilities to get the benefits they need," Dharwadker said.


The US Department of Justice has said that employers need to guard against AI-driven hiring discrimination, which could violate the Americans with Disabilities Act (ADA), a federal law that seeks to remove barriers for people with disabilities in areas including employment.

"Some hiring technologies require an applicant to take a test that includes an algorithm, such as an online interactive game or personality assessment," the DOJ writes on its website. "Under the ADA, employers must ensure that any such tests or games measure only the relevant skills and abilities of an applicant, rather than reflecting the applicant's impaired sensory, manual, or speaking skills that the tests do not seek to measure."

But Michael Bond, the founder of Spoken, a company that uses big data, machine learning, and user-centered design to help people with language disorders, predicted in an email interview that the future is bright for AI-driven assistive technologies. 

"We've quickly gone from depending on physical improvements in technology to data-driven AI systems," he added. "Hearing aids are no longer judged by simply how well they make the surrounding sound louder, but rather how smart they are in separating the important sounds from background noise."

Correction 12/08/2022: Corrected the company link in the next to last paragraph.
