AI Could Monitor Your Child’s Emotional State in School

Raising privacy concerns

  • A range of new apps may be able to detect students’ emotions.
  • Intel and classroom software maker Class are reportedly developing software that will monitor students’ faces and body language during online classes.
  • Some experts are concerned that using AI to monitor students could lead to invasions of privacy.
Two students working on computers in a home setting.

Justin Paget / Getty Images

Schools could soon use artificial intelligence (AI) to monitor students’ emotional states. 

Intel and classroom software maker Class are developing software that will monitor students’ faces and body language during online lessons. The system can reportedly detect whether students are bored, distracted, or confused, but some experts are concerned that the software could lead to invasions of privacy. 

"Such AI-based monitoring products should be transparent about how this data is being used," Michael Huth, co-founder of the company Xayn and head of the computing department at Imperial College London, told Lifewire in an email interview. "If it is used to track engagement in a student cohort, where engagement at the individual level is not recorded, this is somewhat less problematic from a privacy perspective. If any of this data were to be used to influence student assessment, however, I can see all sorts of problems arising."

Watching Over Students

Class and Intel have partnered to integrate an AI-based technology that runs alongside the videoconferencing software Zoom, Protocol reported. The software can be used to monitor students, the report said.

"Through our partnership with Intel, we’ll be able to bring new, immersive features to Class' software that are driven by research-backed functionalities," according to the company’s website. "We’ll also work alongside Intel to leverage resources and provide educators and students with valuable insights, joint case studies, white papers, webinars, and much more."

Intel told Protocol that the classroom software is in the early stages and that there are no plans to bring the product to market. 

The partnership with Class isn’t the only potential collaboration involving Intel’s emotion-reading software. An online brochure from Intel touts a new electronic whiteboard platform, myViewboard, aimed at schools. Intel software "makes it possible for myViewboard to assess and display the emotional states of learners as immediate feedback for educators," according to the brochure. 

Class is one of many software products that purport to track students’ emotions using AI. There’s also 4Little Trees, an emotion-detecting learning platform described on the company’s website as "a virtual teaching assistant [that] would help by choosing the most suitable questions for you or challenge you when needed."

Ashish Fernando, the CEO of AI company iSchoolConnect, told Lifewire in an email interview that AI can be trained to assess human behavior and features without the subjective biases that individual observers hold. 

"AI can assess a significantly larger number of human behaviors, including easy-to-miss micro-expressions, at a rate much faster than a human," Fernando said. "Where feedback could have taken hours, AI can now provide the same within a few minutes."

The view from behind a student as they do school work on a computer at the kitchen table.

Thomas Park / Unsplash

Privacy Concerns

The use of AI to monitor humans is growing. Snapchat has AI tech that assesses the emotional level of a group of people at a live event. Mad Street Den uses computer vision AI to help retailers recognize potential buyers’ facial expressions and emotions. BrainQ leverages AI to understand how passengers interact with the autonomous vehicles they are using. And companies are using AI to track driver fatigue.

"This is a great use-case for road safety, but it brings up questions concerning the privacy rights of drivers in their personal vehicles," Huth said. "Will fatigue information be shared with the insurance company or the car manufacturer? Or will it just warn the driver to pull over and rest? This is less of an issue for commercial car fleets such as trucks, delivery vans, or cabs.”

One issue is that the data AI relies on, such as muscle movements used to infer emotional state, might not be reliable, Huth said. "This creates a dilemma, as more reliable AI may require accessing even more sensitive data such as EEG signals," he added. 

For students, AI can be useful for monitoring compliance during online exams when a proctor cannot be physically present. But using AI for this is contentious, Huth said, because the system tries to determine whether a student intends to cheat or actually cheats, and there’s real potential for false positives that unjustly penalize students who have done nothing wrong. 

"Any AI used for this purpose would need to be robust enough to deal with toilet breaks, nervous tics of individual students (such as erratic eye movements), and so on," Huth said. 
