AI Voice Cloning Helps Scammers—Here's How to Protect Yourself

Pretending to be someone else is easier than you think

  • AI voice cloning fraud is a growing problem.
  • Readily available software lets scammers mimic people’s voices.
  • You can protect yourself by verifying the identity of callers.
Closeup of someone holding a smartphone with an unknown caller displayed on the screen. Tero Vesalainen / Getty Images

Be wary of phone calls from people claiming to be loved ones in trouble. 

Federal officials say fraudsters are using voice cloning software to pull off sophisticated family emergency and corporate fraud scams. It's part of a growing problem of criminals using artificial intelligence (AI).

"With the latest advancements in speech synthesis, systems like VALL-E are able to create a high-quality voice clone with only a 3-second enrolled recording of an unseen speaker," Vijay Balasubramaniyan, the CEO of the cybersecurity firm Pindrop, told Lifewire in an email interview.

How AI Voice Cloning Scams Work

AI voice cloning scams often start with a call from someone who sounds like a family member. The scammer says that your loved one is in trouble and needs money for an emergency involving a car accident or a hospital bill. In fact, the person on the phone is a stranger using voice cloning software. 

"Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We're living with it, here and now," writes the FTC on its website. "A scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member's voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he'll sound just like your loved one."

Voice cloning allows users to recreate an individual's unique vocal patterns, but the technology is a double-edged sword, Tamás Kádár, the CEO of the fraud prevention company SEON, said in an email. The danger arises when it falls into the wrong hands.

"Unscrupulous individuals can use voice cloning to deceive, manipulate, and exploit others, with the potential to cause severe emotional and financial harm," Kádár added. "This is especially concerning as voice cloning software becomes more accessible and affordable, opening up a Pandora's box of nefarious possibilities."

Kádár described how the family emergency fraud typically plays out: the scammer impersonates a family member or close friend, usually claiming to need financial assistance after an accident or other urgent situation. "The victim, thinking they're helping a loved one, sends money, only to discover later it was a scam," he added.

Another scam is CEO fraud, in which a fraudster clones the voice of a company executive and contacts employees, requesting that they wire funds or disclose sensitive information. The employee, thinking they're following orders from a superior, unwittingly enables the scam.

Grandparents are increasingly vulnerable to voice cloning scams, Steven J.J. Weisman, an expert in identity theft, said in an email interview. The scammer can obtain a recording of a grandchild's voice from social media and replicate it using AI technology. 

Two adults, one taking an emergency phone call, the other holding a credit card and using a laptop. RainStar / Getty Images

He pointed to the case of Ruth Card of Canada, who was swindled out of CAD 3,000 by a scammer who used AI voice cloning to sound just like her grandson on the phone. The caller said her grandson was in jail and needed the money immediately for bail.

"All it takes is AI voice-generating software and as little as 30 seconds worth of the grandchild's audio," Weisman said. 

Spotting a Voice Cloning Scam

You can protect yourself from AI voice cloning scams by being wary of unsolicited calls requesting personal or financial information. According to the FTC, scammers often ask you to pay or send money in ways that make it hard to get your money back. If the caller says to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of fraud.

Balasubramaniyan said emotional manipulation and high-pressure tactics are another red flag. "If a user feels compelled to help in these situations, hang up and independently call back the contact using a known phone number," he added.

In the future, there may be ways to detect AI scams. "Companies are creating technology to detect voice cloning," Balasubramaniyan said. "These detection technologies use knowledge of human voice production and evolution to determine when a voice is a clone by recreating the physical anatomy of the speech in question and determining anomalies in that speech."
