
Federal Trade Commission warns AI voice cloning is enhancing 'grandparent scam'

'Now, you actually have a real person's voice, which most people don't realize can be faked,' Alan Crowetz says

WEST PALM BEACH, Fla. — Cybersecurity experts and the Federal Trade Commission are warning consumers about the so-called "grandparent scam," which artificial intelligence voice cloning is making even easier for criminals to pull off.

"She swore it sounded like him," Eric Lieberman told Contact 5 in January.

Lieberman was talking about his 86-year-old mother in Greenacres, who received a call she believed was from her grandson, claiming he was in jail and needed bond money.

"She was able to pull out $16,000 [from] one of her accounts," Lieberman said. "They called an Uber, which was supposedly picking up the cash for the bail bondsmen."

Now, three months later, the criminals who are carrying out what authorities call the "grandparent scam" have a new tool to trick consumers.

Eric Lieberman spoke to WPTV about how his mother was duped by a person involved in a "grandparent scam."

"It just takes it to the next level," Alan Crowetz, the president and CEO of InfoStream, told Contact 5. "Now, you actually have a real person's voice, which most people don't realize can be faked."

It's called AI voice cloning, and unfortunately, Crowetz said, it's easy to do once a criminal gets their hands on a recording of your voice.

"I believe that most people will have their voice on the internet," Crowetz said. "People think that's not possible, but if I simply look at most of your viewers and went to their Facebook page, I can find a video of them speaking. I can check out Instagram or Snapchat."

While you may be wondering if you should delete voice recordings from your social media pages, Crowetz said if it's already out there, it's out there.

Alan Crowetz gives advice on how the public can avoid falling for fake voices created by artificial intelligence.

"The No. 1 thing to do is what you're doing, getting the word out there, educating people," he said. "You've got to be paranoid."

Various AI cloning websites are quick and easy to use. They often only cost a few dollars and some sites even have a free trial.

"It's gonna become the routine," Crowetz said. "The first couple times we had the phishing email, people were really surprised, now every day people get phishing emails."

The Federal Trade Commission put out a warning about AI voice cloning earlier this week, reminding consumers to be wary of any urgent phone call demanding money, even if they think they know the voice on the other end of the line.

It's best to hang up and call the person back at a number you know to be theirs.

"A lot of smart people call us, and they've been tricked," Crowetz said. "This happens all the time."
