Watch how easy it is for scammers to manipulate your voice using AI
DALLAS - It’s 2 a.m. and you’re fast asleep. Then you’re jarred awake by your ringing phone, and you answer.
"Help me. Mom, please help. I just got robbed," she says.
It’s your daughter and she’s in trouble. Or is she?
Greg Bohl is the chief data officer for TNS Communications, the company that tags potentially problematic calls to your cellphone.
When you get a call and the caller ID says, "Spam Likely," that’s Bohl’s company.
"We’re in the business of protecting the consumer," he said.
TNS Communications processes around 1.2 billion calls each day to try to keep bad actors from reaching you.
But those actors are making advancements that put your voice in the leading role. And it’s all thanks to artificial intelligence.
"Steve, you’re a really popular guy on the internet. They can take your voice. It’s everywhere. Take a clip then can record it and they can manipulate it. And they can type in any sentence they want it to say and have your voice do anything they want it to do," Bohl said.
Here’s the real danger. We’re not talking about a mashup of words like in a popular YouTube video of President Barack Obama that makes it sound like he’s reciting song lyrics.
This is the extraction of your tone and cadence to artificially generate your voice to speak sentences you never uttered.
So how convincing is it?
It’s so convincing that FOX 4 consumer reporter Steve Noviello used the software to tell this story on TV.
"This isn’t audio I recorded. It’s my voice artificially generated. And if you’re thinking, "Well I’m not a newscaster. My voice isn’t on the internet.’ Think again. Capturing a recording of your voice is far easier than you realize," he said.
"Everybody likes the custom voicemail that says, ‘Hey it’s me.’ Guess what? That’s longer than three seconds. I can take that clip from your voicemail and duplicate it and put you in a very bad situation," Bohl said.
Bohl used his own daughter as an example.
"These chips are so good," she says in a TikTok video.
He used that same voice sample manipulated by AI to say, "Mom, help me, please. I’ve been in an accident."
He took a random Save Me Steve video off the internet to make Steve say, "Mom, help me. I’ve been in a bad accident. Can you give this guy some money so I can go to the hospital?"
"If my mom got that phone call, I guarantee you she’d have her checkbook ready," Steve said.
Bohl said the most susceptible are the elderly and parents because scammers prey on people in highly charged, confusing, or high-stress situations.
"They attack our emotions," he said.
And it’s working. According to the Federal Trade Commission, last year consumers got duped out of more than $11 million by fake voice scams.
How can you stop it?
"My recommendation to consumers is you find a safe word for your family," Bohl said.
It’s a simple solution, even for a sound sleeper jarred awake at 2 a.m.
That safe word is an easy way to identify if it really is your family member on the other end of the phone.
These AI manipulations are processed in real time, meaning scammers can carry on an actual conversation with a fake voice and even answer questions. A pre-arranged safe word is something an impostor can’t fake.