Deepfake voicemail scam exploding

A new and scary twist on an old phone scam has exploded.

Scammers are using artificial intelligence voices that mimic loved ones. It's an attempt to play on the heartstrings of unsuspecting victims who may believe a friend or family member needs help.

"So, you get a call saying, I'm in jail or stuck somewhere, and it really does sound like your loved one, but it could very well be a scammer," said tech analyst Larry Magid, founder of ConnectSafely.org.

University of Michigan computer engineering professor Hafiz Malik is working on ways to hold the monster of voice cloning at bay.

"Technology is moving at a lightning pace and this monster is just getting better. They are getting more and more realistic every passing day," said Malik.

The mother lode of pre-recorded voice and video is found on social media. Voices are also often captured at meetings and rallies.

"Based on my understanding, you need just a few seconds of someone's voice," said Malik.

Malik and other experts are building systems to detect deepfakes. But it's a high-tech game of cat and mouse.

"We are very close to the point where you could have a live telephone call with a bot and not know you're talking to a bot," said Mr. Magid.

Until there is a foolproof tech remedy to automatically detect this cloning, there is the potential for the deepest evils imaginable.

"There's a lot you can do that is harmful with this technology. You could get somebody very upset, an example being recording the voice of a loved one who has passed on," said Magid.

And if the victim proves susceptible to the manipulation, scammers can keep the conversation going to extract even more money.