Voice-cloning AI scams on the rise: Arizona AG & ASU professor issue warning

The Arizona Attorney General is warning that artificial intelligence is being used to clone voices, tricking people into thinking a loved one is on the phone and needs money.

AG Kris Mayes says her office is getting more and more calls about this scam, and she wants to warn the public about it now so fewer people get tricked into sending money.

"I think this technology has evolved more quickly than any technology in human history," Mayes remarked.

Artificial intelligence can now clone voices, and scammers are taking advantage, using other people's voices in phone scams.

"If you get a call from someone on the other line that sounds like your mother, you can't assume that's your mother. Unfortunately, it's somewhat awkward to ask your mother some kind of a password, 'Are you really my mom? Can you answer the following question?' but that's where we are right now," said Professor Subbarao Kambhampati with ASU's School of Computing and Augmented Intelligence.


Mayes says scammers are even taking it to the next level.

"What’s also happening is these scammers are, in some cases, using spoofing equipment that can spoof your phone number, so the combination of them being able to make it look like it’s actually your phone number in tandem with cloning your voice is making these scams and frauds even more dangerous," she said.

Kambhampati says artificial intelligence needs to be regulated, pointing to voice-cloning scams as one reason why.


"They expect that you use them for good purposes. For example, you might want to say a story to your kid in their grandmother's voice … that's the kind of things they expect you to use it for," he said.

If you get a call and you're unsure whether the caller is who they claim to be, Mayes says to call that person's actual phone number to verify, and to report scams to local law enforcement and the Attorney General's Office.

Just 3 seconds is all it takes

All a scammer needs is three seconds of someone talking. That clip is uploaded to a website, which then generates a cloned voice.

From there, you can type whatever you want it to say.

"The surprising part is it doesn't take that much data, that much voice sample to actually train it to speak like you," Kambhampati said. "The current state of the art is even with a three-second clip of your voice, the system can imitate you … and say any text."

A voice-cloning website will typically ask for 10 to 20 voice samples spoken with different emotions.

"The better the voice sample, the better compelling imitation is, but you can do a pretty passable imitation with just 3 seconds of the voice," Kambhampati said.

After those voice samples are submitted, a voice is generated.

That voice can then say anything a person types, with the click of a button.
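To illustrate how accessible this workflow has become, here is a minimal sketch using the open-source Coqui TTS library and its XTTS v2 model, one of several publicly documented voice-cloning tools. The article does not name the services scammers use, and the file names below are placeholders; the example shows the benign storytelling use Kambhampati describes, not any particular scammer's setup.

```python
# A minimal sketch of the clone-then-type workflow described above,
# using the open-source Coqui TTS library (pip install TTS).
# The model choice and file names are illustrative assumptions.
from TTS.api import TTS

# Load a multilingual voice-cloning model. XTTS v2 can imitate a voice
# from a reference clip only a few seconds long.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "grandmother_sample.wav" is a placeholder for a short recording of the
# voice to imitate; longer, cleaner samples yield a more convincing clone.
tts.tts_to_file(
    text="Once upon a time, there was a little dragon who loved to read.",
    speaker_wav="grandmother_sample.wav",  # reference voice sample
    language="en",
    file_path="bedtime_story.wav",  # generated audio in the cloned voice
)
```

The same few lines, pointed at a stranger's voicemail greeting or a social media video, are what make the scam so easy to run, which is exactly the dual-use problem Kambhampati raises.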

"Many people grew up thinking they can trust their eyes, they can trust their ears. That's no longer true," Kambhampati said.
