AI Is Now Making Disturbing Prank Calls: Don’t Fall Victim To An AI Voice Scam

A new AI voice scam clones people's voices and calls their loved ones asking for money

By Charlene Badasie | Published


AI models designed to simulate the human voice have made it easy for criminals to scam people out of money. As the software grows more sophisticated, it needs only a few audio clips to convincingly reproduce the sound and tone of a real person. For those who are targeted, it’s becoming increasingly difficult to tell whether a voice is faked, even when the described dilemma seems implausible.

The surge in AI voice scams highlights the dark side of generative artificial intelligence. According to The Washington Post, advances in math and computer science have improved the training mechanisms behind software that analyzes what makes a voice unique, including gender, age, and accent. These programs then search a vast database of audio recordings for similar patterns.

So if a recording of your voice exists anywhere in the digital space (YouTube, social media, podcasts, or commercials), it can be cloned for use in an AI voice scam. ElevenLabs, a voice-AI startup founded in 2022, made headlines after its free software was used to replicate celebrity voices saying things they never said. The company responded by adding safeguards to stem misuse.

But restricting free users and launching a tool to detect AI-generated clips isn’t enough to protect vulnerable people from losing money to an elaborate AI voice scam. Speaking to The Washington Post, Canadian resident Benjamin Perkin detailed how his elderly parents lost thousands of dollars in one such incident.

The ordeal began when his parents got a phone call from a man claiming to be a lawyer. He told them Benjamin had been involved in a car accident in which a U.S. diplomat was killed; as a result, their son was in jail and needed money for legal fees. The caller then put the supposed Benjamin on the phone, who said he loved them and asked for financial help.

A few hours later, the lawyer called Benjamin’s parents again, saying their son needed $15,449 before a court appearance later that day. Panicked, the couple rushed to several banks to gather the cash and sent it via a Bitcoin terminal. They later explained that although the call seemed unusual, they couldn’t shake the feeling that their son needed help.

The deception only became clear when the real Benjamin called his parents later that night. Perkin told The Washington Post he doesn’t know how the scammers found his audio likeness, though he has posted several YouTube videos about snowmobiling. The family filed a police report, but their money is long gone.

Will Maxson, an assistant director in the Federal Trade Commission’s Division of Marketing Practices, says tracking down the people behind AI voice scams is very difficult. Because the callers could be based anywhere in the world, it is often unclear which agency has jurisdiction over a given case. He advised people to remain constantly vigilant.

“If a loved one tells you they need money, put that call on hold and try calling your family member separately,” he said via The Washington Post. And even if a suspicious call comes from a familiar number, it could still be an AI voice scam, since caller ID can be spoofed.