
The rise of AI: voice impersonation scams

Phone scams have long plagued the voice channel, and now, with the help of AI, they are evolving. AI-powered voice scams are on the rise, making it easier for fraudsters to deceive unsuspecting individuals. But scammers aren’t the only ones using AI. Read on to see how Hiya is using AI to fight back against the latest threats.

The evolution of voice impersonation scams

Phone scams have been a persistent global problem for years, often involving bad actors impersonating trusted entities to steal money or trick individuals into providing sensitive information. In the U.S. alone, consumers lose $42.5 billion each year to phone fraud. However, with the rise of AI technology, scammers have a new tool to make their deceptions even more convincing.

According to the news website Axios.com, “Generative AI has lowered the bar for cybercriminals looking to clone someone's voice and use it in their schemes. Cybercriminals now need as little as three seconds of someone's voice to successfully clone it and make it usable in a scam call.”

Examples of AI-powered voice impersonation scams

AI-powered voice scams can take various forms, such as impersonating government officials, medical professionals, or service providers. By leveraging AI-generated voices, scammers can manipulate emotions, tone, and intonation, making it difficult for victims to detect the illegitimacy of the call.

News reports from around the world tell of voice-impersonated “loved ones scams.” Loved ones scams — where a fraudster calls a parent or grandparent pretending to be their child or grandchild in need of immediate financial assistance — are nothing new. But the new twist is that the fraudster can now use an AI-generated voice clone to mimic the child’s actual voice. Here are a few examples:

What Hiya users are reporting

Consumers who receive spam protection from Hiya via their device manufacturer or the Hiya mobile app can report scams to Hiya. While users have reported loved ones scams, none of them have mentioned that the scammer was using a voice clone.

Hiya has, however, received user reports describing robocalls that appear to use interactive AI voice response to carry out the scam. Here is a report from a user who describes the call in great detail:

“This was NOT a robocaller, but actually AI. It was intelligent, and answered my questions after a brief pause. If not for the pauses, I would have been convinced it was a person. They said they were with Kentucky State Police doing a drive to raise money for police and wounded warriors. The person that “called” had a strong, demanding and manipulative voice and way of talking that made you feel pressure to give. I asked if it was a recording and there was a slight pause, and he said “I assure you this is a live person calling”.  It lied to me.” 

The person reporting the call was amazed at how realistic the conversation seemed. The only clue was the brief pause before answering a question. The report continued:

“The call had a strange and unsettling feel to it, but I had to listen very carefully to responses to tell it was AI that was learning from me. I asked if this was a recording again, and it said “I assure you you’re talking to a live person”. When I told my daughter it said this, she said “why would a human say it like that? No human would say that”. Finally it said, “I understand, we can mail you a packet, but I just need to gather some information, can we start with your first and last name?” I said,  “no thank you, this is a recording or a computer. Goodbye.”

Using AI to fight AI-powered scams

Fortunately, there are ways to mitigate the risks posed by AI-powered voice scams. Mobile carriers can protect their subscribers by implementing Hiya Protect on their network. Hiya Protect leverages the industry’s only self-learning spam protection system, Adaptive AI, to detect spam and fraud calls in real time. Adaptive AI analyzes every component of a phone call: the phone number used to make the call, the caller making the call, the recipient of the call, and the call itself. 
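To make the idea of multi-signal call analysis concrete, here is a minimal, purely hypothetical sketch in Python. The signal names, weights, and thresholds below are invented for illustration only and do not describe Hiya Protect or Adaptive AI internals; they simply mirror the four components named above (the phone number, the caller, the recipient, and the call itself).

```python
# Illustrative sketch only: a toy multi-signal call scorer.
# All names, weights, and thresholds are hypothetical; a real system would use
# continuously updated learned models rather than a fixed weighted sum.

from dataclasses import dataclass


@dataclass
class CallSignals:
    number_reputation: float   # 0.0 (clean history) .. 1.0 (heavily reported number)
    caller_behavior: float     # e.g. unusually high or bursty outbound call volume
    recipient_pattern: float   # e.g. recipient fits a pattern of targeted subscribers
    content_risk: float        # e.g. audio or transcript resembles known scam scripts


def score_call(signals: CallSignals) -> str:
    """Combine the four signals into a single verdict: allow, warn, or block."""
    weights = (0.35, 0.25, 0.15, 0.25)  # hypothetical weights, sum to 1.0
    values = (
        signals.number_reputation,
        signals.caller_behavior,
        signals.recipient_pattern,
        signals.content_risk,
    )
    risk = sum(w * v for w, v in zip(weights, values))
    if risk >= 0.8:
        return "block"   # drop the call before it reaches the subscriber
    if risk >= 0.5:
        return "warn"    # label the call as suspected spam
    return "allow"


# Example: a call from a frequently reported number using a scam-like script.
print(score_call(CallSignals(0.9, 0.7, 0.4, 0.8)))  # -> "warn"
```

The design point the sketch is meant to convey is that no single signal decides the outcome; the verdict comes from weighing evidence about the number, the caller, the recipient, and the call content together.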

Adaptive AI turns the tool of AI against scammers to detect threats so subscribers can be warned about suspicious phone calls or have them blocked entirely. It is self-learning and constantly adapting to the latest threats, so as scammers evolve their tactics and try to thwart spam protection systems, Hiya Protect can still detect them.

With the emergence of AI-based threats, the only way carriers can future-proof their networks is with spam protection that also uses AI. Scammers are constantly adapting, and unless carriers are also constantly adapting, they will fall behind.

Author: Andrea Moreno

Carrier Customer Marketing Manager