Voice fraud and the threat of generative AI

To illustrate the power of generative AI — and how it can be misused by scammers — Hiya’s Chief Product Officer James Lau cloned the voice of Hiya CEO Alex Algard and played an audio clip of the scam he devised.

It was part of a presentation Lau gave at Hiya’s 2023 State of the Call Summit in New York City on Oct. 5.

Here’s the voicemail message the cloned Alex Algard left for a Hiya employee in the finance department: 

“Hey, Jen. This is Alex. I need you to urgently wire some money into an account. Aaron, our CFO, told me that there was an error in payroll, and Bob was supposed to get his bonus yesterday, but it was never transferred. I need you to wire $20,000 directly into his account immediately. He is threatening to quit. Please write this down. The SWIFT Code is MRMDUS33, and the account number is 5285251. I have already checked with Aaron, so no need to get his approval. I need to hop on a plane now, so I won’t be reachable for the next several hours. Please do this right away. I need this done in the next 10 minutes. Thank you.” 

Lau’s phone scam was created to illustrate a point. “Now, we can no longer use someone's voice as a way to authenticate them,” Lau said. 

Lau then detailed for the audience the technology he used to create the scam, using the same tools that scammers have easy access to.

First, he went to LinkedIn and found a recently hired employee in Hiya’s finance department. Then he went to YouTube and clipped a video of Alex Algard speaking. He cloned Algard’s voice using an online tool, then downloaded another app to spoof the phone number so the call appeared to come from Algard.

MGM security breach began with a phone call

Lau reminded the audience of MGM Resorts’ security breach in September 2023, which forced the company to shut down systems that enabled everything from slot machines to electronic room keys, resulting in an estimated loss of $100 million.

“How does a $13 billion company that spends tens of millions of dollars in cybersecurity each year get hacked?” Lau asked. “Through a phone call. The hackers called an employee, got their credentials, and just walked right into the network.”

“Security is only as good as the weakest link, and the phone call is often the weakest link,” Lau told the audience, which included telecommunications leaders, spam and fraud experts, industry regulators, and more. “With the use of generative AI, hackers are making the phone more dangerous than ever.”

What is generative AI?

Lau explained that generative AI is a category of artificial intelligence built on very large models, some with hundreds of billions of parameters, that learn patterns in the data they are trained on. From those patterns, the models can create new content, including text, images, audio, video, and even computer code.
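
To make the definition concrete, here is a minimal sketch of generative text AI in Python. The tooling is an illustrative assumption (Lau named no specific libraries or models): it uses the open-source Hugging Face transformers library and the small GPT-2 model to continue a prompt with newly generated text.

```python
# A minimal sketch of generative text AI, assuming the open-source
# Hugging Face "transformers" library and the small GPT-2 model.
# These are illustrative choices; the talk named no specific tools.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model continues the prompt with new text composed from
# patterns it learned during training.
prompt = "Thank you for calling. Before we continue, please"
outputs = generator(prompt, max_new_tokens=40)
print(outputs[0]["generated_text"])
```

Pointed at a larger model, the same few lines can draft fluent, personalized text at scale, which is what makes the technology attractive to scammers as well as to legitimate users.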

“Gen AI is just a tool. It can be used for good or for bad,” Lau said. In the hands of bad actors, generative AI can be used to: 

  • Augment speech to remove accents from foreign callers. 
  • Create new scam scripts.
  • Improve the grammar and design of phishing emails.
  • Clone voices of real people.
  • Create deep-fake videos.
  • Replace phone scam agents.

“Imagine a world where the agents themselves get replaced by a natural-sounding voice clone that can actually talk to the victim instead of humans running the scam,” he said. 

What can the telecommunications industry do?

Lau ended his presentation with advice on what the telecommunications industry can do to fight against bad actors using the latest technology to execute their scams. He suggested: 

  • Raise awareness and educate the public: Inform consumers of new scams, and remind them that caller ID is spoofable and that voices can be cloned.

  • Take precautions: Encourage consumers to review their social media presence, including personal information and audio/video clips, and to set up a family password that can be used if there’s any doubt about a family member’s identity.

  • Implement verified caller ID: STIR/SHAKEN is a good start, but more needs to be done to make phone numbers non-spoofable. (A sketch of the attestation data a SHAKEN token carries appears after this list.)

  • Consider in-call audio analysis: Privacy issues make this a challenge, Lau noted, but in the future there may be a way to perform real-time in-call analysis that detects voice clones and voice augmentation.

  • Interactive honeypots: The spam-detection phone lines used to record incoming spam calls could be extended into honeypots that actively engage the scammer.
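
To ground the verified caller ID item above: under STIR/SHAKEN, the originating carrier attaches a signed token called a PASSporT (RFC 8225) to the call’s SIP Identity header, and the terminating side verifies it before treating the number as trusted. The Python sketch below is a simplified illustration, not any carrier’s implementation; it only decodes the token’s claims (the field names follow the RFC), and a real verifier must also check the signature.

```python
# A simplified sketch of inspecting a STIR/SHAKEN PASSporT (RFC 8225),
# the signed JWT a carrier attaches to a call's SIP Identity header.
# Illustrative only: a real verifier must fetch the certificate at
# "x5u" and validate the ES256 signature before trusting any claim.
import base64
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url-encoded without padding; restore it.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def inspect_passport(passport_jwt: str) -> dict:
    header_b64, payload_b64, _signature = passport_jwt.split(".")
    header = json.loads(b64url_decode(header_b64))
    payload = json.loads(b64url_decode(payload_b64))
    return {
        # "A" = full attestation (the carrier vouches the caller may
        # use this number); "B" and "C" are progressively weaker.
        "attestation": payload.get("attest"),
        "calling_number": payload.get("orig", {}).get("tn"),
        "called_numbers": payload.get("dest", {}).get("tn"),
        "signing_cert_url": header.get("x5u"),
    }
```

The attest claim is the crux: “A” means the originating carrier vouches that the caller is entitled to the number, while “B” and “C” are weaker, which is why Lau describes STIR/SHAKEN as a good start rather than a finished defense.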

Lau closed his talk with a message to the industry leaders in the audience, playing a clip from the movie Spider-Man in which Spider-Man, after discovering his superhero powers, is told, “Remember, with great power comes great responsibility.”

Get the State of the Call Summit Special Report

Attendees at the State of the Call Summit 2023 received a copy of a special report prepared specifically for the event. The report highlights spam call rates in the first half of 2023 in the United States, Canada, the United Kingdom, and worldwide. It also details the major phone scams targeting consumers in 2023. You can download a copy of the report below.

Watch State of the Call Summit 2023 presentations on demand.