Government officials are warning of the dangers of AI being used to pull off grandparent scams, also referred to as family emergency scams. In these scams, a fraudster pretends to be a grandchild or another family member who is in trouble and urgently needs money.
In a consumer alert titled "Scammers use AI to enhance their family emergency schemes," the U.S. Federal Trade Commission cautions:
All a scammer needs is a short audio clip of your family member's voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.
With the increasing availability of easy-to-use AI tools, the scam becomes even more believable when fraudsters clone the family member's actual voice.
Grandparent scams have been common in Canada for years, and last month Canadian authorities arrested 25 Canadians for using the scam to steal more than $21 million from seniors in 46 U.S. states.
The indictment, filed in the U.S. District Court in Vermont, describes an elaborate network of fraudsters who operated call centers in Montreal, placing calls to elderly victims in the United States. It says that between the summer of 2021 and June 2024, the accused used spreadsheets containing potential victims' names, addresses, ages, and estimated household incomes.
Using spoofed caller ID to make the calls appear local, "openers" would impersonate a grandchild or other family member, claiming they had been arrested after a car crash and needed money for bail. The victim was then handed off to a "closer," who posed as an attorney and persuaded the victim to give money to a bail bondsman who would come to the home to pick it up. If no bail bondsman was in the vicinity, victims were instructed to mail cash to a U.S. address, where it would be picked up and transmitted to Canada.
Although the indictment does not state whether the scammers used AI-generated voice clones in the calls, more details may be revealed as the case makes its way through the legal system.
Hiya has the perfect solution. Earlier this year, Hiya launched the Hiya AI Phone mobile app. The app not only continuously analyzes call audio to detect scam language and suspicious patterns; it also detects AI-generated voices, both live and recorded, by analyzing subtle audio patterns, and notifies users when an AI voice is detected.
The app also features call screening, where an AI assistant answers unknown calls, asking callers to state their name and reason for the call, and blocks scam and spam calls while connecting legitimate ones.
The technology behind the Hiya AI Phone is also available to Hiya partners to integrate into their own apps, devices, or network-based services.
Learn more about the Hiya AI Phone app and download a free two-week trial.