AI Scam Using Your Family’s Voices

By Matthew Flynn

In another example of the dark side of artificial intelligence, an alarming scam has emerged that uses state-of-the-art voice cloning. Scammers are harnessing the technology to mimic the voices of your loved ones, threatening harm unless they receive payment.

AI Scam Impersonating Family

A story detailed in The New Yorker tells of an AI scam targeting a woman in Scottsdale, Arizona. She was at her younger daughter's dance studio when her phone rang. The voice on the other end, sobbing and fearful, sounded exactly like her older daughter and claimed to be held captive.

Threatening Harm

A man with a Spanish accent then threatened harm if she sought help and demanded a million dollars in ransom, later lowering the demand to fifty thousand.

Fortunately, the woman avoided becoming a financial victim of the scam when she discovered that her daughter was safe at home with her husband.

Voice cloning technology has come a remarkable distance in a short time. It has advanced to the point that a scammer can duplicate a person's voice from just seconds of audio.

Mimicking Any Voice

In 2019, Dessa, a Toronto-based company, recreated the voice of popular podcaster Joe Rogan, though the feat demanded substantial money and large amounts of audio.

ElevenLabs, a New York-based firm, made another leap in 2022 when it unveiled a service capable of mimicking any voice quickly and in over two dozen languages.

Unlawful Activities

The technology, while remarkable, cuts both ways. It is being used for noble causes, such as restoring speech to people with voice-robbing illnesses and powering AI-based memorial services.

However, it is also being misused for unlawful activities, one of which is this AI scam in which fraudsters mimic the voices of victims' loved ones to extort money.

According to a 2022 report from the Federal Trade Commission (FTC), such voice cloning scams robbed Americans of more than two million dollars.

AI Impersonation

Under current law, voices do not fall under copyright protection, leaving victims of voice cloning with little legal recourse.

Evidently, there is an urgent need for legislation to prevent the misuse of this advanced technology.

Legislators have responded with proposals such as the QUIET Act, which seeks to stiffen penalties for AI impersonation and to classify voice cloning as a weapon when it is used in the commission of a crime.

No Foolproof Solution

The FTC is committed to finding safeguards against voice cloning scams and has offered prizes to spur the development of protective techniques. So far, no foolproof solution has emerged.

The frequency of this scam and other such schemes is hard to quantify, but growing anecdotal evidence points to an increase. For now, individuals and families are best served by taking precautions of their own to avoid becoming victims.