AI Just Cloned A Girl’s Voice In Million-Dollar Kidnapping Scheme

A woman in Arizona nearly fell for a kidnapping scheme in which her young daughter's voice was perfectly replicated by AI.

By Sean Thiessen | Updated

AI is being used in a new kidnapping scam. A report by the New York Post detailed the story of one mother who received a phone call demanding a ransom for her daughter, whose voice cried for help in the background. Her daughter turned out to be safe and sound; her voice on the phone had been cloned by artificial intelligence.

Jennifer DeStefano recalled the details of the AI kidnapping scheme. She had just sent her 15-year-old daughter, Brie, on a ski trip. She later received a horrifying phone call.

“I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano recalled. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

Brie’s voice came through with all of her natural inflection and emotion. AI had recreated it so convincingly that her mother had no doubt the kidnapping was real.

Then a man’s voice told Brie to lie down. He spoke to Jennifer directly, saying that he would drug Brie, assault her, and leave her in Mexico if Jennifer did not pay a sum of one million dollars.

DeStefano trembled through the call while the AI-generated voice of her daughter cried for help in the background to make the kidnapping seem real. She explained that she did not have that much money, and the alleged kidnapper lowered his demand to $50,000.

DeStefano was out at an activity with her other daughter. She got help from one of the other mothers there and, after calls to 911 and to her husband, confirmed that Brie was safe on her ski trip.

Even after the situation was resolved, DeStefano was in shock. AI had so convincingly replicated her daughter’s voice that she believed every second of the kidnapping scam. Jennifer DeStefano is not the first person to encounter such a situation.

Phone scammers have been using AI to fool unsuspecting targets into thinking a kidnapping has occurred. According to Subbarao Kambhampati, an AI expert at Arizona State University, perpetrators need only three seconds of audio to convincingly reproduce someone’s voice. With larger sample sizes, the accuracy of the reproduction increases.

Oftentimes, predators find their victims and the voice samples they need on social media. The more of a person's voice is out in the world, the more vulnerable they are to scammers using AI to stage a fake kidnapping.

For Jennifer and Brie DeStefano, that was not the case. According to Jennifer, Brie has a very limited online presence; you would be hard-pressed to find her voice on social media. However, Brie does have interviews online where she talks about her achievements in school and sports.

Special Agent Dan Mayo of the FBI’s Phoenix office says that the more information people have about themselves online, the more susceptible they are to these types of scams. However, he also offers tips on how to beat the AI kidnapping scheme.

Anyone who gets a call like the one Jennifer DeStefano received should ask the caller questions that only someone actually holding the supposed victim could answer. An emergency word shared among family members can also serve as confirmation when a suspicious call like this comes in.

Threats like the one made against the DeStefano family have been reported several times recently. As technology evolves, so do those who abuse its power. In the ever-changing world of AI, people will have to take more and more precautions to avoid these dangerous, exploitative situations.