AI Voice Scams On The Rise

Many cannot tell the difference between AI and the real thing
Photo: MoonSplinters | Shutterstock

With the increasing popularity of artificial intelligence, the tools used for scams have become more accessible, resulting in a rise in cybercrime.

U.S. authorities have been rattled by a new type of scam that employs readily accessible AI voice cloning tools, which convincingly replicate voices — usually of loved ones.

“These tools help fraudsters leave misleading voicemails and can even change their voices on phone calls,” a security expert from VPNOverview said. “Essentially, the aim of the scam is to steal from people by impersonating family members.” 

A recent survey revealed that 70% of people may not be able to tell the difference between a cloned voice and the real thing.  

According to the Economic Times, some scammers pretend to be a person's child in distress — usually in a kidnapping scenario. The scammer using the AI then asks the victim to send money.

Naturally, for a clone to exist, an original voice is required and cybercriminals face no challenges in obtaining original voice files to create these clones. Research from McAfee revealed that 53% of adults surveyed admitted to sharing their voice data online or in recorded notes at least once a week, while 49% reported doing so up to ten times a week.

“The only thing the scammer needs is a short audio clip of your family member's voice, which could easily be accessed through content posted online, and by using a voice-cloning program, the clip can be used to sound like a real-life call from a loved one,” the security expert said. 

Among the individuals who reported financial losses, 36% stated that they were scammed out of amounts ranging from $500 to $3,000, whereas 7% fell victim to scams involving sums ranging from $5,000 to $15,000.

To avoid falling victim, experts advise being cautious of calls from unknown numbers. If you suspect a caller is impersonating a loved one, hang up and call the person back at their actual phone number.

If you believe you’ve encountered an AI-generated voice scam, report it to the Federal Trade Commission (FTC) at ReportFraud.ftc.gov and your local law enforcement.


In case you missed it, here's Local Profile's report on tax scams.