
Voices of Deception: Guide to Protecting Yourself from AI Voice Scams

by Aaron Gershwin, June 19th, 2023

Too Long; Didn't Read

A recent voice cloning scam in the U.S. raised cybersecurity concerns, as the unpredictable power of AI becomes evident. Already, 77% of voice scam victims suffer financial losses, and AI can take these scams to a new level. But AI is just one tool available to criminals. Learn how you can erase your personal data online.

A recent voice cloning scam in the U.S. raised cybersecurity concerns, as the unpredictable power of AI becomes evident. Here, I delve into the methods criminals use and how to avoid AI voice scam calls.


Jennifer DeStefano received a call from an unknown number. The words that came when she answered pierced her heart instantly: "Help me, mom, please help me." The Arizona-based mother had no doubt that she had just heard her 15-year-old daughter, who was away on a skiing trip. "It was the way she would have cried," DeStefano later told a local television station, describing the incident.


In a chilling twist, it turned out that the voice the mother heard was not her daughter's at all. It was an AI doppelganger, cunningly deployed in an audacious scam attempt. Luckily, the woman instinctively called her daughter's personal phone and exposed the sinister scheme.

The era of compelling voice cloning

This shocking account is merely a glimpse of the near future. Already, 77% of voice scam victims suffer financial losses, averaging $1,400 per person, and AI can take these scams to a new level. From a short audio sample (just three seconds is enough!), AI voice simulators can seamlessly craft voicemails and even render a real-time conversation. Able to mimic diverse accents and genders and to replicate the speech patterns of loved ones, these widely available and often free technologies make compelling deepfakes easy to create.


Voice scams can be taken to a new level with the help of AI.


The ease of personal data collection

Scammers have a wide range of tools at their disposal, and AI is just one of them. After all, what is a convincing voice call without a phone number to call?


YouTuber Jack Gordon recently went on whitepages.com to search for famous people. Gordon easily found the phone numbers he wanted and gave celebrities like Jimmy Kimmel and Charli D'Amelio a call. Jack's intention was merely to make an interesting video. Nevertheless, it serves as a reminder of how alarmingly accessible sensitive information such as home addresses, phone numbers, emails, and family connections can be.


People search sites like whitepages.com are just the tip of a vast data-mining iceberg. I don’t mean to sound bleak, but the reality is that websites, apps, and digital record databases amass all sorts of data. The legal exchange and sale of data between companies is a well-established practice: American companies alone are estimated to have spent over $19 billion on it in 2018, as reported by the Interactive Advertising Bureau. And while these practices are routine among companies, there is no guarantee that your personal data is safe from leaks or from being sold to scammers.

Exercising your rights to remove your data

Rest assured, not everything is doom and gloom. You can opt out of people search sites, and you have legal rights to request the deletion of your data from data brokers’ databases. Be prepared, though: getting companies to comply is time-consuming and complicated. In fact, the process has proven such a burden that you can now pay a service to do it on your behalf.


These services scan hundreds of databases and sites and send official requests to remove personal data. Many services, like Incogni, also run recurring scans to prevent an individual's information from being re-added. As the battle for data privacy rages on, such tools offer a glimmer of hope.
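
For readers who prefer the do-it-yourself route, here is a minimal Python sketch of what such a removal request might look like. The broker domain, contact address, and helper function are hypothetical placeholders; real opt-out channels differ per site and often involve web forms or identity verification rather than email.

```python
from email.message import EmailMessage

# Hypothetical broker contact for illustration only.
BROKER_CONTACTS = {
    "examplepeoplesearch.com": "privacy@examplepeoplesearch.com",
}

def deletion_request(broker: str, full_name: str, your_email: str) -> EmailMessage:
    """Draft a data-deletion request citing applicable privacy law."""
    msg = EmailMessage()
    msg["Subject"] = f"Personal data deletion request - {full_name}"
    msg["From"] = your_email
    msg["To"] = BROKER_CONTACTS[broker]
    msg.set_content(
        f"Hello,\n\n"
        f"Under applicable privacy law (e.g. CCPA/GDPR), I request the deletion of "
        f"all records relating to {full_name} ({your_email}) from your databases, "
        f"and written confirmation once the removal is complete.\n"
    )
    return msg

request = deletion_request("examplepeoplesearch.com", "Jane Doe", "jane@example.com")
print(request)  # sending is left out; smtplib.SMTP(...).send_message(request) would do it
```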

Social media puts biometrics on display

I already touched upon the fact that your own social media posts can later be turned against you. Makeup tutorials or unboxing videos expose your unique features to the world. From facial features and iris patterns to fingerprints and vocal nuances, these key types of biometric data can easily be extracted from posts on social networks, making them data storehouses for criminals.


Deepfake videos and AI-generated images can portray us doing just about any activity imaginable. One particularly notorious instance unfolded in March 2022, when a video emerged depicting Ukrainian President Volodymyr Zelenskyy ordering his soldiers to lay down their weapons and surrender. Although the audacious attempt was quickly debunked, it revealed what this technology makes possible.


Makeup tutorials or unboxing videos expose your unique features to the world.

Steps for safeguarding your identity online

While deleting all your social media profiles, throwing away your phone, paying exclusively in cash, and boycotting services like Uber would be the ultimate way to safeguard your privacy, there are more practical measures you can take.


Encrypted messaging. End-to-end encryption denies eavesdroppers access to your voice messages and leaves hackers unable to pry into your private conversations. Apple's iMessage, Meta's WhatsApp, and Signal all transform messages into an unreadable jumble for anyone but the intended recipient.
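
To illustrate the underlying idea, here is a minimal Python sketch of symmetric encryption using the cryptography library's Fernet recipe. It is only a toy demonstration of "unreadable without the key"; messengers like Signal use far more elaborate end-to-end protocols, and the sample message is made up.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # secret key, shared only with the intended recipient
cipher = Fernet(key)

token = cipher.encrypt(b"Call me back when you land, love you")
print(token)                          # gibberish to anyone who intercepts it

print(cipher.decrypt(token).decode())  # only the key holder recovers the message
```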


Controlling post resolution and timing. Current AI tools rely heavily on high-quality images. To safeguard your identity, simply reduce the resolution of images and videos before posting (see the sketch below). Also, don’t repeat the mistake that led to Kim Kardashian’s infamous Paris robbery: wait until you’ve returned home to share precious vacation memories.
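
As a rough illustration, this short Python sketch uses the Pillow library to downscale a photo before posting. The file names and target width are arbitrary placeholders; pick whatever resolution still looks fine on a phone screen.

```python
from PIL import Image

MAX_WIDTH = 720  # placeholder: low enough to blunt fine-detail extraction

img = Image.open("vacation_photo.jpg")
if img.width > MAX_WIDTH:
    ratio = MAX_WIDTH / img.width
    # Resize proportionally with a high-quality resampling filter
    img = img.resize((MAX_WIDTH, int(img.height * ratio)), Image.LANCZOS)
img.save("vacation_photo_lowres.jpg", quality=80)
```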


Limiting social media posting. Exercise caution with videos featuring your voice. Whenever possible, opt for videos that consist solely of a soundtrack and don’t reveal your voice. Additionally, consider sharing videos in formats that automatically vanish after a brief period, such as Instagram Stories or BeReal.

What to do if you receive a dodgy call?

When you receive a suspicious call, there are four effective strategies you can use:


  1. Cybercriminals rely on your emotional reaction. So, pause and reflect: does the voice truly resemble your loved one’s? Is the request something they would ask of you? Consider hanging up and contacting the person directly, as the Arizona mom mentioned earlier did.
  2. Scammers often use loose terms like “son” or “grandson” while describing the accident, so ask for more details: “Can you confirm my son’s name?” “Can you tell me the license plate number or the color of the car?” This will throw criminals off their script.
  3. If you receive a sudden call in a loved one's voice, ask for very personal details: “What did I tell you this morning?” “When is your father’s birthday?” While voices can be cloned, scammers are unlikely to know the intimate details of your life.
  4. In cases where vulnerable family members, such as the elderly or children, might be targeted, create a codeword within your family. Agree on a plan to always ask for it if a distressing situation arises.

Bottom line

Just as we developed ways to keep our houses safe from robbery, we must learn to protect our data from being taken and misused in the age of AI. Measures such as deleting personal information from people search sites with Incogni, limiting social media posting, encrypting your messages with iMessage, WhatsApp, or Signal, and exercising caution are simple yet powerful tools for minimizing the risk of falling victim to AI voice cloning scams.