AI scam calls: How to detect them and protect yourself

You answer a random call from a family member, who breathlessly explains that they've been in a horrific car accident. They need you to send money immediately or they'll go to jail. You can hear the desperation in their voice as they plead for an immediate transfer of cash. Even though it sounds like them and the call is coming from their number, you have a feeling something isn't right. So you hang up and call them back right away. When your family member answers, they say there was no car accident and they have no idea what you're talking about.

Congratulations, you just successfully avoided an AI scam call.

As generative AI tools become more powerful, it's becoming easier and cheaper for scammers to create fake but convincing audio of people's voices. These AI voice clones are trained on existing clips of human speech and can be tweaked to imitate almost anyone. The latest models can even speak multiple languages. OpenAI, the maker of ChatGPT, recently announced a new text-to-speech model that could further improve voice cloning and make it more widely available.

Of course, bad actors are using these AI cloning tools to trick victims into thinking they are talking to a loved one on the phone when they are actually talking to a computer. While the threat of AI scams can be scary, you can stay safe by keeping these expert tips in mind the next time you receive an urgent, unexpected call.

Keep in mind that AI audio is difficult to detect

It’s not just OpenAI. Many tech startups are working on replicating near-perfect-sounding human speech, and progress has been rapid lately. “If it was a few months ago, we would have given you tips on what to look out for, like pregnant pauses or showing some kind of latency,” said Ben Colman, co-founder and CEO of Reality Defender. Like many aspects of generative AI over the past year, AI audio is now a much more convincing imitation of the real thing. Any security strategy that relies on detecting vocal quirks over the phone is outdated.

Hang up and call back

Security experts warn that it’s easy for scammers to make calls appear to come from legitimate phone numbers. “A lot of times, scammers will spoof the number they’re calling you from to make it look like they’re calling from a government agency or a bank,” said Michael Jabbara, global head of fraud services at Visa. “You have to be proactive.” Whether it’s from your bank or a loved one, any time you receive a call asking for money or personal information, ask to call them back. Find the number online or in your contacts and initiate a follow-up conversation. You can also try reaching them through another verified line of communication, such as video chat or email.

Create a secret safe word

A popular safety tip suggested by multiple sources is to devise a secret safe word that only your family knows, which you can ask for over the phone. “You can even pre-negotiate a word or phrase with your loved ones that they can use to prove their true identity if they’re under duress,” said McAfee CTO Steve Grobman. While calling back or verifying through another method of communication is best, a safe word can be especially helpful for younger people or older relatives who may be difficult to reach through other means.

Or just ask them what they’re having for dinner

What if you don’t have a safe word and are trying to figure out whether a distressing call is real? Pause for a moment and ask a personal question. “It can even be as simple as asking a question that only a loved one would know the answer to,” Grobman said. “It could be, ‘Hey, I want to make sure this is really you. Can you remind me what we had for dinner last night?’” Make sure the question is specific enough that a scammer couldn’t answer it correctly with an educated guess.

Understand that any voice can be imitated

Deepfake audio cloning isn’t just reserved for celebrities and politicians, like the robocalls in New Hampshire that used an AI tool to sound like Joe Biden and discourage people from voting. “One misconception is: ‘It can’t happen to me. No one can clone my voice,’” said Rahul Sood, chief product officer at Pindrop, the security company that identified the likely source of the AI Biden audio. “People don’t realize that just 5 to 10 seconds of your voice, from a TikTok or a YouTube video you might have made, can easily be used to create a clone of you.” With AI tools, even the outgoing voicemail message on your smartphone might be enough to replicate your voice.

Don’t give in to emotional appeals

Whether it’s a pig butchering scam or an AI phone call, experienced scammers can build trust, create a sense of urgency, and find your weaknesses. “Be wary of any engagement that makes you feel strong emotions, because the best scammers are not necessarily the most skilled technical hackers,” Jabbara said. “But they have a really good understanding of human behavior.” If you take a moment to reflect on the situation and resist the urge to act immediately, you can avoid being scammed.
