AI used to fake voices of loved ones in “I’ve been in an accident” scam

Scammers making “shock calls” are using AI to emulate the voice of a loved one claiming to have been in an accident.


The San Francisco Chronicle tells a story about a family that almost got scammed when they heard their son’s voice telling them he’d been in a car accident and hurt a pregnant woman.

Sadly, this is becoming more common. Scammers want to spread panic among their victims, so they feign an emergency: a car accident, an unexpected hospitalization, or any other scenario that instantly causes concern and pushes victims to act quickly.

Sometimes it’s a pregnant woman who is hurt; sometimes it’s a diplomat.

In the early days of scams like these, success depended largely on the criminal’s social engineering skills, but rapid advances in Artificial Intelligence (AI) mean scammers can now easily and convincingly fake the voice of the relative who is supposedly the victim of the accident.

What better way to make the victim believe that something bad has happened than to have them hear their loved one cry out for help in their own voice? With the help of various AI-powered tools, criminals can easily compose audio fragments based on a short voice clip they found online.

FBI Special Agent Robert Tripp said:

“Now criminals can fabricate a voice using AI tools that are available either in the public domain for free, or at a very low cost.”

The criminals will keep that part of the communication short, so the target is unable to ask the relative any questions about what happened. While it is possible to fake entire conversations with the help of AI, the tools that can do that are much harder to operate. The criminal would have to type out the responses very quickly and the target might get suspicious. In the story from the San Francisco Chronicle, the phone was “taken over” by the so-called police officer at the scene of the accident, who told the parents that their son would be taken into custody.

This was later followed by a cold call from someone posing as a legal representative for their son, asking for bail money. The intended victims got suspicious when the so-called lawyer said he’d send a courier to pick up the bail money.

The FBI says it received more than 195 complaints about this type of scam, which it refers to as “grandparent scams,” accounting for nearly $1.9 million in losses from January through September of 2023.

How to avoid scams like this

One of the imposters’ main tools is the amount of information they can gather about the target. Their main sources include social media and phishing.

There are a few simple precautions to lessen the chances of falling victim to such scams:

  • Avoid letting third parties know too many personal details about you.
  • Do not answer telephone calls from numbers you do not recognize or calls from private numbers.
  • Ask for the caller’s telephone number and check it.
  • Try to reach the supposedly affected family member yourself by phone. If you can’t reach them directly, call someone else who might know where they are.
  • Hang up if in doubt and never give financial or personal information to the caller.
  • If you have received such a call, notify the police immediately, and under no circumstances respond to the perpetrators’ demands.

We don’t just report on threats – we help safeguard your entire digital identity

Cybersecurity risks should never spread beyond a headline. Protect your—and your family’s—personal information by using Malwarebytes Identity Theft Protection.
