Is AI being used for virtual kidnapping scams?
Categories: News
Tags: kidnap, scam, virtual, AI, voice, fake, fraud, hoax, kidnapping
We take a look at claims that AI is now being used for a notorious form of kidnapping hoax.
You may have seen a worrying report of Artificial Intelligence (AI) being used in a virtual kidnapping scam. The AI was supposedly used to imitate the voice of an Arizona resident’s daughter, who the caller claimed had been kidnapped. In reality, the daughter was safe and well elsewhere on a school trip. Unfortunately, having the daughter out of sight only made the scam seem more believable. Was she actually on the trip, or had she been kidnapped? With no way to know right away, all the parent could do was listen to a demand for $1 million and threats of terrible things happening to her daughter.
The scammers dropped the ransom to $50,000 after being told that the money simply wasn’t available. While all of this was going on, a friend of the family and law enforcement were able to confirm that the supposedly kidnapped daughter was in fact safe and well.
Virtual kidnapping scams have been around for many years, but this is a new spin on a well-worn technique.
The imitated child’s parent is convinced that some form of AI was used in this instance. To do this, scammers would have had to obtain some samples of the daughter’s voice. The samples would then have been fed into a machine learning algorithm which learned how she speaks, giving the scammers a computer program that can speak like the victim.
This technique certainly works, and can produce startling results. To hear for yourself, take a listen to podcast.ai, a podcast entirely generated by AI which features guests like the late Steve Jobs.
The case for AI
Can we be sure that what happened here was down to AI?
The victim claims that the voice was definitely that of her daughter. You would expect someone to recognise a fake or an imitation of their own child. Think how many celebrity impersonators you’ve heard on TV or elsewhere, and how many of them are actually good at it. More often than not, the slightest imperfections really stand out. Now apply this to a mother and her daughter. She’s going to have a very good idea what her offspring does and doesn’t sound like.
Subbarao Kambhampati, a computer science professor at Arizona State University, told the New York Post that it’s possible to spoof a voice in convincing fashion from just three seconds of audio.
According to the victim, her daughter has no social media presence to speak of, but has done a few short public interviews. In theory, this could be enough for the fraudsters to create a working facsimile of her voice.
None of this is proof that AI was used, but none of it rules out AI either.
The case against AI
Creating a replica voice from three seconds of audio sounds scary, but in practice things aren’t quite so cut and dried. We covered a great example of this a little while ago, involving a journalist logging into his telephone banking using AI voice replication. It’s definitely not an exact science: getting the voice right can take many attempts and samples, and requires an AI tool that can stitch everything together to an acceptable standard.
As for the mother’s claim that she recognised her daughter’s voice, that’s complicated. Understandably, she will have experienced a considerable level of panic when receiving the call, and that might have affected her ability to identify her daughter. CNBC wrote about the phenomenon of virtual kidnappings in 2018, before the current AI boom. In every case listed in its article, the person stuck on the phone is convinced the voice on the other end of the line belongs to whoever the fake kidnapper claims it to be. Teenage sons, younger daughters, men in their thirties… the horror of these calls has the victim pretty much ready to stand up in court and state that this was the real deal.
This “Yes, it’s them” effect has been happening for years, long before AI came onto the scene. Is that what happened in the AI kidnap scam above? And why would virtual kidnappers bother to replicate someone’s voice if the victim is going to believe it’s all real anyway?
Protection from virtual kidnap scams
Protecting yourself from this kind of attack doesn’t change much whether the person screaming down the phone is a human impersonator or a slice of AI. The basics remain the same, and social engineering is where a lot of these attacks take shape. It’s no coincidence that most of these stories involve the supposed kidnap victim being on holiday or away from the family home when the bogus call comes through. There are some things you can do to blunt the effect of virtual kidnap scams:
Be vacation smart. Avoid posting travel dates and locations that could add some fake legitimacy to a scammer’s call.
Make your data private. Revisit your online presence, and lock down or delete your data so scammers know less about you.
Agree on a codeword. Consider a password or phrase that family members can use to confirm they really are in danger.