The Loneliness Epidemic Is a Security Crisis
Romance scams cost victims hundreds of millions of dollars a year. As people grow increasingly isolated, and generative AI helps scammers scale their crimes, the problem could get worse.
Loneliness has never been a more urgent problem. Beyond its significant mental health toll, the fact that people are lonelier and having fewer social interactions is fueling very real security threats. Foremost among these is one of today’s most pernicious digital frauds: romance scams, which exploit targets’ feelings of isolation and net fraudsters hundreds of millions of dollars per year. As scammers increasingly organize their workflows and incorporate new AI technologies, it’s becoming possible for them to deploy these scams at an even greater scale.
Romance scams, also known as confidence scams, are extremely communication-intensive. They require attackers to build relationships with their targets via dating apps and social media. So while generative AI chatbots are already being used to write scripts and converse in multiple languages for other types of fraud, they can’t quite pull off these romance scams on their own. But with the vulnerable population growing, researchers believe automation has real potential to become a boon for scammers.
“These frauds are growing into a more organized form,” says Fangzhou Wang, an assistant professor researching cybercrime at the University of Texas at Arlington. “They are hiring individuals from all over the world, meaning that they can target all different kinds of victims. Everybody is using dating apps and social media. There are all these opportunities that give fraudsters fertile ground.”
Romance fraud is already big business for scammers. People in the US have reported losses of nearly $4.5 billion to romance and confidence fraud over the past decade, according to an analysis of the last 10 years of data from the FBI’s annual internet crime reports. (The most recent data available encompasses up to the end of 2023.) According to the FBI’s figures, romance and confidence scams have led to losses of around $600 million for each of the past five years—except for 2021, when losses peaked at almost $1 billion. Some estimates are even higher. And while there has been some decrease in the amount of money lost to romance scammers in recent years, there has been a rise in so-called pig butchering fraud, which often contains elements of confidence scams.
Romance scams begin all over the internet, from criminals blasting out messages on Facebook to hundreds of victims at a time, to others matching with every profile they see on dating apps. A variety of criminals run romance scams, from “Yahoo Boys” in West Africa to giant scam compounds in Southeast Asia. However, once a criminal has made contact with a potential victim, they all follow an eerily similar playbook to build emotional attachment with those they are attempting to defraud.
“Romance fraud is the most devastating fraud to be a victim of, bar none,” says Elisabeth Carter, an associate professor of criminology at Kingston University London, who has extensively studied these scams and their impacts on people.
Online dating took years to integrate into mainstream conceptions of relationships and love, but it is now the norm. As generative AI chatbots have found their way onto scores of smartphones, they have quickly become yet another digital avenue for romance and connection. While it would be difficult with current technology to farm out a romance scam to a chatbot entirely, the potential is clearly there for attackers to use generative AI to create scam scripts and fill in content for an ever-growing number of chats running simultaneously, even in multiple languages.
UTA’s Wang notes that while she hasn’t assessed whether scammers are using generative AI to produce romance scam scripts, she is seeing evidence that they are using it to produce content for online dating profiles. “I think it is something that has already happened, unfortunately,” she says. “Scammers right now are just using AI-generated profiles.”
Some criminals in Southeast Asia are already building AI tools into their scamming operations, with a United Nations report in October saying organized crime efforts have been “generating personalized scripts to deceive victims while engaging in real-time conversations in hundreds of languages.” Google says scam emails to businesses are being generated with AI. And separately, the FBI has noted, AI allows criminals to more quickly message victims.
Criminals will use a range of manipulation tactics to entrap their victims and build up their perceived romantic relationships. This includes asking intimate questions of their potential victims that only a trusted confidant would ask—for example, questions about relationships or dating history. Attackers also build intimacy through a technique known as “love bombing,” in which they use terms of endearment to try to rapidly advance a feeling of connection and closeness. As romance scams progress, it is very common for attackers to start saying that victims are their girlfriend or boyfriend, or even call them “husband” or “wife” as a way of signaling their devotion.
Carter emphasizes that a core tactic used by romance scammers is to make their heartthrob personas seem hapless and vulnerable. Criminals lurking on dating apps, for example, will sometimes even claim that they were previously scammed and are wary of trusting anyone new. This names the elephant in the room right away and makes it seem less likely that the person the victim is chatting with could be a scammer.
When it comes to extorting money from their victims, this vulnerability is crucial. “They will do things like explain that they have some kind of cash-flow problem in their business, not ask for money, drop it, then maybe a few weeks later bring it back up again,” Carter says. At that point, she explains, the person being manipulated may want to help and proactively offer to send money. Attackers may even go so far, at first, as to argue with victims and attempt to dissuade them from sending funds, all to manipulate targets into believing that it is not only safe but also important to take a stand and assist someone they care about.
“It’s never framed as the perpetrator wanting money for themselves,” Carter says. “There is a real link between the language of fraud criminals and the language of domestic abusers and coercive controllers.”
In a lot of cases, criminals find romance scam success with people who are struggling with feelings of loneliness, says Brian Mason, a constable with the Edmonton Police Service in Alberta, Canada, who works with the victims of scams. “Especially with romance scams, it’s very difficult to convince the person that the person they’re speaking with is not in love,” he says.
Mason says that in one instance he spent two years working with a victim of a romance scam and found out, when updating them on the case, that they had been back in touch with their scammer. “He looped her back in and got her to start sending money again, and she was doing it just so she could see his photos, because she was lonely,” Mason explains. At the end of 2023, the World Health Organization declared high levels of loneliness to be an ongoing threat to people’s health.
Stigma and embarrassment are major reasons that it can be difficult for victims to accept the reality of their situation. And Kingston’s Carter notes that attackers exploit this from the start by telling victims that their conversations should stay between them, because the relationship is too special and no one will understand. Keeping the relationship secret, combined with tactics to trick the victim into offering money rather than asking for it, can make it difficult for even the most careful, thoughtful person to grasp the manipulation that’s happening.
Scammers “dull down red flags and alarm bells; they hide them,” Carter says. “The victim not only has a lot of money taken from them, but it’s taken from them by the person that they love and trust the most in that moment. Just because it’s online, just because it was completely fake, doesn’t mean it wasn’t real to them.”