Happy Valentine’s Day. Don’t let romance scams, which are at an all-time high during the holidays, break your heart.
These scams cost Americans $3 billion last year alone. This is almost certainly an undercount, given the particular reluctance of victims to report falling for such schemes.
Many romance scams fall under the umbrella of so-called “pig butchering” scams, in which scammers build relationships with victims and gain their trust over long periods of time. The nickname is a crude reference to fattening a pig before slaughter: scammers cultivate the target over months, then repeatedly extract money from them. Between 2020 and 2024, these scams defrauded people around the world of more than $75 billion.
Now, AI is making these scams increasingly accessible, affordable, and profitable for scammers. In the past, romance scammers had to have a good command of the English language if they wanted to effectively scam Americans. According to Fred Heiding, a postdoctoral researcher at the Harvard Kennedy School who studies AI and cybersecurity, AI-enabled translation has completely removed that obstacle, and scammers now have millions more potential victims at their disposal.
AI is fundamentally changing the scale of these operations, acting as a force multiplier for fraudsters. A single person who used to run a few scams at a time can use these toolkits to run 20 or more simultaneously, Chris Nyhuis, founder of the cybersecurity firm Vigilant, told me by email. AI-assisted scams are significantly more profitable than traditional ones and are becoming cheaper and easier to execute.
On the dark web, scammers can purchase romance scam toolkits complete with customer support, user reviews, and tiered pricing packages. These toolkits come with pre-made fake personas with AI-generated photo sets, conversation scripts for each stage of the scam, and deepfake video tools, Nyhuis told me. “The skill barrier to entry has virtually disappeared.”
I wondered whether AI might automate romance scammers out of a job, but the Kennedy School’s Heiding told me that “often it’s just augmentation, rather than complete automation.” Many of the scammers are also victims, with at least 220,000 people trapped in scam centers in Southeast Asia and forced to defraud their targets, facing horrific abuse if they refuse. For the crime syndicates that run these centers, leveraging AI probably just means better profit margins, Heiding said.
For now, there is a human being behind the scenes of these scams, even if they are just pressing start on an AI agent; beyond that, much of the process can be automated. At the moment, Heiding told me, AI isn’t much better than human romance scammers, but the technology is evolving quickly. In 2016, Google DeepMind’s AlphaGo decisively beat the world’s best human Go player. Forecasters believe AI will soon far surpass human predictive abilities as well.
“I wouldn’t be surprised [if] within a few years or a decade, we have artificial intelligence scammers who simply think in completely different patterns than humans,” Heiding said. “And unfortunately, they’ll probably be very, very good at persuading us.”
What does love have to do with this?
Romance scams are unique: They target a fundamental human need for love and connection. You may have heard that we are in a loneliness epidemic, officially declared by the US Surgeon General in 2023, with health risks equivalent to smoking up to 15 cigarettes a day. Social isolation is linked to higher rates of heart disease, dementia, depression, and even premature death, and 1 in 6 people worldwide reportedly feel lonely. Lonely people are prime targets.
Scammers send initial AI-generated messages to potential victims. Over time, they use lovebombing techniques to convince targets that they are in a romantic relationship. Once trust is established, they request money through methods that are difficult to trace or recover, such as gift cards, bank transfers, or cryptocurrency. They often fabricate crises that require urgent transfers. After getting what they want, scammers may vanish, or they may string the victim along for further payouts.
AI romance scams use fake video calls, “cheap fake” social media profiles, and voice cloning technology like other AI-enabled scams to lure people in. But according to Nyhuis, they are “exceptionally dangerous because of what they exploit. Phishing uses urgency; tech support scams use fear. Romance scams use love, which can make people think irrationally or override their instinct that something is wrong.”
Older adults often experience social isolation and are frequently targeted by romance scammers. Retirement and grief can create circumstances that scammers deliberately manipulate, making victims feel seen and cared for, even as they are robbed of their life savings and the homes where they planned to spend their retirement years. But anyone can be fooled by these scams. Despite being digital natives, members of Generation Z are three times more vulnerable to online scams than older generations, partly because they spend so much time online, although they tend to have (and therefore lose) less money than older victims.
Here’s something else that will break your heart: Scam victims are more likely to be targeted again. Scammers create profiles of their targets and sometimes add them to “sucker lists” shared among criminal networks. Victims of other crimes are also more likely to be revictimized, and being a victim of a romance scam is not a moral failing on the part of the target.
But it is something to be on guard against, as the vast majority of scam victims will not be able to get their money back. About 15 percent of Americans have lost money to online romance scams, and only 1 in 4 were able to recover all of the stolen funds.
Romance scams thrive on shame and secrecy. Sometimes victims are blackmailed and told that if they confide in the people in their lives, the scammers will expose sensitive information. Sanchari Das, assistant professor and AI researcher at George Mason University, and Ruba Abu-Salma, senior lecturer in computer science at King’s College London, received a Google Academic Research Award to study AI-powered romance scams targeting older adults in 13 countries. Their research examines how artificial intelligence tools can amplify traditional scam tactics and how families and communities can better support victims.
The researchers are partnering with gerontological societies and aim to create educational tools to support victims of AI romance scams. There is already plenty of information on prevention, but very little that tells victims what to do next.
Like so many people, I met my partner online. I’m grateful that we started dating in the late 2010s, before the explosion of AI-generated profiles on dating apps and sites.
AI is getting better at fooling people across the board. It has improved greatly at rendering hands, a previously reliable tell for deepfakes, and is learning from its mistakes. “As these technologies improve, traditional signals to detect tampering are no longer dependable,” Das said. “At the same time, we are leveraging AI to counter these threats by detecting scam patterns, predicting emerging tactics, and strengthening protection responses. The goal is to build systems and communities that are as adaptable as the technology itself.”
Society is also becoming increasingly desensitized to romance with AI. One study found that nearly a third of Americans had had an intimate or romantic relationship with an AI chatbot. The 2013 film Her, in which a man falls in love with an AI voiced by Scarlett Johansson, was set in 2025. It wasn’t far off.
AI chatbots are specifically designed to keep people engaged. Many use a “freemium” model, where basic services cost nothing but longer conversations and more personalized interactions carry a premium. Some “companion bots” are designed for users to form deep connections. Even though users know their “partner” is an AI, these companion bot apps sell user data for targeted advertising and are not transparent about their privacy policies. Isn’t that also a kind of intimacy scam, a way to extract resources from lonely people for as long as possible?
There are steps you can take to protect your heart, your wallet, and your peace of mind. It seems obvious, but refusing to send money to someone you haven’t met in person will stop a romance scam cold. You can also insist on spontaneous video calls and ask the person on the other end to do something random; deepfakes still struggle with unscripted actions.
“Be suspicious of anyone you’ve never met in person — that’s the only safe approach in a digital world increasingly full of scams,” said Konstantin Levinzon, co-founder of free VPN service provider PlanetVPN, in a press release. “If someone you meet on a dating site seems suspicious, do a reverse image search to check if their photos are stolen from other sources. And if the conversation turns to money, or if someone asks you for personal information, leave the conversation immediately.”
You can also use a VPN to hide your location, as scammers can track users’ locations and tailor their scams to the target’s city or country. If you are scammed, promptly reporting it to the FBI’s Internet Crime Complaint Center, the Federal Trade Commission, and your bank increases the chances that you will be able to recover the stolen funds. Several nonprofit organizations offer support to victims of romance scams.
“No matter how alone you feel now, no matter how ashamed you are, you will recover from this and one day you will look back and see how you got through it,” Nyhuis said. “These scammers are good at taking away hope. Don’t let them take it away from you.”

