The saying "technology is a double-edged sword" has gained a new interpretation recently—scammers are using AI for face-swapping to commit fraud, but victims can also use AI to save themselves.
A bizarre case was just reported in the Bay Area, USA. A woman met a "boyfriend" on a social platform. He portrayed himself as a crypto investment expert, and after months of building a connection, he started recommending a so-called "insider investment platform." Fake profit screenshots kept coming, and the woman became interested. She started by testing the waters with $15,000, then withdrew $490,000 from her retirement savings, and finally even mortgaged her house for $300,000. After she had transferred the money in batches to a Malaysian account, the platform suddenly changed its tune: want to withdraw? Pay a fee first. Account flagged as abnormal? Top up to unfreeze it. It's the classic pig-butchering scam, and nearly $1 million was swindled away in total.
But here's the twist: although the woman had been ensnared, she hadn't completely lost her judgment. She compiled all her chat records with the "boyfriend," along with screenshots of the platform, and sent them to ChatGPT with a simple question: "Does this seem legit?"
The AI's response hit the nail on the head: this was a textbook "romance + crypto investment" combo scam. The scammer first builds emotional dependency, then exploits the anonymity and cross-border nature of cryptocurrency to move funds, and finally uses a fake platform to create the illusion of profit. ChatGPT not only explained the scheme but also listed several key red flags: the platform's domain had been registered only recently, customer service responses were templated, and the withdrawal rules kept changing.
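Of those red flags, domain age is the easiest to verify yourself. Here's a minimal sketch of such a check, assuming the third-party python-whois package is installed; the domain name and the 180-day threshold below are made-up placeholders for illustration:

```python
from datetime import datetime

import whois  # provided by the third-party python-whois package


def domain_age_days(domain: str) -> int | None:
    """Return the domain's age in days, or None if no creation date is on record."""
    record = whois.whois(domain)
    created = record.creation_date
    # python-whois may return a list when the registry reports several dates.
    if isinstance(created, list):
        created = min(created)
    if created is None:
        return None
    return (datetime.now() - created).days


# Hypothetical domain purely for illustration; a real check would use
# the platform's actual address.
age = domain_age_days("insider-invest-platform.example")
if age is None:
    print("No creation date on record - itself a warning sign.")
elif age < 180:  # arbitrary, illustrative threshold
    print(f"Domain is only {age} days old - a classic scam red flag.")
else:
    print(f"Domain has been registered for {age} days.")
```

A quick WHOIS lookup won't catch every scam, but a trading "platform" whose domain is only a few weeks old deserves immediate suspicion.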
The irony in this case is that while the scammer may have used AI to generate their scripts, the victim also relied on AI to uncover the scam. Although the money couldn't be recovered, at least she stopped her losses in time and didn't sink any deeper.
Ultimately, technology itself is neutral. A knife can chop vegetables or hurt people; AI can help scammers polish their pitch, but it can also help ordinary people avoid traps. Next time you're faced with a "guaranteed investment opportunity," maybe ask ChatGPT first; it's a lot more reliable than asking your friends.
The saying "technology is a double-edged sword" has gained a new interpretation recently—scammers are using AI for face-swapping to commit fraud, but victims can also use AI to save themselves.
A bizarre case was just reported in the Bay Area, USA. A woman met a "boyfriend" on a social platform. He portrayed himself as a crypto investment expert, and after months of building a connection, he started recommending a so-called "insider investment platform." Faked profit screenshots kept coming, and the woman became interested. She started by testing the waters with $15,000, then withdrew $490,000 from her retirement savings, and finally even mortgaged her house for $300,000. After transferring the money in batches to a Malaysian account, the platform suddenly disappeared—want to withdraw? Pay a fee first; account abnormal? Add more money to unfreeze it. It's the classic pig-butchering scam, and nearly $1 million was swindled in total.
But here's the twist: although the woman was ensnared, she hadn't completely lost hope. She compiled all her chat records with the "boyfriend" and screenshots of the platform, then sent them to ChatGPT with a simple question: "Does this seem legit?"
AI's response hit the nail on the head: this was a textbook "romance + crypto investment" combo scam. The scammer first builds emotional dependency, then exploits the anonymity and cross-border nature of cryptocurrency to move funds, finally using a fake platform to create the illusion of profit. ChatGPT not only explained the scheme but also listed several key red flags—the platform's domain had been registered for a short time, customer service responses were templated, and withdrawal rules kept changing.
The irony in this case is that while the scammer may have used AI to generate their scripts, the victim also relied on AI to uncover the scam. Although the money couldn't be recovered, at least she stopped her losses in time and didn't sink any deeper.
Ultimately, technology itself is neutral. A knife can chop vegetables or hurt people; AI can help scammers with their packaging but can also help ordinary people avoid traps. Next time you're faced with a "guaranteed investment opportunity," maybe ask ChatGPT first—it's a lot more reliable than asking your friends.