AI and Crypto Scams: A Growing Threat
- Nominis Research Team
As AI and crypto continue their meteoric rise, they bring game-changing benefits to individuals, startups, and enterprises alike. AI-driven automation is streamlining financial transactions, optimizing trading strategies, and making compliance smarter than ever. Crypto is breaking barriers, enabling instant global payments, and fueling a new wave of decentralized finance. Together, they're revolutionizing the way we transact, invest, and build wealth in the digital era.
But where there’s innovation, scammers aren’t far behind—especially when AI makes deception more sophisticated than ever. Enter a new breed of fraudsters who harness AI-generated deepfakes, fake donation campaigns, and automated trading bots to manipulate the system, drain funds, and disappear without a trace. Whether it’s AI-generated phishing scams or hyper-realistic social media hype pumping worthless tokens, bad actors are evolving at an alarming rate.
AI-driven trading bots are designed to optimize cryptocurrency trading by analyzing market trends and executing trades. While these bots can enhance efficiency, they are also vulnerable to exploitation. Fraudulent AI trading platforms often promise guaranteed high returns, luring investors into Ponzi schemes. Regulators, such as the Commodity Futures Trading Commission (CFTC), have issued warnings about AI-powered trading scams that falsely claim to deliver consistent profits. A recent case involves AIXBT, an AI-driven crypto market analyst bot built on the Virtuals Protocol, which lost approximately $100,000 worth of ETH in what appears to be a scam or exploit. The attack, described as a "dashboard access hack," manipulated the bot into sending 55 ETH to the attacker through malicious replies. However, its core systems remain unaffected. In response, the team has paused its data dashboard, migrated keys, and upgraded security measures.
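For context, a legitimate (if naive) trading bot is often little more than an automated rule. The minimal moving-average crossover sketch below, with arbitrary window sizes, illustrates the kind of simple logic that fraudulent platforms routinely dress up as proprietary "AI" promising guaranteed returns:

```python
def sma(prices, n):
    """Simple moving average of the last n prices."""
    return sum(prices[-n:]) / n

def signal(prices, fast=3, slow=7):
    """Return 'buy' when the short-term average sits above the
    long-term one, 'sell' when below, else 'hold'.
    Window sizes are illustrative, not a recommendation."""
    if len(prices) < slow:
        return "hold"  # not enough history to compare averages
    f, s = sma(prices, fast), sma(prices, slow)
    if f > s:
        return "buy"
    if f < s:
        return "sell"
    return "hold"
```

No strategy this simple can deliver "consistent profits," which is exactly why claims of guaranteed returns are a red flag.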
Deepfake technology enables scammers to create hyper-realistic videos and audio recordings, impersonating influential figures to promote fraudulent crypto investments. A well-known case involved a deepfake of Elon Musk endorsing a fake crypto platform, convincing unsuspecting investors to transfer funds. AI is also enhancing phishing attacks and social engineering tactics, making scam messages more convincing by analyzing users’ online behavior and personalizing fraudulent communications.
The hype around AI has led to the rise of fake investment platforms claiming to leverage AI for extraordinary returns. These scams often feature fabricated testimonials and simulated trading results, deceiving investors into depositing funds that scammers ultimately siphon away.
In another example, a victim lost £76,000 to a deepfake scam on Facebook that used AI-generated footage of financial expert Martin Lewis and Elon Musk to promote a fake Bitcoin investment scheme.

Similarly, AI is being used in "pig butchering" scams, where fraudsters build long-term relationships with victims before luring them into fake crypto investments. AI automates these interactions, making the deception more believable and increasing victims' losses.
Scammers aren’t missing an opportunity to exploit crises, either. Using AI-generated content, they build fake donation campaigns with convincing narratives and realistic images to deceive donors into sending cryptocurrency to fraudulent wallets. McAfee's research highlights that scammers often set up counterfeit websites or social media posts masquerading as legitimate charities, diverting donations into their own accounts. The emotional urgency surrounding such catastrophes increases the likelihood of individuals falling victim to these attacks.
And the worst part? They’re not just after individual investors. Exchanges, DeFi platforms, and financial institutions are prime targets too—getting caught in the crossfire of AI-driven fraud, money laundering, and regulatory violations. Without airtight compliance, real-time alerts, threat detection and transaction monitoring, businesses risk losing more than just money—they’re gambling with their reputations, regulatory standing, and customer trust.
The solution? Fighting AI with AI. Advanced fraud detection, blockchain analytics, and automated investigations are now mission-critical for crypto businesses looking to stay ahead of cybercriminals. Governments are cracking down with stricter compliance rules, making it clear: if you’re not proactively securing your platform, you’re already behind.
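To make the idea of real-time transaction monitoring concrete, here is a toy rule-based screening sketch in Python. Every threshold, address, and rule below is hypothetical; production systems layer machine learning and blockchain analytics on top of rules like these:

```python
# Toy transaction-screening rules. All values are illustrative,
# not real compliance parameters.
BLACKLIST = {"0xscamwallet"}   # hypothetical known-bad addresses
LARGE_AMOUNT_ETH = 50.0        # hypothetical large-transfer threshold
VELOCITY_LIMIT = 5             # max transfers per sender in a batch

def screen(transactions):
    """Return (tx_id, reasons) pairs for transactions that trip a rule.

    Each transaction is a dict with 'id', 'from', 'to', and 'amount'.
    """
    alerts = []
    sent_count = {}
    for tx in transactions:
        reasons = []
        sent_count[tx["from"]] = sent_count.get(tx["from"], 0) + 1
        if tx["to"] in BLACKLIST:
            reasons.append("blacklisted destination")
        if tx["amount"] >= LARGE_AMOUNT_ETH:
            reasons.append("large transfer")
        if sent_count[tx["from"]] > VELOCITY_LIMIT:
            reasons.append("high velocity")
        if reasons:
            alerts.append((tx["id"], reasons))
    return alerts
```

Static rules like these are only a starting point; the "fighting AI with AI" approach adds models that learn new fraud patterns as scammers adapt.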
Welcome to the AI-powered battlefield of crypto. The stakes are high, but with the right defense, businesses can outmaneuver scammers, safeguard users, and build a more secure digital financial ecosystem.
AI & Crypto Scams: FAQs
Q: What is an AI-powered pump-and-dump scam?
In an AI-driven pump-and-dump, fraudsters use artificial intelligence to generate hype around a cryptocurrency. They might create deepfake videos of influencers promoting a token, automate fake social media engagement, or deploy AI-written news articles to drive up interest. Once the price is artificially inflated, scammers sell their holdings, causing the token's value to crash and leaving unsuspecting investors with losses. Be cautious of projects that seem to explode in popularity overnight.
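One crude heuristic for spotting the "pump" phase is to flag price moves far outside a token's recent volatility. The sketch below, with an arbitrary window size and z-score threshold, illustrates the idea; it is a simplification, and real manipulation detection also weighs volume, order flow, and social signals:

```python
import statistics

def spike_indices(prices, window=10, z_threshold=4.0):
    """Return indices where the price deviates from the trailing
    window's mean by more than z_threshold standard deviations.
    Window and threshold are illustrative defaults."""
    flagged = []
    for i in range(window, len(prices)):
        recent = prices[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent)
        if stdev > 0 and abs(prices[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged
```

A token trading flat around 1.0 that suddenly prints 5.0 would be flagged immediately; the same sudden, hype-driven jump is what investors should treat as a warning sign rather than an opportunity.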
Q: How can I protect myself from AI-driven crypto scams?
Be skeptical of any platform promising guaranteed returns, verify celebrity endorsements through official channels before trusting them, research investment platforms independently rather than relying on social media ads, and never send cryptocurrency to wallets promoted in unsolicited messages.

While we strive for accuracy in our content, we acknowledge that errors may occur. If you find any mistakes, please reach out to us at contact@nominis.io. Your feedback is appreciated!