AI-Powered Deepfakes Drove 40% of High-Value Crypto Fraud in 2024, Bitget Report Finds

A recent Bitget report reveals that a disturbing 40% of high-value cryptocurrency frauds in 2024 used AI-powered deepfakes, a significant escalation in the sophistication of cyber scams, which now include eerily accurate impersonations of figures such as Elon Musk. The surge in deepfake use challenges cybersecurity teams and complicates efforts to secure transactions, underscoring the urgent need for adaptive fraud detection systems in the digital finance sector.

Nathan Mercer

June 16, 2025

According to a recent Bitget report, a staggering 40% of high-value cryptocurrency frauds in 2024 were driven by artificial intelligence-powered deepfakes, signaling not just an advance in the tools available to scammers but a worrying escalation in the sophistication of their schemes. The finding underscores a disturbing trend in which cutting-edge technology blends with old-fashioned grift to create a formidable cybersecurity challenge. See the detailed analysis on Decrypt.

The report, which also highlights a 24% increase in overall crypto scam losses to $4.6 billion, pinpoints how deepfake technology has become a go-to method for high-stakes fraudsters. These aren't the grainy, easily spotted fakes of yesteryear. Today's AI algorithms can mimic voices and visuals with unnerving accuracy, and Elon Musk is a 'popular' choice for impersonations that trick people into fraudulent investment schemes or non-existent giveaways. The misuse isn't limited to fake endorsements or financial scams. More alarmingly, these capabilities extend to subverting security measures such as Know-Your-Customer (KYC) protocols and even to orchestrating complex phishing operations via platforms like Zoom.
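To make the KYC point concrete, the sketch below shows one way a verification flow can avoid trusting any single signal that a deepfake might defeat, by combining liveness, document-match, and device signals before approving an applicant. It is a minimal illustration only: the field names, scores, and thresholds are assumptions made for this example, not drawn from the Bitget report or from any particular vendor's API.

```python
from dataclasses import dataclass

# Hypothetical KYC verification signals. In practice these scores would come
# from upstream liveness-detection and document-matching services; the field
# names and thresholds below are illustrative assumptions, not a real API.
@dataclass
class KycSubmission:
    liveness_score: float      # 0.0-1.0, confidence the selfie video shows a live person
    face_match_score: float    # 0.0-1.0, similarity between selfie and ID document photo
    replay_artifacts: bool     # True if frame-level analysis flags re-encoding or splicing
    device_reuse_count: int    # how many prior applications came from this device

def review_kyc(sub: KycSubmission) -> str:
    """Combine independent signals so a convincing deepfake video alone
    cannot pass verification. Returns 'approve', 'manual_review', or 'reject'."""
    if sub.replay_artifacts or sub.liveness_score < 0.5:
        return "reject"
    # A strong face match paired with a borderline liveness score is exactly the
    # pattern a deepfake impersonation tends to produce, so escalate, don't approve.
    if sub.liveness_score < 0.8 or sub.device_reuse_count > 2:
        return "manual_review"
    if sub.face_match_score < 0.9:
        return "manual_review"
    return "approve"

print(review_kyc(KycSubmission(0.75, 0.95, False, 0)))  # -> manual_review
```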

Beyond the technical prowess of deepfakes, the Bitget report also calls attention to the enduring problem of social engineering and the resurgence of Ponzi schemes, albeit in modern digital guises involving DeFi, NFTs, and GameFi. These scams exploit human psychology and trust, leveraging shiny new tech terminology to lure investors into what are essentially repackaged schemes of robbing Peter to pay Paul. In some instances, scammers use deepfakes to add a layer of legitimacy to these already deceitful practices by forging celebrity endorsements that can convince even the savviest investors.

What this means for the industry is pretty stark: the battle against crypto fraud is not just against rogue code or a phishing email. It's against highly sophisticated operations that blend technology and psychology to exploit trust on a global scale. For those invested in the security and integrity of digital finance, the challenge is not just to build better defenses but also to foster a keener awareness among users. The old wisdom of 'seeing is believing' no longer holds up in this digital age, where seeing might just be deceiving.

For entities handling large volumes of transactions, such as crypto exchanges or online marketplaces, integrating robust fraud detection systems that can adapt to the evolving tactics of scammers is crucial. Companies like Radom offer on- and off-ramping solutions that not only streamline conversion between crypto and fiat but also build in the security measures needed to thwart these advanced fraudulent schemes.
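As a rough illustration of what "adaptive" can look like in practice, the sketch below scores withdrawals against each account's own rolling history instead of a fixed limit, holding sudden outliers sent to never-seen destinations for extra verification. Every window size, threshold, and field name here is an assumption made for the example; it does not describe Radom's systems or any exchange's production logic.

```python
from collections import defaultdict, deque
from statistics import mean, pstdev

class AdaptiveWithdrawalMonitor:
    """Flag withdrawals that deviate sharply from an account's recent behavior.
    Window size, z-score cutoff, and the cold-start limit are illustrative."""

    def __init__(self, window: int = 50, z_cutoff: float = 3.0):
        self.history = defaultdict(lambda: deque(maxlen=window))  # per-account amounts
        self.z_cutoff = z_cutoff

    def score(self, account_id: str, amount: float, new_destination: bool) -> str:
        past = self.history[account_id]
        verdict = "allow"
        if len(past) >= 10:
            mu, sigma = mean(past), pstdev(past) or 1.0
            z = (amount - mu) / sigma
            # A large jump sent to a never-seen address is the classic scam pattern:
            # hold it for extra verification rather than blocking outright.
            if z > self.z_cutoff and new_destination:
                verdict = "hold_for_verification"
            elif z > self.z_cutoff:
                verdict = "step_up_auth"
        elif new_destination and amount > 10_000:  # cold-start rule, arbitrary limit
            verdict = "step_up_auth"
        past.append(amount)  # the baseline keeps adapting as behavior changes
        return verdict

monitor = AdaptiveWithdrawalMonitor()
for amt in [200, 150, 300, 250, 180, 220, 190, 210, 240, 260]:
    monitor.score("acct-1", amt, new_destination=False)
print(monitor.score("acct-1", 25_000, new_destination=True))  # -> hold_for_verification
```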

As the technology evolves, so too must our strategies to counteract its misuse. It's a cat-and-mouse game where the stakes are incredibly high, both in terms of financial assets and the trust of the global digital community.

Sign up to Radom to get started