How ‘Deepfake Musk’ became the internet’s biggest scammer
All Steve Beauchamp wanted was money for his family. And he thought Elon Musk could help.
Beauchamp, an 82-year-old retiree, saw a video late last year of Musk endorsing a radical investment opportunity that promised rapid returns. He contacted the company behind the pitch and opened an account for $248. Through a series of transactions over several weeks, Beauchamp drained his retirement account, ultimately investing more than $690,000.
Then the money vanished — lost to digital scammers on the forefront of a new criminal enterprise powered by artificial intelligence.
The scammers had edited a genuine interview with Musk, replacing his voice with a replica using AI tools. The AI was sophisticated enough that it could alter minute mouth movements to match the new script they had written for the digital fake. To a casual viewer, the manipulation might have been imperceptible.
“I mean, the picture of him — it was him,” Beauchamp said about the video he saw of Musk. “Now, whether it was AI making him say the things that he was saying, I really don’t know. But as far as the picture, if somebody had said, ‘Pick him out of a lineup,’ that’s him.”
Thousands of these AI-driven videos, known as deepfakes, have flooded the internet in recent months, featuring phony versions of Musk that have deceived scores of would-be investors. AI-powered deepfakes are expected to contribute to billions of dollars in fraud losses each year, according to estimates from Deloitte.
The videos cost just a few dollars to produce and can be made in minutes. They are promoted on social media, including in paid ads on Facebook, magnifying their reach.
“It’s probably the biggest deepfake-driven scam ever,” said Francesco Cavalli, the cofounder and chief of threat intelligence at Sensity, a company that monitors and detects deepfakes. Musk was by far the most common spokesperson in the videos, according to Sensity, which analysed more than 2,000 deepfakes. He was featured in nearly a quarter of all deepfake scams since late last year, Sensity found. Among those focused on cryptocurrencies, he was featured in nearly 90% of the videos.
The deepfake ads also featured Warren Buffett, the prominent investor, and Jeff Bezos, the founder of Amazon, among others.
It is difficult to quantify exactly how many deepfakes are floating online, but a search of Facebook’s ad library for commonly used language that advertised the scams uncovered hundreds of thousands of ads, many of which included the deepfake videos.
YouTube was also flooded with the fakes, often using a label that suggests the video is “live.” In fact, the videos are prerecorded deepfakes.
After former President Donald J Trump spoke at a Bitcoin conference Saturday, YouTube hosted dozens of videos using the “live” label that showed a prerecorded deepfake version of Elon Musk saying he would personally double any cryptocurrency sent to his account. Some of the videos had hundreds of thousands of viewers, though YouTube said scammers can use bots to artificially inflate the number. One Texan said he lost $36,000 worth of Bitcoin after seeing an “impersonation” of Musk speaking on a so-called live YouTube video in February 2023, according to a report filed with the Better Business Bureau, the nonprofit consumer advocacy group.
“I send my bitcoin, and never got anything back,” the person wrote.
YouTube said that it had removed more than 15.7 million channels and over 8.2 million videos for violating its guidelines from January to March of this year, with most of those violating its policies against spam. The prevalence of the phony ads prompted Andrew Forrest, an Australian billionaire whose videos were also used to create deepfake ads on Facebook, to file a civil lawsuit against Meta for negligence in how its ad business is run.
Meta, which owns Facebook, said the company was training automated detection systems to catch fraud on its platform, but also described a cat-and-mouse game where well-funded scammers constantly shifted their tactics to evade detection. YouTube pointed to its policies prohibiting scams and manipulated videos. The company in March made it a requirement that creators disclose when they use AI to create realistic content.
Scammers often target older internet users who may be familiar with cryptocurrency, AI or Musk, but unfamiliar with the safest ways to invest.