Recently, scammers used deepfake technology to mimic Elon Musk’s voice and mannerisms and deceive people into investing in a crypto scam.
Scammers use artificial intelligence and machine learning to generate the voices and mannerisms of prominent individuals and use them to promote crypto scams.
Organizations should educate their staff on detecting and handling deepfake incidents.
Advances in technology have brought greater risks than ever before. For example, some malicious actors use artificial intelligence and related technologies to fake the voices and actions of prominent people, misleading their audiences and stealing their digital assets. In particular, these bad actors spread fake news and deepfake videos to mislead and scam investors. This article looks at how some of these actors deepfaked Elon Musk on YouTube during a SpaceX launch.
Read also: Elon Musk Is Accused Of Manipulating Dogecoin
On 6 June, scammers flooded YouTube with deepfake videos of Elon Musk in an attempt to strip crypto investors of their digital assets during the SpaceX launch. During the Starship rocket launch event, the scammers livestreamed an AI-generated version of Musk’s voice on more than 35 YouTube channels. Through the livestreams they mimicked Musk’s voice and mannerisms to lure investors to a crypto project that promised to double their invested amounts.
The footage made it appear as though Musk was speaking outdoors at a SpaceX facility. However, it contained blurred sections designed to conceal the reality of the situation. The fraudulent streams, which coincided with SpaceX’s fourth Starship test flight, showed the rocket up to the point where it re-entered the atmosphere. They also showed the moment when the rocket plunged into the Indian Ocean, as planned.
Read also: Inion Fraud Schemes and How to Prevent Them
The impersonators incorporated various elements to make the audience believe that it was Elon Musk who was talking to them. According to a Cointelegraph report, one scam livestream on a YouTube channel masquerading as SpaceX’s official account had more than 170,000 viewers tuned in. Some experts believe that most of the purported viewers were bots deployed to make the stream appear legitimate. That apparent popularity was designed to lure crypto investors into what was presented as Musk’s investment project.
The AI-generated voice, paired with Musk’s likeness, urged the audience to send crypto assets such as Bitcoin or Ether to the displayed wallet addresses. The creators of the deepfake went to great lengths to convince people that it was in fact Musk speaking. For example, they included his usual mannerisms and features of his natural speech, such as stammering and pauses. That was evident when the voice said, “This is not a fake, this is a real giveaway. I personally guarantee it to you.” There was also a QR code that viewers could scan to make their crypto deposits.
The voice added, “You have a chance to see your crypto propel exponentially as our rocket propels toward the stars.” That statement was followed by an applauding crowd, whose voices were also faked. At the moment, there is no accurate figure for the number of people scammed through the YouTube streams. However, there is evidence that some people did send cryptocurrency to the listed addresses. The following screenshot shows an individual who sent crypto assets worth $30 to one of the scammers’ wallets.
Source: x.com
The video streams were later taken down, and the channel was rebranded to imitate the Cardano Foundation. Mysk, a research group, was the first to alert the crypto community to the scam, as the next image indicates.
Source: x.com
No doubt, Mysk helped the crypto community realize the danger it was in at that time. Such online security alerts are essential to preserving the integrity of the digital asset sector. However, this is not the first time Musk’s likeness has been used in such a scam. Scammers target the billionaire because of his open support for cryptocurrencies and his endorsement of projects such as Dogecoin and Shiba Inu. For example, in April, Talal Haj Bakry and Tommy Mysk identified another fake SpaceX YouTube account pushing a similar message, inviting crypto investors to put money into a double-your-money scheme.
First, it is important to note that deepfakes use powerful artificial intelligence and machine learning techniques to manipulate audio and visual content and deceive their targets. Therefore, the best way to counter such schemes is to use similar technologies to detect deepfakes and mitigate the cybersecurity threats they pose.
In other words, organizations, including digital firms, should use machine learning and artificial intelligence to authenticate visual and audio content they consider suspicious. They should invest in machine learning solutions that automate liveness detection and verification, because what the human eye may miss, machine learning-based systems can often detect. For illustration, the snippet below sketches how such automated screening might be wired up.
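The following is a minimal Python sketch, not a definitive implementation: it samples frames from a suspicious video and aggregates per-frame scores from a deepfake classifier. The `fake_probability` function is a hypothetical placeholder for whatever trained model an organization adopts; only the frame-sampling and score-aggregation logic is shown here.

```python
# Minimal sketch: flag a suspicious video by sampling frames and scoring each
# one with a "real vs. synthetic" classifier. The classifier itself is a
# placeholder assumption, not a specific product or library recommendation.
import cv2          # pip install opencv-python
import numpy as np


def sample_frames(video_path: str, every_n: int = 30) -> list[np.ndarray]:
    """Grab every Nth frame so the classifier sees the whole clip cheaply."""
    frames, capture = [], cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n == 0:
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        index += 1
    capture.release()
    return frames


def fake_probability(frame: np.ndarray) -> float:
    """Placeholder for a trained deepfake detector (e.g. a CNN fine-tuned on
    face-forgery data). Should return the probability the frame is synthetic."""
    raise NotImplementedError("plug in your organization's trained model here")


def is_likely_deepfake(video_path: str, threshold: float = 0.7) -> bool:
    """Flag the video if the average per-frame fake score exceeds the threshold."""
    scores = [fake_probability(f) for f in sample_frames(video_path)]
    return bool(scores) and float(np.mean(scores)) > threshold
```

In practice, a production pipeline would also analyze the audio track, stream metadata, and account history rather than relying on frame-level checks alone.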
Second, organizations should put in place measures to handle emerging cybersecurity risks related to deepfakes and impersonation. For example, they should include deepfake awareness in their cybersecurity education programs and train staff at all levels to identify deepfaked content and respond appropriately.
Lastly, organizations operating in the digital sector should establish channels for communicating with one another when deepfake incidents occur. They should also work alongside law enforcement agencies to handle such cases.
Deepfake cases are increasing across the digital industry, especially in the cryptocurrency sector. Recently, crypto scammers impersonated Elon Musk to hoodwink investors into putting their funds in a cryptocurrency scam. Fortunately, researchers such as Mysk were able to detect the threat. Going forward, organizations in the digital sector should educate their staff on how to detect and deal with deepfake cases.