
Cybersecurity firm Cybertrace has issued a stark warning regarding a highly convincing deepfake video featuring Australian mining magnate and businessman Andrew "Twiggy" Forrest. This case represents a significant escalation in the sophistication of cryptocurrency-related fraud, demonstrating how artificial intelligence technology can be weaponized to exploit public trust in prominent figures.
The deepfake video, which has surfaced on social media platforms, shows Forrest promoting a fake cryptocurrency trading platform that promises high returns. Deepfake technology uses advanced machine learning algorithms to manipulate video and audio content, creating realistic but entirely fabricated footage that can be nearly indistinguishable from authentic recordings. In this instance, scammers leveraged Forrest's reputation and credibility to lend legitimacy to their fraudulent scheme.
The video, which initially appeared on Facebook, encourages users to sign up for a fraudulent platform claiming to generate substantial daily profits for "ordinary people." The psychological manipulation is evident in the messaging, which targets individuals seeking financial opportunities by promising accessible wealth generation. The victims are directed to a website called "Quantum AI," a name that Cybertrace says is intertwined with scams and financial fraud. The use of technical-sounding terminology like "Quantum AI" is a deliberate tactic to create an illusion of legitimacy and cutting-edge technology.
Cybertrace CEO Dan Halpin warned that the deepfake video could plausibly deceive viewers, noting that the scammers behind it appear to have genuine sales expertise. This observation highlights an important evolution in cryptocurrency fraud: perpetrators are no longer relying solely on technical skills but are combining technological sophistication with professional marketing techniques to maximize their reach and effectiveness.
Halpin also pointed to the video's length and repetition, which make it more persuasive:
"The video is long and in many ways repetitive, which can be quite convincing, and appears to have been created by someone with knowledge of sales and marketing."
The repetitive structure serves a psychological purpose, reinforcing key messages and creating a sense of familiarity that can lower viewers' critical defenses. This technique is commonly used in legitimate advertising but becomes particularly dangerous when applied to fraudulent schemes.
The deepfake manipulates Forrest's behavior and body language, utilizing footage from a "fireside chat" conducted by Rhodes Trust in October. By sourcing material from a credible event, the scammers were able to create a realistic foundation for their fabricated content. The technical execution demonstrates access to sophisticated AI tools capable of analyzing and replicating subtle facial expressions, vocal patterns, and mannerisms.
Cybertrace detected the deepfake video on January 27. The clip shows an AI-altered version of Forrest endorsing a fictitious cryptocurrency trading software, promising viewers the opportunity to join him and his team as partners in the world's most intelligent stock and cryptocurrency trading software and guaranteeing substantial profits regardless of market conditions. Such guarantees are classic red flags in investment fraud, as legitimate financial opportunities always carry risk and cannot promise consistent returns.
Forrest, a former CEO of Western Australian mining firm Fortescue Metals Group, is a highly successful entrepreneur with a net worth of $29.4 billion. His prominence and reputation make him an ideal target for scammers seeking to exploit his credibility. The choice of such a high-profile figure also ensures wider distribution of the fraudulent content, as news of the scam itself generates publicity.
The deepfake video concludes with Forrest urging viewers to sign up for the platform before it's too late, adding an element of urgency to the scam. This artificial scarcity tactic is designed to pressure potential victims into making hasty decisions without conducting proper due diligence.
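The guarantee-of-profits and urgency tactics described above are formulaic enough that even a naive text-screening heuristic can flag them in a pitch. The sketch below is purely illustrative; the phrase list and labels are my own assumptions, not a Cybertrace tool or an exhaustive detector:

```python
import re

# Hypothetical red-flag patterns drawn from classic investment-fraud tactics:
# guaranteed returns, no-risk claims, urgency, and scarcity. Illustrative only.
RED_FLAGS = {
    r"guarantee\w*\s+(substantial\s+)?(profits?|returns?)": "guaranteed returns",
    r"regardless of market conditions": "no-risk claim",
    r"before it'?s too late": "artificial urgency",
    r"limited (spots|time|availability)": "artificial scarcity",
}

def scan_pitch(text: str) -> list[str]:
    """Return the labels of red-flag patterns found in a pitch text."""
    lowered = text.lower()
    return [label for pattern, label in RED_FLAGS.items()
            if re.search(pattern, lowered)]

pitch = ("Join my team and use the world's most intelligent trading software, "
         "guaranteeing substantial profits regardless of market conditions. "
         "Sign up before it's too late!")
print(scan_pitch(pitch))
# → ['guaranteed returns', 'no-risk claim', 'artificial urgency']
```

A real screening system would need far more robust natural-language analysis, but the point stands: the language of these scams is highly stereotyped, which is itself a warning sign.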
Cybertrace has cautioned users to exercise extra vigilance due to the recent surge in deepfake fraud. The accessibility of AI tools and the declining cost of creating convincing deepfakes have contributed to a dramatic increase in this form of financial crime. As the technology becomes more democratized, the barrier to entry for would-be scammers keeps falling, fueling a proliferation of sophisticated fraud attempts.
In addition to Forrest, other notable Australian individuals such as Gina Rinehart, the country's richest person, entrepreneur Dick Smith, and TV host Allison Langdon have also been targeted by scammers using deepfake videos, as highlighted by Cybertrace. This pattern suggests organized efforts by fraud networks to systematically exploit trusted public figures across different sectors and demographics.
Singapore's Prime Minister Lee Hsien Loong has similarly warned his social media followers about deepfake videos that use his voice and image to promote cryptocurrency scams. The international nature of these incidents demonstrates that deepfake fraud is a global phenomenon requiring coordinated responses from governments, technology platforms, and cybersecurity organizations.
At the time, Lee even shared an example video of himself being interviewed, created by scammers to endorse a fraudulent "hands-free crypto trading" scheme. By publicly acknowledging and exposing these scams, public figures can help educate their audiences and reduce the effectiveness of such attacks.
"The use of deepfake technology to spread disinformation will continue to grow," said Loong.
"We must remain vigilant and learn to protect ourselves and our loved ones against such scams."
This warning underscores the importance of digital literacy and critical thinking in the modern information environment. As AI-generated content becomes increasingly sophisticated, individuals must develop skills to identify potential manipulation and verify information before taking action.
Scammers have been employing various methods to deceive individuals and steal their fiat currency or tokens since the inception of cryptocurrencies. The evolution from simple phishing emails to sophisticated deepfake videos represents a concerning trend in the creativity and technical capability of fraud perpetrators.
In recent years, high-profile social media breaches have demonstrated the scale of potential cryptocurrency fraud. For example, in 2020, hackers compromised the accounts of prominent Twitter users, including former United States President Barack Obama and then-presidential candidate Joe Biden, to promote a Bitcoin scam. These incidents highlight vulnerabilities in digital platforms and the need for enhanced security measures.
To protect against deepfake fraud and cryptocurrency scams, individuals should:
- Verify any celebrity investment endorsement through the person's official channels before acting on it.
- Treat guaranteed returns, especially promises of profit "regardless of market conditions," as a red flag.
- Be wary of urgency tactics that pressure quick sign-ups.
- Inspect videos for deepfake artifacts such as audio-video sync issues, unnatural facial movements, and odd blinking patterns.
- Never share private keys or transfer funds on the basis of a video endorsement alone.
- Report suspicious content to the hosting platform and to local authorities.
As deepfake technology continues to advance, collaboration between technology companies, regulators, and cybersecurity experts will be essential to develop effective detection tools and legal frameworks to combat this emerging threat.
What are deepfake crypto scams?
Deepfake videos use AI to create convincing fake footage of real people. Scammers deploy these videos to impersonate celebrities or financial experts, promoting fraudulent investment schemes. Victims transfer funds believing they are following legitimate advice, losing money to criminals exploiting deepfake technology for financial deception.

How can you spot a deepfake scam video?
Verify claims directly through official channels. Check for audio-video sync issues, unnatural facial movements, and irregular blinking patterns. Use reverse image search. Consult blockchain security experts. Never click suspicious links or hand over private keys based on video content alone.

What happened in the Andrew Forrest case?
Scammers created convincing deepfake videos impersonating the Australian billionaire to promote fake investment schemes. Victims were deceived into transferring funds to fraudulent platforms, suffering significant financial losses through this social engineering attack targeting crypto investors.

What should victims do?
Report the fraud to local authorities and financial regulators immediately. Document all evidence, including the video, transaction details, and communications. Contact your bank to attempt a transaction reversal. File complaints with relevant cybercrime units. Report the fraudulent content to the social media platforms hosting it. Consult a lawyer about legal recovery options. Act quickly, as prompt action increases the chances of recovery.

What are the legal consequences for perpetrators?
Deepfake-based financial fraud typically results in criminal charges, including wire fraud, identity theft, and securities fraud, carrying prison sentences of up to 20 years and substantial fines. Civil liability, asset recovery orders, and regulatory penalties from financial authorities are also common consequences.