
Cybersecurity firm Cybertrace has issued a stark warning about a highly convincing deepfake video of Australian mining magnate Andrew "Twiggy" Forrest. The video, which has surfaced on social media platforms in recent weeks, shows Forrest promoting a fake cryptocurrency trading platform that promises high returns to unsuspecting investors.
The fraudulent video, which initially appeared on Facebook before spreading to other platforms, encourages users to sign up for a deceptive platform claiming to generate substantial daily profits for "ordinary people." The victims are directed to a website called "Quantum AI," a name that Cybertrace says has become increasingly intertwined with scams and financial fraud across multiple jurisdictions. The platform falsely presents itself as an advanced AI-powered trading system capable of predicting market movements with unprecedented accuracy.
In a detailed analysis, Cybertrace CEO Dan Halpin warned that the deepfake could easily deceive viewers, as the scammers behind it appear to possess sophisticated sales and marketing expertise. He also noted that the video's considerable length and strategically repetitive structure enhance its persuasiveness and psychological impact.
"The video is long and in many ways repetitive, which can be quite convincing, and appears to have been created by someone with knowledge of sales and marketing," Halpin said.
The deepfake manipulates Forrest's behavior and body language with remarkable precision, drawing on footage from a "fireside chat" hosted by the Rhodes Trust in October of the previous year. Cybertrace detected the video in late January; it shows an AI-altered version of Forrest endorsing fictitious cryptocurrency trading software with persuasive language and his familiar mannerisms.
In the manipulated video, the altered version of Forrest promises viewers the opportunity to join him and his team as partners in what he describes as the world's most intelligent stock and cryptocurrency trading software, guaranteeing substantial profits regardless of market conditions. This promise of consistent returns, regardless of market volatility, is a classic hallmark of investment fraud schemes.
Forrest, a former CEO of Western Australian mining firm Fortescue Metals Group, is a highly successful entrepreneur with a net worth of $29.4 billion, making him one of Australia's most recognizable business figures. The scammers deliberately chose such a prominent and trusted figure to lend credibility to their fraudulent scheme, exploiting his reputation for business acumen and success.
The deepfake video concludes with Forrest urging viewers to sign up for the platform before it's too late, adding an element of urgency and scarcity to the scam—a common psychological manipulation tactic used to pressure victims into making hasty decisions without proper due diligence.
Cybertrace has cautioned users to exercise extra vigilance due to the recent surge in deepfake fraud incidents targeting high-profile individuals across various industries. The rise of accessible AI technology has made it increasingly easy for scammers to create convincing fake videos that can deceive even tech-savvy individuals.
In addition to Forrest, other notable Australian individuals such as Gina Rinehart, the country's richest person, entrepreneur Dick Smith, and TV host Allison Langdon have also been targeted by scammers using deepfake videos, as highlighted by Cybertrace in their recent security advisories. These cases demonstrate a concerning trend where scammers systematically target trusted public figures to promote various fraudulent schemes, particularly in the cryptocurrency and investment sectors.
The problem extends beyond Australia's borders. As reported in recent months, Singapore's prime minister, Lee Hsien Loong, has also warned his social media followers about deepfake videos that use his voice and image to promote cryptocurrency scams. Lee even shared an example video of himself being interviewed, created by scammers to endorse a fraudulent "hands-free crypto trading" scheme, demonstrating the global nature of this threat.
"The use of deepfake technology to spread disinformation will continue to grow," Lee said in his public warning.
"We must remain vigilant and learn to protect ourselves and our loved ones against such scams."
Scammers have been employing various methods to deceive individuals and steal their fiat currency or digital tokens since the inception of cryptocurrencies. The evolution of these tactics reflects the increasing sophistication of cybercriminals and their ability to adapt to new technologies.
In 2020, hackers compromised the Twitter accounts of prominent figures, including former United States President Barack Obama and then-presidential candidate Joe Biden, to promote a Bitcoin scam, demonstrating that even the most secure accounts can be vulnerable to determined attackers. The incident highlighted the importance of multi-layered security measures and the need for constant vigilance in the digital age.
Security experts recommend several protective measures: verifying investment opportunities through official channels, being skeptical of promises of guaranteed returns, checking for official statements from the individuals supposedly endorsing products, and reporting suspicious content to platform administrators and law enforcement authorities. As deepfake technology becomes more sophisticated and accessible, the responsibility to remain informed and cautious falls increasingly on individual users and their networks.
What is a deepfake?
A deepfake is synthetic media created using artificial intelligence and deep learning algorithms, particularly generative adversarial networks (GANs). The technique replaces or manipulates faces in video to create highly realistic but fake footage, often used deceptively in scams and misinformation campaigns.
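The adversarial training behind GANs can be illustrated with a toy sketch: a tiny one-parameter-pair "generator" learns to shift random noise toward a target distribution while a logistic "discriminator" tries to tell generated samples from real ones. This is a minimal, illustrative example of the adversarial loop, not a face-swapping model; all hyperparameters and the target distribution are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: a 1-D Gaussian the generator must learn to imitate.
def real_batch(n):
    return rng.normal(4.0, 1.25, size=(n, 1))

# Affine "networks": one weight and one bias each.
g_w, g_b = 1.0, 0.0   # generator: z -> g_w*z + g_b
d_w, d_b = 1.0, 0.0   # discriminator: x -> sigmoid(d_w*x + d_b)

lr, batch = 0.05, 64
for _ in range(1000):
    z = rng.normal(size=(batch, 1))
    fake = g_w * z + g_b
    real = real_batch(batch)

    # Discriminator step: logistic-regression gradients,
    # labeling real samples 1 and generated samples 0.
    d_real = sigmoid(d_w * real + d_b)
    d_fake = sigmoid(d_w * fake + d_b)
    d_w -= lr * (np.mean((d_real - 1) * real) + np.mean(d_fake * fake))
    d_b -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator step: move fakes toward the region D classifies as real.
    d_fake = sigmoid(d_w * fake + d_b)
    g_w -= lr * np.mean((d_fake - 1) * d_w * z)
    g_b -= lr * np.mean((d_fake - 1) * d_w)

print("generated sample mean:", float(np.mean(g_w * rng.normal(size=(10000, 1)) + g_b)))
```

Real deepfake generators are deep convolutional networks operating on images rather than affine maps on scalars, but the push-and-pull between the two models is the same: the generator improves precisely because the discriminator keeps finding flaws.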
How can you spot a deepfake video?
Observe facial details carefully for unnatural movements or inconsistencies in lip-sync, and check whether mouth movements match the audio precisely. Look for flickering eyes, unusual skin texture, or blinking anomalies. Use AI detection tools designed to identify deepfakes, and verify through official channels before trusting the content's authenticity.
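The blinking-anomaly check mentioned above can be sketched as a simple heuristic: given a per-frame eye-openness score (such as the eye aspect ratio produced by a facial-landmark detector, assumed here as upstream input), count blinks and flag clips whose blink rate is implausibly low. The threshold values below are illustrative assumptions, not calibrated detector settings.

```python
def count_blinks(ear_series, closed_thresh=0.21, min_closed_frames=2):
    """Count blinks in a per-frame eye-aspect-ratio (EAR) series.

    A blink is a run of at least `min_closed_frames` consecutive
    frames whose EAR falls below `closed_thresh` (eyes closed).
    """
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < closed_thresh:
            run += 1
        else:
            if run >= min_closed_frames:
                blinks += 1
            run = 0
    if run >= min_closed_frames:  # blink in progress at clip end
        blinks += 1
    return blinks

def blink_rate_suspicious(ear_series, fps, min_blinks_per_min=8.0):
    """Flag a clip whose subject blinks far less than a typical person."""
    minutes = len(ear_series) / fps / 60.0
    if minutes == 0:
        return False
    return count_blinks(ear_series) / minutes < min_blinks_per_min
```

For example, a one-minute clip at 30 fps in which the eyes never close would be flagged, while footage with a normal blink cadence would pass. A low blink rate is only one signal among many and should be combined with the other checks listed above.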
How do deepfake scams work?
Deepfake scams impersonate executives, create urgency to transfer funds, and exploit trust. Perpetrators use AI to synthesize realistic video and audio, then pose as CEOs or other high-ranking officials. They manufacture time pressure through fake emergency scenarios, target lower-level employees, and bypass verification procedures to steal money before detection.
How can you verify a celebrity-endorsed investment?
Verify the celebrity's official accounts directly, check official company websites independently, research the investment through multiple credible sources, and be cautious of unsolicited celebrity endorsements. Deepfakes are increasingly sophisticated, so always confirm through official channels before investing.
What should you do if you fall victim to a deepfake scam?
Report it to police immediately with detailed evidence and contact your financial institutions to freeze accounts. Provide information to any payment platforms involved, cooperate fully with the law enforcement investigation, and document all fraudulent communications and transactions for legal proceedings.
What are the legal consequences of deepfake fraud?
Deepfake fraud is a serious crime carrying severe penalties, including substantial fines and lengthy imprisonment. Laws in many jurisdictions strictly prohibit using the technology for deception, and violators face criminal prosecution and liability.
How can you protect yourself from deepfake fraud?
Verify identities through multiple channels before any financial transaction. Watch for unnatural eye movements, blurred facial features, or inconsistent audio quality in videos. Request live video calls with additional verification steps, never share personal or financial information based solely on video content, and use official contact information to confirm requests independently.
Why do deepfake videos threaten financial security?
Deepfake videos create highly realistic fraudulent content that misleads investors into scams. The technology undermines trust in financial transactions and enables criminals to impersonate legitimate figures, causing significant monetary losses and market instability.