Deepfake technology, built on generative AI tools such as Midjourney 5.1 and OpenAI’s DALL-E 2, can produce remarkably lifelike images and videos. However, the U.S. Federal Bureau of Investigation (FBI) has recently issued a warning that criminals are exploiting deepfakes to extort victims.
FBI warns the public about extortion scams
The FBI says it has received reports from victims, including minors and non-consenting adults, whose photos or videos were manipulated to create explicit content. In a public service announcement (PSA) released on Monday, the agency highlighted the growing prevalence of online extortion cases, particularly “sextortion scams” that employ deepfakes. Law enforcement agencies received over 7,000 reports of online extortion targeting minors in the previous year alone, with an increase in deepfake-related incidents since April.
Deepfakes are fabricated images, audio, or video generated with artificial intelligence, often making it difficult to tell whether the material is authentic. The rise of platforms like Midjourney 5.1 and OpenAI’s DALL-E 2 has made deepfake technology both more sophisticated and more accessible. Unfortunately, this has also enabled malicious actors to exploit the technology for illicit purposes.
The agency highlights risks associated with the technology
The FBI’s warning comes in the wake of several incidents that highlight the potential dangers of deepfakes. For instance, a deepfake video featuring Tesla and Twitter CEO Elon Musk was widely circulated to deceive cryptocurrency investors. The manipulated video utilized footage from Musk’s previous interviews, cleverly edited to fit the scam’s narrative.
It is essential to note that not all deepfakes are malicious. Some have gained attention for their humorous or creative elements, such as a viral AI-generated image depicting Pope Francis wearing a white Balenciaga jacket. AI-generated deepfakes have also been used to digitally recreate deceased individuals in various contexts.
In response to these threats, the FBI has provided recommendations to safeguard against deepfake-related extortion. It advises against paying any ransom, as payment does not guarantee that criminals will refrain from posting the deepfake content. The agency also emphasizes caution when sharing personal information and content online, suggesting the use of privacy features to limit access to accounts. Further recommended measures include monitoring children’s online activity, staying alert to unusual behavior from past acquaintances, and regularly searching online for personal and family members’ information.