The Federal Bureau of Investigation has issued a stark warning about the growing threat of “deepfakes” being used in cyber extortion.
In a recent report, the FBI said that malicious actors are using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, to create sexually themed images that appear authentic.
The actors then circulate this content on social media or pornographic websites to run sextortion schemes or to harass the victim.
The FBI noted that improvements in the quality, customizability, and accessibility of artificial intelligence-enabled image generators have fueled the growth of deepfakes.
The bureau said it has received reports from victims, including minors, whose photos or videos were altered to create explicit content that was then publicly circulated.
Many victims were unaware their images had been copied, manipulated, and circulated until someone brought the content to their attention or they stumbled across it online.
Once the manipulated content is circulated, victims face significant challenges in preventing its continual sharing or removal from the internet.
“Malicious actors have used manipulated photos or videos with the purpose of extorting victims for ransom or to gain compliance for other demands (e.g., sending nude photos),” the FBI said.
The federal agency recommended that people exercise caution when posting or direct messaging personal photos, videos, and identifying information on social media, dating apps, and other online sites.
People should use particular discretion when posting images, videos, and personal content that include children or their information, as such material can be captured, manipulated, and distributed by malicious actors without their knowledge or consent.
The FBI's other recommendations include:
- Apply privacy settings on social media accounts.
- Run frequent online searches for your personal information and use reverse image search engines.
- Exercise caution when accepting friend requests or communicating with unknown or unfamiliar individuals.
- Secure online accounts with complex passwords and multi-factor authentication.
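The password advice can be put into practice with a short script. Below is a minimal sketch using Python's standard-library `secrets` module, which draws from the operating system's cryptographically secure random source; the function name and default length are illustrative choices, not part of the FBI's guidance:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    if length < 12:
        raise ValueError("use at least 12 characters")
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets.choice uses the OS CSPRNG, unlike the predictable `random` module
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

A password manager achieves the same result with less friction, but the key point is the same: passwords should be long, random, and unique per account.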
Deepfakes Used to Target Crypto Users
Lately, there have also been instances where deepfakes were used to target unsuspecting crypto users.
For instance, in May, a deepfake of Tesla and Twitter CEO Elon Musk was created to promote a crypto scam. The video contained footage of Musk from past interviews, manipulated to fit the fraudulent scheme.
Scam promoters have long resorted to deepfakes to drum up demand among potential crypto investors.
Scammers impersonate everyone from influencers and high-profile crypto figures to ordinary people in order to gain victims' trust.
Last year, Miranda, an e-commerce worker who asked that her real name not be used because her company had not given her permission to speak publicly, was targeted by such an attack: imposters released a deepfake video of the Melbourne woman promoting a crypto scam and published it on her Instagram account.
As the FBI's warning makes clear, AI deepfakes are becoming a tool of choice in cyber extortion. Criminals create deepfake videos or audio recordings of victims that appear genuine, then use them to threaten victims and demand payment, often in cryptocurrency.
For example, in one recent scam, criminals created a deepfake video of a well-known cryptocurrency investor in which he appeared to instruct his followers to invest in a particular cryptocurrency. The footage was fabricated, and the criminals used it to lure people into their own cryptocurrency scam.
There are a few reasons why criminals are turning to deepfakes in cyber extortion scams. First, deepfakes are becoming increasingly realistic, making them more believable to victims. Second, deepfakes can be tailored to specific victims, making the scams more personal and threatening. Third, cryptocurrency payments are pseudonymous, making it harder for law enforcement to trace the money back to the criminals.
If you receive a threatening message that includes a deepfake video or audio recording, there are a few things you should do:
- Don’t panic. The criminals are counting on you to be scared and to give them money. Don’t let them win.
- Verify the authenticity of the video or recording. Do some research to see whether the footage is real. If you're not sure, do not engage with the criminals.
- Don’t pay the ransom. Paying the ransom will only encourage the criminals to continue their scams.
- Report the scam to the authorities. The FBI and other law enforcement agencies are working to track down and prosecute criminals who use deepfakes in cyber extortion scams.
By following these tips, you can help to protect yourself from cyber extortion scams that use deepfakes.
Here are some additional tips to help you stay safe from deepfake scams:
- Be wary of any messages or videos that seem too good to be true.
- Don’t click on links in messages from people you don’t know.
- Be careful about what information you share online.
- Use strong passwords and two-factor authentication.
- Keep your software up to date.