FBI Issues Warning Over AI Sextortion Deepfakes
FBI WARNS MINORS, NON-CONSENTING ADULTS AT RISK OF DEEPFAKE SEXTORTION
AI continues to reshape the social, legal and health landscape, and nowhere is that shift more immediate than in its ability to edit media. Pictures and video can now be easily manipulated, with AI used as a tool to alter existing content or fabricate content outright. Consider how quickly social media spreads material, and imagine the impact of pictures circulating that show you having sex with someone you shouldn’t be with and, in fact, never did. The threat is real enough that the FBI has issued a warning over AI-generated sextortion deepfakes.
SEXTORTION IS SIMPLE: PAY UP TO MAKE EMBARRASSING CONTENT GO AWAY, BUT MAYBE PAY AGAIN
So far, the FBI has focused its warning on the groups already being targeted: minors and non-consenting adults. Minors are particularly vulnerable because the coming-of-age years are already emotionally intense. Children subjected to sextortion deepfakes risk developmental harm and severe trauma that can raise the risk of suicide. And non-consenting adults? They are victims precisely because they never agreed to have their likenesses used in deepfake porn. Both groups have been hit with the same demand: pay us to make these fake videos disappear.
SOME MIGHT FIND IT TEMPTING TO SIMPLY PAY THE SEXTORTION TAB TO AVOID SCANDAL
Sextortion has existed for decades, but AI-generated deepfakes have escalated the threat to a new level. The incriminating content never had to be real; now it can look virtually indistinguishable from reality. Consider an extortionist threatening to send your family, friends and coworkers a video of you having sex with the boss’s spouse, or your local priest. Paying up might seem easier than enduring the fallout and dozens of awkward conversations.
The threat is real. Laws and law enforcement are scrambling to keep up.