Makers and consumers of deepfakes—fake porn videos of celebrities created with a machine learning algorithm—have been booted from yet another major internet platform. Twitter’s the latest company to publicly denounce the posting of deepfakes to its website.
“We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent,” a Twitter spokesperson told me in an email. “We will also suspend any account dedicated to posting this type of content.”
Twitter also directed me to the platform's intimate media policy, which states, "you may not post or share intimate photos or videos of someone that were produced or distributed without their consent."
Deepfakes fall solidly within that category.
Earlier today, I asked Twitter about a deepfakes-posting account called @mydeepfakes, and the account was suspended within hours.
Unlike Facebook and Instagram, Twitter allows adult content in tweets as long as it's marked as "sensitive content," but the platform's intimate media policy would rule out most deepfakes, as they're generally made without permission from the people they depict.
Discord, Gfycat, and Pornhub have all announced that they won't tolerate nonconsensual porn, including deepfakes, on their platforms, only hours after first being notified of the practice. Meanwhile, the deepfakes subreddit where all of this started currently has 90,000 subscribers. Reddit has not responded to Motherboard's repeated requests for comment over the past 12 days.