
I Paid $30 to Create Deepfake Porn of Myself

An inside look at the forums where custom deepfakes are commissioned, and at the motivations of the people who make them.

by Evan Jacoby
10 December 2019, 12:46am

Illustration by Cathryn Virginia

This article originally appeared on VICE US

While lawmakers continue to debate how to prevent deepfakes from spreading misinformation and defaming political figures, private citizens are left to grapple with the growing scale and accessibility of the technology, which is most often used to create nonconsensual, fake porn.

Deepfakes are digitally altered videos and images that use machine learning algorithms to swap one person’s face for another’s. The term ‘deepfake’ was coined in late 2017 from the Reddit username of the technique's creator, who claimed to be a hobbyist programmer and posted his creations to a subreddit devoted to photoshopping female celebrities' faces onto porn performers' bodies. Deepfakes became the subject of a now-banned subreddit, dominated by what Reddit called “involuntary fake pornography.”

Two years later, producing deepfakes is easier than ever, and despite being banned from platforms like Discord, Reddit, and Twitter, a marketplace for deepfake porn is thriving on easily accessible websites. Here, people who don't have the resources or ability to create their own deepfakes can commission videos of celebrities, friends, and strangers. To test how easy it is to commission a custom deepfake, I joined this online face-swapping marketplace and paid a creator $30 to have my face digitally inserted into porn.

Aside from a few programs developed by researchers, most publicly available deepfake software still needs to be trained on a dataset of hundreds of photographs of the face being deepfaked. But this limitation (most people don't have hundreds of high-definition images of one ordinary person readily at hand) can be bypassed with a single video. A 15-second Instagram story contains 450 individual frames, enough to build a dataset of a face, or "faceset," that can train a machine learning algorithm to render your face onto another body.
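The arithmetic behind that faceset figure is simple: a clip's frame count is its duration multiplied by its frame rate, so a 15-second story shot at the common 30 frames per second yields 450 stills. A minimal sketch (the `frames_in_clip` helper and the 30 fps default are illustrative assumptions, not details from the article):

```python
def frames_in_clip(duration_s: float, fps: int = 30) -> int:
    """Total still frames in a clip: duration (seconds) times frame rate.

    30 fps is a common default for phone video; actual clips may be
    recorded at 24, 25, or 60 fps, which changes the count proportionally.
    """
    return int(duration_s * fps)

# A 15-second Instagram story at 30 fps: 450 candidate faceset images.
print(frames_in_clip(15))   # 450
# The author's 13-second test clip still yields hundreds of frames.
print(frames_in_clip(13))   # 390
```

Even a short clip, in other words, supplies a training set on the same order of magnitude as the "hundreds of photographs" that deepfake software typically requires.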

In other words, if you post a video of your face on social media and someone manages to save the video, they could theoretically pay to have you appear in any other video, on any other body.

It was easy to find someone to take on my commission. I sent private messages to several creators advertising their services on the site’s request forum. One got back to me that evening, and I sent him a 13-second video of myself talking into my phone’s front-facing camera, about the length of an Instagram story. I also sent him a link to a Pornhub video to put my deepfaked face into.


To make sure my experiment represented a realistic exchange, I didn’t tell the creator that the face he was using was mine, nor did I imply that I had received consent. He sent me sample screenshots of a deepfake he created using my faceset the following morning.

During my time on the deepfake forum, I spoke with four deepfake creators in order to get a better sense of the people and marketplace behind these videos. Each of these people drew different lines in the sand for what they would or wouldn’t do.

One creator told me that he accepts money to make “porn of random girls.” He creates them without question, he said, “as long as both source/target are clearly and obviously 18 [and up],” but doesn't ask for any legal proof that the people in the videos aren't minors. He said that the target video is the deciding factor for him, and that he “wouldn't put people into gas chambers or stuff like that. If it doesn't fit into even the broadest interpretation of a joke then that's a no.”

When I asked him how he would expect people to react if they discovered nonconsensual deepfake porn of themselves, he said that he thought “guys would laugh or take it as a compliment, girls would freak out and scream rape.”

Studies show that women are still the primary targets of nonconsensual deepfake porn, and that victims experience trauma from this harassment even though they know the videos are "fake."

Another deepfake creator specified that he was against using deepfake technology to create porn, but that he uses deepfake forums as a learning tool for producing deepfake parody videos. He told me that in other, non-porn deepfake communities, “the user base is not helpful usually and will tell you just to go read the [software] documentation,” which he said pushed him to join the deepfake porn community.

This creator pointed out, however, that while the website has a "safe for work" section, the users who are there for porn far outnumber those using the forums to learn the technical side; the thread and post counts in each section of the forum confirm this. He added that even in the technical threads, people frequently use porn clips to demonstrate and compare different programs and algorithms.

I asked both creators if people should have an expectation of privacy in 2019, and their answers were similar. “People are far too relaxed about posting photos and videos on every form of social media,” the safe-for-work-only creator said, referring to photos and videos that deepfake creators can use to train machine learning models. The other creator put it more bluntly: “If they upload their life to Facebook then it's their problem really. Whatever [is] shared willingly is free to use.”

With the exception of a few states like Virginia and California, deepfake pornography remains legal in the U.S. While platforms and web hosts can prohibit deepfake videos, deepfake creators’ anonymity can make it nearly impossible to prevent new videos from being made.

This victim-blaming attitude goes against what privacy experts say: limiting your social media presence is a good step, but it may not be enough to protect you. Even if you keep your images offline, there's no surefire way to protect yourself against bad actors determined to target you with a deepfake.

Looking back at my experience, I’m still stunned by how easy the whole process was. Deepfake porn forums appear on the first page of Google’s search results, and creators are willing to make deepfakes for anyone with 13 seconds of video and $30. Some of these platforms boast user counts in the tens of thousands. Google has invested research resources into detecting deepfakes, but it hasn’t taken the easy and arguably more important step of making these sites harder to find on its search result pages.

After this particular creator sent me the completed video, he thanked me and told me future videos would be cheaper, as he had already trained the algorithm to recognize my face. Until the U.S. has better nonconsensual porn and media manipulation laws, and until we as a society can address the problem of harassment and abuse via manipulated videos, he'll likely continue to have customers.