There’s a video of Gal Gadot having sex with her stepbrother on the internet. But it’s not really Gadot’s body, and it’s barely her own face. It’s an approximation, face-swapped to look like she’s performing in an existing incest-themed porn video.
The video was created with a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.
A clip from the full video, hosted on SendVids, showing Gal Gadot’s face on a porn star’s body.
It’s not going to fool anyone who looks closely. Sometimes the face doesn’t track correctly and there’s an uncanny valley effect at play, but at a glance it seems believable. It’s especially striking considering that it’s allegedly the work of one person—a Redditor who goes by the name ‘deepfakes’—not a big special effects studio that can digitally recreate a young Princess Leia in Rogue One using CGI. Instead, deepfakes uses open-source machine learning tools like TensorFlow, which Google makes freely available to researchers, graduate students, and anyone with an interest in machine learning.
Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can manipulate a recorded video with real-time face tracking, this new type of fake porn shows that we’re on the verge of living in a world where it’s trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.
So far, deepfakes has posted hardcore porn videos featuring the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza, and Gal Gadot on Reddit. I’ve reached out to the management companies and/or publicists who represent each of these actors to inform them of the fake videos, and will update if I hear back.
Fake celebrity porn, where images are photoshopped to look like famous people are posing nude, is a years-old category of porn with an ardent fan base. The people commenting and voting in the subreddit where deepfakes posts his videos are big fans of his work. This is the latest advancement in that genre.
According to deepfakes—who declined to reveal his identity to me to avoid public scrutiny—the software is based on multiple open-source libraries, like Keras with a TensorFlow backend. To compile the celebrities’ faces, deepfakes said he used Google image search, stock photos, and YouTube videos. Deep learning consists of networks of interconnected nodes that autonomously run computations on input data. In this case, he trained the algorithm on porn videos and Gal Gadot’s face. After enough of this “training,” the nodes arrange themselves to complete a particular task, like convincingly manipulating video on the fly.
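Deepfakes hasn’t published his code, but the setup he describes resembles a shared-encoder autoencoder: one encoder learns a compressed representation of any face, and a separate decoder per person learns to rebuild that person’s face from it. Here’s a minimal sketch of that idea in Keras with a TensorFlow backend; the layer sizes, crop size, and training details are my assumptions for illustration, not his actual code:

```python
# A minimal sketch of the shared-encoder face-swap idea (not deepfakes'
# actual code). All sizes and hyperparameters here are assumptions.
import numpy as np
from keras.layers import Dense, Flatten, Input, Reshape
from keras.models import Model

IMG = 64  # assumed size of the aligned face crops (64x64 RGB)

def build_decoder():
    """One decoder per person: rebuilds that person's face from the
    shared 256-dimensional encoding."""
    code = Input(shape=(256,))
    x = Dense(IMG * IMG * 3, activation="sigmoid")(code)
    x = Reshape((IMG, IMG, 3))(x)
    return Model(code, x)

# One shared encoder: both people's faces pass through the same weights,
# so the encoding captures pose and expression rather than identity.
inp = Input(shape=(IMG, IMG, 3))
encoder = Model(inp, Dense(256, activation="relu")(Flatten()(inp)))

decoder_a = build_decoder()  # reconstructs person A (the performer in the video)
decoder_b = build_decoder()  # reconstructs person B (the celebrity)

auto_a = Model(inp, decoder_a(encoder(inp)))
auto_b = Model(inp, decoder_b(encoder(inp)))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# Each autoencoder trains only on its own person's face crops. Random
# stand-in arrays here; real inputs would be crops pulled from YouTube
# videos, stock photos, and porn frames, as deepfakes describes.
faces_a = np.random.rand(8, IMG, IMG, 3)
faces_b = np.random.rand(8, IMG, IMG, 3)
auto_a.fit(faces_a, faces_a, epochs=1, verbose=0)
auto_b.fit(faces_b, faces_b, epochs=1, verbose=0)

# The swap: encode person A's frame, then decode it with person B's
# decoder, so B's face comes out wearing A's pose and expression.
swapped = decoder_b.predict(encoder.predict(faces_a[:1]))
```

The detail that makes the swap work is the shared encoder: because both decoders read from the same representation of pose and expression, feeding person A’s frames through person B’s decoder renders B’s face in A’s pose.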
Artificial intelligence researcher Alex Champandard told me in an email that a decent, consumer-grade graphics card could process this effect in hours, but a CPU would work just as well, only more slowly, over days.
“This is no longer rocket science,” Champandard said.
The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016. It isn’t difficult to imagine an amateur programmer running their own algorithm to create a sex tape of someone they want to harass.
Read more: Facial Recognition for Porn is a Privacy Nightmare Waiting to Happen
Deepfakes told me he’s not a professional researcher, just a programmer with an interest in machine learning.
“I just found a clever way to do face-swap,” he said, referring to his algorithm. “With hundreds of face images, I can easily generate millions of distorted images to train the network,” he said. “After that if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face.”
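What he’s describing is a standard augmentation trick: randomly warping a few hundred source photos yields an effectively unlimited stream of training images. A rough sketch of what that could look like in Python; the specific transforms and ranges are illustrative assumptions, not his values:

```python
# A rough sketch of the distortion trick deepfakes describes: random
# warps turn a few hundred face crops into millions of training images.
# The transform types and ranges here are assumptions for illustration.
import numpy as np
from scipy.ndimage import rotate, shift

def random_distort(face, rng):
    """Apply a small random rotation and shift to one face crop.
    The ranges (plus/minus 10 degrees, plus/minus 4 pixels) are guesses."""
    out = rotate(face, angle=rng.uniform(-10, 10), reshape=False, mode="nearest")
    return shift(out, shift=(rng.uniform(-4, 4), rng.uniform(-4, 4), 0), mode="nearest")

def distorted_pairs(faces, batch_size, rng):
    """Sample crops with replacement; the warped copy is the network's
    input and the clean original is its target, so the network learns
    to undo distortions of the face it was trained on."""
    idx = rng.integers(0, len(faces), size=batch_size)
    clean = faces[idx]
    warped = np.stack([random_distort(f, rng) for f in clean])
    return warped, clean

rng = np.random.default_rng(0)
faces = np.random.rand(300, 64, 64, 3)             # stand-in for ~hundreds of real crops
inputs, targets = distorted_pairs(faces, 32, rng)  # a fresh, never-repeated batch
```

Trained this way, the network learns to map any warped crop back toward the clean face it knows, so when it is shown someone else’s face it treats it as just another distortion and “corrects” it toward the training face, which is exactly the behavior deepfakes describes.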
In a comment thread on Reddit, deepfakes mentioned that he is using an algorithm similar to one developed by Nvidia researchers that uses deep learning to, for example, instantly turn a video of a summer scene into a winter one. The Nvidia researchers who developed the algorithm declined to comment on this possible application.
In almost all of the examples deepfakes has posted, the result isn’t perfect. In the Gadot video, a box occasionally appears around her face where the original image peeks through, and her mouth and eyes don’t quite line up with the words the actress is saying. But if you squint a little and suspend your disbelief, it might as well be Gadot; other videos deepfakes has made are even more convincing.
Porn performer Grace Evangeline told me over Twitter direct messages that porn stars are used to having their work spread, without their permission, to free tube sites like SendVid, where the Gal Gadot fake was uploaded. But she said that this was different. She’d never seen anything like this.
“One important thing that always needs to happen is consent,” Evangeline said. “Consent in private life as well as consent on film. Creating fake sex scenes of celebrities takes away their consent. It’s wrong.”
Even for people whose livelihoods involve getting in front of a camera, the violation of personal boundaries is troubling. I showed Alia Janine, a retired porn performer who was in the sex industry for 15 years, the video of Gadot. “It’s really disturbing,” she told me over the phone. “It kind of shows how some men basically only see women as objects that they can manipulate and be forced to do anything they want… It just shows a complete lack of respect for the porn performers in the movie, and also the female actresses.”
I asked deepfakes whether he considered the ethical implications of this technology. Did consent, revenge porn, and blackmail enter his mind while he was developing this algorithm?
“Every technology can be used with bad motivations, and it’s impossible to stop that,” he said, likening it to the same technology that recreated Paul Walker’s post-mortem performance in Furious 7. “The main difference is how easy [it is] to do that by everyone. I don’t think it’s a bad thing for more average people [to] engage in machine learning research.”
Ethically, the implications are “huge,” Champandard said. Malicious use of technology often can’t be avoided, but it can be countered.
“We need to have a very loud and public debate,” he said. “Everyone needs to know just how easy it is to fake images and videos, to the point where we won’t be able to distinguish forgeries in a few months from now. Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off. Now it can be done by a single programmer with recent computer hardware.”
Champandard said researchers can then begin developing technology to detect fake videos and help moderate what’s fake and what isn’t, and internet policy can improve to regulate what happens when these types of forgeries and harassment come up.
“In a strange way,” this is a good thing, Champandard said. “We need to put our focus on transforming society to be able to deal with this.”
Correction: This story has been updated to clarify that deepfakes’ algorithm is similar to the research produced by Nvidia researchers, but that there’s no evidence that it’s an application of their work.