Image: Cathryn Virginia

Pornhub Doesn't Care

Pornhub hosts hundreds of Girls Do Porn videos; a Motherboard investigation shows the high-tech solution meant to stop those videos from spreading doesn’t work.

On May 1, 2016, in the middle of final exams, a young woman got a text message that would change her life forever. It included a screenshot of a pornographic video posted online, featuring her. Panicking, she quickly tried to justify what she had done. "They said it would only be in Australia," she told her friend, according to court documents. "I only did it for money."

The video spread like wildfire. Jane Doe 11—one of 22 women who sued porn production company Girls Do Porn in 2016 for coercing them to have sex on video and lying to them about how the videos would be distributed—learned from the student council president that "everyone was watching it in the library, so much so that the internet essentially crashed." The student council president also informed her that "the administration actually had to watch the video. They all congregated in a room to watch it, to I guess make sure I didn't say the university name in the video."

The following morning, Jane Doe 11 sent Michael Pratt, the owner of Girls Do Porn and currently a wanted fugitive, an email: "This is not spam but a life or death situation. Please contact me…Please. I beg of you."

In October 2019, after Michael Pratt was charged with federal sex trafficking crimes, Pornhub removed Girls Do Porn's official Pornhub channel, as well as pages promoting Girls Do Porn as "top shelf" content and a reason to pay Pornhub a subscription fee. In January, after the ruling in the civil case found Girls Do Porn owed 22 women a total of $13 million, the official GirlsDoPorn.com site was taken offline.

But even with the official site shut down and its owners in jail or on the run, the ruling has done little to stop the spread of the videos online. Even today, hundreds of Girls Do Porn videos are easy to find, especially on Pornhub, which claims to get 100 billion video views a year and more than 100 million daily visits. Searching Google for Girls Do Porn videos leads users to Pornhub, where these videos are hosted against pre-roll and banner ads that Pornhub's parent company Mindgeek profits from.

Pornhub claims that victims of nonconsensual porn—as many of the Girls Do Porn videos are—can easily request to remove videos from the site, and that those videos can be "fingerprinted." Broadly speaking, video fingerprinting is a method for software to identify, extract, and then summarize characteristic components or metadata of a video, allowing that video to be uniquely identified by its "fingerprint." According to Pornhub, this would automatically prevent future attempts to upload a video that was flagged.
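
Systems like this generally work by reducing a video to a compact, perceptual summary rather than storing the file itself. As a rough illustration only (Vobile's actual technology is proprietary; the library, sampling interval, and hash size below are our own assumptions), a frame-level fingerprint can be sketched in a few lines of Python using the open-source OpenCV library:

```python
# A minimal sketch of frame-level perceptual fingerprinting. This is NOT
# Vobile's or Pornhub's system; it only illustrates the general idea:
# sample frames, reduce each one to a small "difference hash," and store
# the sequence of hashes as the video's fingerprint.
import cv2

def dhash(frame, hash_size=8):
    """64-bit difference hash of a single frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (hash_size + 1, hash_size))
    bits = (small[:, 1:] > small[:, :-1]).flatten()   # compare neighboring pixels
    return sum(1 << i for i, bit in enumerate(bits) if bit)

def fingerprint(path, seconds_between_samples=1.0):
    """Return a list of per-frame hashes sampled at a fixed interval."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(int(fps * seconds_between_samples), 1)
    hashes, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            hashes.append(dhash(frame))
        index += 1
    cap.release()
    return hashes
```

Because each hash captures coarse brightness structure rather than exact pixels, a byte-identical re-upload is trivial to catch; how well such a system handles edited or re-sourced copies depends entirely on how the fingerprints are compared.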

But a Motherboard investigation found that this system can be easily and quickly circumvented with minor editing. Pornhub's current method for removing Girls Do Porn videos and other forms of non-consensual porn not only puts the onus of finding and flagging videos almost entirely on potentially traumatized victims, it also fails them: those victims can't rely on the system to work.

Pornhub fails a simple test

Jane Doe 11 testified that she was devastated to see her video on many websites, including Pornhub, where it was described in derogatory language and had more than 9 million views. She found out that her name and personal information were connected to her video and posted on numerous forums, along with links to her Facebook profile. Anonymous users also posted her brother's, sister's, and church's Facebook profiles in order to harass her. "Such comments made her question her desire to live," the ruling said.

In July 2019, Motherboard reported on Pornhub's central role in the Girls Do Porn case. The platform enabled rampant doxing and harassment of the women suing the porn production company, of other women not part of the lawsuit whom we contacted independently, and of "over 100" others who are not plaintiffs but whom the women's lawyers interviewed and say had the same experience. Whenever we published a story about the case, more women who were in Girls Do Porn videos and didn't know about the lawsuit reached out to us with identical stories.

During our reporting, we repeatedly asked Pornhub what it was doing to contain and remove the Girls Do Porn videos on its website. On October 22, a Pornhub spokesperson sent us a statement attributed to Pornhub VP Blake White:

"We strongly condemn non-consensual content, including revenge porn. Content that is uploaded to Pornhub that directly violates our Terms of Service is removed as soon as we are made aware of it and this includes non-consensual content. With regards to other unauthorized Girls Do Porn videos, Pornhub takes a number of actions to protect our community, and to keep content that violates our policies off of our platform. We use a state-of-the-art third party digital fingerprinting software, which scans any new uploads for potential matches to unauthorized material and makes sure the original video doesn’t go back up on the platform. Anyone who digitally fingerprints their content is then protected from having their video uploaded by unauthorized parties. It is completely free and strongly encouraged to fingerprint content."

With the consent and cooperation of several women who were featured in Girls Do Porn videos, Motherboard tested Pornhub's content removal and video fingerprinting systems using editing techniques common in the porn that regular users upload to Pornhub. We found that while Pornhub removed videos when provided with specific links, the video fingerprinting method that Pornhub calls "state-of-the-art" and relies on to automatically moderate its platform can be easily circumvented, allowing anyone to upload Girls Do Porn videos or other non-consensual videos.

"It’s not really 'doing the right thing' when you only act when it is in your self-interest."

To fingerprint a video, victims have to email Vobile, the company Pornhub contracts to provide the service. After Vobile responds to a request confirming it has fingerprinted the video, the victim then has to flag the video to Pornhub by filling out a short form on the site. Once the videos are removed, it should theoretically be impossible to post the same video to Pornhub again.

To test Pornhub's system, Motherboard first downloaded a Girls Do Porn video from Pornhub. Then, we asked the woman in that video to email Vobile for fingerprinting and then flag the video to Pornhub. Vobile confirmed the video was fingerprinted and Pornhub removed the specific video we flagged. After that video was removed, we tried to upload the exact same file to Pornhub; it was seemingly automatically removed within an hour, as was a shorter portion cut from the same file.

However, a slight edit of the same Girls Do Porn video circumvented Pornhub's fingerprinting method. The specific video we flagged for Pornhub was a five-minute portion cut from the original Girls Do Porn episode. Our edit took a 30-second portion from the same Girls Do Porn episode, but it was sourced from a different video hosted on another Pornhub site. This 30-second portion featured similar shots to the fingerprinted video. We successfully uploaded this video to Pornhub, where it was available for anyone to find and view for 24 hours before we removed it ourselves. During this time, it also would have been easy to find via Pornhub's search or tagging system had we labeled the video clearly as a Girls Do Porn video.

Overall, we successfully uploaded eight videos that used footage from the same fingerprinted Girls Do Porn episode. In addition to the 30-second video described above, we uploaded videos cut together with stock footage, videos with the sound removed, and versions of the same episode taken from different sources of varying quality and with different watermarks. As far as we can tell, Pornhub only stopped us from uploading videos sourced from the specific video file we fingerprinted.
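
One plausible explanation, sketched below under our own assumptions rather than any knowledge of Vobile's matcher, is that fingerprint matching typically compares hash sequences using a bit-level distance and a fixed threshold. An exact copy of the flagged file scores near zero and is caught; the same scene taken from a different copy, with different compression, cropping, or watermarks, can drift past the threshold even though a human would recognize the footage instantly.

```python
# Sketch of how threshold-based fingerprint matching can miss re-edited copies.
# fingerprint() is the illustrative helper from the earlier sketch, and the
# threshold value is an arbitrary assumption chosen for demonstration.
def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def clip_matches(clip_hashes, reference_hashes, max_avg_bit_distance=10):
    """Slide the clip's hash sequence along the reference fingerprint and
    report whether the best-aligned window falls under the threshold."""
    span = len(clip_hashes)
    best = float("inf")
    for start in range(len(reference_hashes) - span + 1):
        window = reference_hashes[start:start + span]
        score = sum(hamming(c, r) for c, r in zip(clip_hashes, window)) / span
        best = min(best, score)
    return best <= max_avg_bit_distance

# clip_matches(fingerprint("exact_copy.mp4"), fingerprint("flagged.mp4"))
#   -> True: distances near zero, the re-upload is blocked.
# clip_matches(fingerprint("other_source_clip.mp4"), fingerprint("flagged.mp4"))
#   -> can be False: a different encode, crop, or watermark shifts enough
#      bits to push the average distance past the threshold.
```

The file names above are hypothetical; the point is only that a matcher tuned for near-identical copies can pass over footage any viewer would recognize.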

Hany Farid, a University of California, Berkeley professor who worked with Microsoft in 2009 to develop PhotoDNA, a technology used by Google, Facebook, and other internet platforms to automatically detect images of child sexual abuse, said that the fingerprinting technology that Pornhub and other platforms use to detect unwanted videos and images is never perfect. However, he said Pornhub and Vobile's inability to detect a 30-second portion of the fingerprinted Girls Do Porn video reuploaded to its platform is "pretty bad."

"[Companies] often will have bare, minimal technology that gives them the appearance of trying to do something, but the reality of the technology is not great," Farid said. "I'm not saying that is the case here—I know nothing about their technology—but it could just be that they simply haven't invested the resources. If they want to open the hood and let us look at it, I can tell you more about it, but I don't think they're going to do that."

“Fingerprinting or PhotoDNA are definitely useful tools, and we take full advantage of them in our practice," Honza Cervenka, an associate at McAllister Olivarius, a firm that specializes in non-consensual pornography cases, said. "But the proper solution to this problem isn’t to require victims to play a never-ending game of technological catch-up. Victims are spending much more time than they ought, checking for their own videos. The onus should not be on them, it should be on the sites.”

Vobile did not respond to a request for comment. We sent Pornhub a list of detailed questions explaining our testing and pointing to other instances of videos that violate the company's policy but have not been removed. We also asked about the size of its moderation team, whether all Mindgeek sites are moderated the same way, and whether there was anything else the company wanted to share about its moderation efforts.

“Pornhub has been working closely with the plaintiff’s counsel, to fingerprint and remove all Girls Do Porn videos and compilations from our site," Pornhub said in a statement, attributed to Pornhub VP Blake White.

Brian Holm, one of the lead attorneys representing the women in the Girls Do Porn case, told Motherboard that he would consider this an overstatement.

"We're not working closely with them, we're just sending them our clients' links," he said. The legal team finds videos where their clients are featured, sends the video URL to Mindgeek, and the video is removed. This process is on the lawyers and plaintiffs to pursue, and is almost identical to how any other person can make a takedown claim on Pornhub.

Cara Van Dorn, an associate at Sanford Heisler Sharp, one of the firms representing the women, told Motherboard she is in direct contact with Mindgeek for this process.

"I do think they're trying at this point and that's better than not trying," she said. "Given the popularity of the Mindgeek enterprise I do think it's helpful. I don't know that we're ever going to cleanse the internet of these videos, but it's a big step forward to get Mindgeek involved."

But Van Dorn also noted that Mindgeek has known about the accusations against Girls Do Porn since long before the civil trial. The women who finally brought the lawsuit, their lawyers, and many other victims have been submitting takedown requests to get those videos removed from sites across Pornhub's network, including sites like YouPorn and RedTube, for years.

"We had reached out to Mindgeek many times over the years and it wasn’t until the start of trial and obtaining numerous favorable rulings demonstrating the undeniable strength of our case that Mindgeek finally decided to start taking action," Van Dorn said. "It’s not really 'believing victims' when it takes a team of lawyers and years of litigation before you lift a finger. It’s not really 'doing the right thing' when you only act when it is in your self-interest."

"It took 99 days of trial, a fraud decision and a half dozen sex trafficking indictments for Mindgeek to finally take the victims seriously," Holm said. "This is obviously frustrating and a bit disheartening, as it only serves to embolden other traffickers knowing that it takes such a monumental effort from the victims before anyone will believe them."

"A pervasive lack of transparency"

For almost all of these women, the only porn video of them that exists online is the one they shot for Girls Do Porn. Some shot multiple videos with Girls Do Porn, and a handful went on to make adult movies elsewhere, but most of the women who suffered from the spread of the videos are trying to remove a single video which is copied and remixed endlessly.

The test videos Motherboard uploaded to Pornhub were designed to mimic many videos on Pornhub that feature Girls Do Porn content and use similar editing tricks.

These "compilation" videos, as they're called by users on Pornhub, recut existing Girls Do Porn videos. They cut several Girls Do Porn videos together, cut Girls Do Porn videos with unrelated porn, or set them to music. One Girls Do Porn compilation uploaded in January, for example, opens with audio taken from a TV news report about the Girls Do Porn trial.

Motherboard found Girls Do Porn videos that zoomed in or out on the image, mirrored it, overlaid text, or used other methods that could fool Pornhub's fingerprinting method. Compilation videos are not specific to Girls Do Porn, either—they're a popular genre on Pornhub that highlights a particular performer or fetish. But the practice of making compilations means many users bypass Pornhub's fingerprinting without even trying, or without knowing that it exists.
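
To see why even trivial transformations can defeat this kind of matching: a horizontal mirror reverses the left-to-right brightness comparisons that a difference hash encodes, so roughly half of its bits flip. The snippet below reuses the illustrative dhash() and hamming() helpers above and is, again, not Pornhub's or Vobile's pipeline:

```python
# A mirrored frame produces a very different difference hash, so a naive
# fingerprint match no longer recognizes it. Reuses dhash() and hamming()
# from the sketches above; "frame.png" stands in for any sampled frame.
import cv2

frame = cv2.imread("frame.png")
mirrored = cv2.flip(frame, 1)                 # flip around the vertical axis

print(hamming(dhash(frame), dhash(mirrored))) # typically around half of the 64 bits differ
```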

In addition to not being able to detect lightly edited fingerprinted videos, Pornhub's platform enables the spread of Girls Do Porn videos in a variety of other ways.

In one instance, Motherboard found a video on Pornhub composed of a single still image soliciting help from Pornhub users in discovering the names of women in Girls Do Porn videos. The image listed 45 Girls Do Porn episode numbers and asked those who knew the women's names to get in touch via a listed ProtonMail address in order to gain access to a "private group." The video was only removed after we contacted Pornhub about it.

A screenshot of a video on Pornhub trying to dox women in Girls Do Porn videos. We have edited the image to remove episode numbers and the group's email.

After Motherboard first reported on deepfakes in 2017, Pornhub blocked the term on its site and said that it would ban algorithmically generated face-swap videos as nonconsensual porn. Today, searching for "deepfakes" on Pornhub returns no results, despite the fact that many deepfake videos are still hosted on the platform. Similarly, some time after Motherboard started contacting Pornhub about the proliferation of Girls Do Porn videos, the site blocked the term "Girls Do Porn," which now also returns no results even though Pornhub still hosts hundreds of Girls Do Porn videos.

Searching for phrases related to Girls Do Porn, however, easily surfaces those videos. People who still upload Girls Do Porn videos use these terms widely, both in video titles and in Pornhub's own tagging system.
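
Blocking a search phrase is a far shallower intervention than removing the underlying content. A literal blocklist check of the kind sketched below (our own illustrative guess at how such a filter might behave, not Pornhub's actual search code) stops only the exact phrase, while variant titles and tags pass straight through, and even a blocked query only hides results rather than taking any video down:

```python
# Illustration of why blocking a search term is not the same as removing content.
# This is a hypothetical filter, not Pornhub's implementation. We are not
# publishing the real variant phrases that still surface the videos.
BLOCKED_PHRASES = {"girls do porn", "deepfakes"}

def search_allowed(query):
    q = query.lower().strip()
    return not any(phrase in q for phrase in BLOCKED_PHRASES)

print(search_allowed("girls do porn"))        # False: the exact phrase is blocked
print(search_allowed("some related phrase"))  # True: variant wording slips past
```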

"What, exactly, is the incentive for a site like Pornhub to remove one of their most popular products?"

One of these search terms returned 202 videos featuring Girls Do Porn content. Clicking any of those videos caused Pornhub's recommendation system—showing users videos related to their past viewing habits, similar to YouTube—to surface more of them, meaning Pornhub isn't just quietly hosting these videos, it actively pushes users towards them.
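
Recommendation systems of this general kind typically lean on co-viewing signals: videos watched by the same users get linked to one another. The toy sketch below describes the generic technique, not Pornhub's actual recommender, but it shows why one click on such a video tends to surface a whole cluster of similar ones:

```python
# Toy item-to-item recommender built on co-viewing counts, to illustrate why
# clicking one video surfaces a cluster of similar ones. Generic technique
# with made-up data; not Pornhub's actual system.
from collections import Counter, defaultdict

# Hypothetical viewing histories: user id -> set of video ids watched.
view_history = {
    "user_1": {"vid_a", "vid_b", "vid_c"},
    "user_2": {"vid_a", "vid_b"},
    "user_3": {"vid_a", "vid_c", "vid_d"},
}

co_views = defaultdict(Counter)
for videos in view_history.values():
    for video in videos:
        for other in videos - {video}:
            co_views[video][other] += 1

def related(video_id, n=5):
    """Videos most often watched by the same users as video_id."""
    return [v for v, _ in co_views[video_id].most_common(n)]

print(related("vid_a"))   # e.g. ['vid_b', 'vid_c', 'vid_d'], ordered by co-view count
```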

Motherboard found that some Pornhub "playlists," or user-curated collections of videos hosted on Pornhub, are clearly labeled with the same Girls Do Porn-related phrases. Clicking on any of these playlists will lead users to dozens of Girls Do Porn videos as well. We also found private playlists with dozens of videos that feature stills from Girls Do Porn videos in thumbnails, but that can't be viewed by the public. Motherboard provided some of those phrases to Pornhub, but it still hasn't blocked them from returning search results.

Do you work at Mindgeek, or have firsthand knowledge of how moderation works within the Pornhub Network? We'd love to hear from you. Using a non-work phone or computer, contact Samantha Cole securely on Signal at +16469261726, direct message on Twitter, or by email at samantha.cole@vice.com or samleecole@protonmail.ch.

"We've suggested that it's not hard to run searches and find other videos—maybe they're not uploaded by Girls Do Porn, but they are overtly Girls Do Porn," Van Dorn said. "It doesn't take a rocket scientist to track these things down. It feels like they could be a little more proactive and do that on their own."

The burden of removal is on the targets of nonconsensual porn, and because of the way it spreads on sites like Pornhub, the takedown process is often exhausting, retraumatizing, and never-ending.

“While a lot of times victims want to go to court, they also just want it removed so they can move on with their lives,” New York Assemblyman Edward Braunstein, who co-sponsored a bill criminalizing spreading revenge porn that passed in 2019, said. “This is a process, and once a victim has gone through that process, that should be enough for websites to accept that this is revenge porn.”

Pornhub's refusal to be transparent about its moderation process only makes the issue of revenge porn worse, lawmakers said.

"A pervasive lack of transparency makes it nearly impossible for us to know how successful most of the companies are when it comes to taking down images once they’ve been flagged and proactively looking for these kinds of violations," California congresswoman and co-sponsor of the SHIELD (Stopping Harmful Image Exploitation and Limiting Distribution) act Jackie Speier told Motherboard. "Ideally, companies should implement policies with requirements designed to prevent the worst harms. That means having a robust notice-and-takedown system that is adequately staffed by trained individuals who receive the pay and support needed for such a daunting and critically important task."

Only victims face consequences

Pornhub and the other Mindgeek sites are not the only ones that make Girls Do Porn videos easy to find; other tube sites also surface them via search or recommended videos. Another popular porn site, xHamster, did not surface Girls Do Porn videos when we searched for "Girls Do Porn" and related terms. But once we found one video by searching for an alias of one of the women, it recommended dozens of other Girls Do Porn videos, many of which had hundreds of thousands of views.

"Our IT team is also working on automatically generating alerts on the 'Girls Do Porn' title, which should automatically block them from being uploaded," xHamster vice president Alex Hawkins told Motherboard in November.

Hawkins added that, even with human moderators, it's incredibly difficult to verify consent just by looking at a video.

"There are entire genres dedicated to faked casting couch, faked spycams, faked taxis and so on," Hawkins said. "Anytime there is a fantasy or fetish, especially if it's taboo or banned, businesses will rush into try and make consensually produced content that looks as real as possible while still being legal and consensual…The primary method we use to remove non-compliant videos is in response to a complaint."

On February 3, Hawkins told us that "as far as I'm aware, there aren't any [Girls Do Porn videos] currently on the site. We've done this by flagging it as an upload term, disabling certain search terms, blocking channels and searching and removing content manually. If you see any content from [Girls Do Porn], please let us know so we can remove [it]."

At that point, we explained how to find dozens of Girls Do Porn videos still hosted on xHamster. While those videos were removed shortly after, at the time of publication, dozens of Girls Do Porn videos are still hosted on xHamster. Many have been there for years.

The technical and logistical problems of moderating Pornhub are not all that different from the challenges a company like Facebook faces. When a platform has so many users—over 36 billion visits per year in Pornhub's case—instances of abuse and bad behavior are inevitable, even with the best intentions and the most robust moderation tactics. What is not clear is whether a company like Pornhub faces any consequences that would incentivize it to eliminate instances of abuse.

So far there's no evidence that the proliferation of Girls Do Porn videos has had any negative impact on the company's bottom line. On the contrary: since Girls Do Porn videos generate millions of views, and since Pornhub displays ads next to those videos, we know that these instances of abuse are profitable.

“The core problem is that image-based abuse videos (so-called revenge porn), and coerced videos such as Girls Do Porn, are a significant part of Pornhub and other sites’ business models," Dr. Ann Olivarius, founder of McAllister Olivarius, the firm that specializes in non-consensual pornography, said. "They are massively popular with users and with site owners, and either free or incredibly cheap to produce. What, exactly, is the incentive for a site like Pornhub to remove one of their most popular products?"

"At the end of the day, Pornhub, Facebook, YouTube—they're all basically the same," Farid said. "They profit off of user generated content. If you told them, 'Hey if you really solve this problem you can make $10 billion,' would we end up with the same technology? I guarantee you the answer is no, they would have much, much better technology."

At the time of publication, the most recently uploaded Girls Do Porn video on Pornhub we could find was posted 22 hours ago.

If you need someone to talk to about an experience with sexual assault or abuse, you can call the National Sexual Assault Hotline at 1-800-656-HOPE (4673), where trained staff can provide you with support, information, advice, or a referral. You can also access 24/7 help online by visiting online.rainn.org.