
In a Huge Policy Shift, Pornhub Bans Unverified Uploads

On Tuesday, Pornhub announced that it is banning downloads, restricting uploads, and expanding its content moderation efforts.
Image via Getty Images

In a series of major changes to how the platform operates, Pornhub announced on Tuesday that it would ban downloads, restrict uploads to verified users, and expand its moderation process. These are steps that activists fighting non-consensual imagery, as well as porn performers, have been asking the platform to take for some time. 


This is a big shift from how Pornhub previously worked: anyone with a free account could upload content without any verification, and copyrighted or abusive images populated the site unchecked until a rights holder complained to the platform. 

"Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub," a statement released by Pornhub says. "In the new year, we will implement a verification process so that any user can upload content upon successful completion of identification protocol." 

As part of its moderation expansion, Pornhub says it will use a newly established "Red Team" dedicated to auditing the platform for potentially illegal material. This is in addition to Pornhub's use of CSAI Match, YouTube's proprietary technology for combating child sexual abuse imagery; Content Safety API, Google's artificial intelligence tool for detecting illegal imagery; and PhotoDNA, a technology used by Google, Facebook, and others to automatically detect images of child sexual abuse.

These are technologies that many other tech giants already use, and experts told Motherboard in February that using fingerprinting technologies to their full potential would greatly improve the lives of victims of abusive imagery, as well as content owners whose copyrighted materials are all over Pornhub. At the time of our reporting and investigation into Pornhub's moderation practices, the company would not comment on how, or if, it actively moderated abusive imagery. 


"In April 2020, we retained outside experts to independently review our compliance program and make recommendations that focus on eliminating all illegal content and achieving a 'best-in-class' program that sets the standard for the technology industry," Pornhub said in today's announcement. 

The changes also include the launch of a "Trusted Flagger Program," in which 40 non-profit internet and child safety organizations around the world "have a direct line of access to our moderation team, and any content identified by a Trusted Flagger is immediately disabled," according to the policy. In the US, those organizations are the Cyber Civil Rights Initiative and the National Center for Missing & Exploited Children.

"We at CCRI are encouraged by the changes Pornhub announced today, which include steps that many advocates working in the NCP and CSAM spaces have sought for years," Mary Anne Franks, president of the Cyber Civil Rights Initiative, told Motherboard. "One of CCRI's core positions is that producers, distributors, and facilitators of sexually explicit material have a responsibility to ensure that all individuals depicted in such content are not only adults, but fully consenting adults. We are glad that Pornhub has adopted several of our recommendations to underscore the importance of consent not just as a pre-condition for uploading sexually explicit material, but as an affirmative value in sexual dynamics. It of course remains to be seen how effective these policies will be, but we welcome these important initial steps towards responsibility in an industry notorious for avoiding it." 

Carrie Goldberg, CCRI board member and attorney, told Motherboard that it's "not enough" that companies self-police and moderate content their users upload. "We need a legal system that accommodates victims’ right to get justice when they’re exploited on these platforms—the legal right to demand content removal, disgorgement of profits, and damages for their pain," she said. "Platforms that peddle in child pornography and other abusive content believe Section 230 of the Communications Decency Act immunizes them from liability. Responsible moderation addresses only one part of the problem. Victims having no rights in our legal system remains unresolved. These companies must answer to the victims. Not just the general public." 

These changes follow reporting by the New York Times published this week detailing the experiences of child abuse victims whose content was uploaded to Pornhub, as well as an ongoing campaign by a religious conservative organization to have the whole platform shut down—an outcome sex workers say is a misguided, if not malicious, attempt at harming their livelihoods. On Monday, Visa and Mastercard said they would investigate the allegations and their relationships with MindGeek, Pornhub's parent company.

Pornhub declined Motherboard's request for comment on Tuesday's policy change. 

12/9/2020, 9:10 a.m.: This piece has been updated to include comments from CCRI.

Correction: This story initially stated that Pornhub had just started using CSAI Match, Content Safety API, and PhotoDNA. Pornhub had already been using these tools, according to a statement in August.