
Facebook Reported 20 Million Instances of Child Sexual Abuse Material in 2020

A list of more than 160 companies shows who reported the most child sexual abuse content, including Facebook, Twitter, and Pornhub's parent company, MindGeek.

A new annual report from the nonprofit National Center for Missing and Exploited Children (NCMEC) reveals the internet platforms and companies that report the most child sexual exploitation content on the web—and Facebook tops the list of more than 160 companies, by a large margin.

According to the report, released on Wednesday, Facebook and the other platforms the company owns, including Instagram and WhatsApp, reported 20,307,216 instances of child exploitative content in 2020.


Snapchat, by comparison, reported more than 144,000 instances, and Twitter more than 65,000. Microsoft came in at 96,776, a figure that includes GitHub, Skype, its cloud service Azure, and Xbox.

Roblox and Reddit accounted for around 2,000 reports each. MindGeek, the parent company of multiple adult content companies including Pornhub, YouPorn, RedTube, Brazzers, and more, reported 13,229 instances of child exploitative content last year.

All the companies on the list voluntarily report these numbers to NCMEC, and the figures only account for reports, not confirmed cases of abuse.

John Clark, president and CEO of NCMEC, told Motherboard that in 2020, 10.4 million of the nearly 21.5 million reports were unique images that were identified and reported multiple times, and that the organization is currently in the process of identifying unique video files. "This shows the power of the technology ESPs [Electronic Service Providers] use to identify these known images of abuse," Clark said. "It also demonstrates the repetitive exploitation occurring, as these images are shared many times, further re-victimizing the children depicted."

NCMEC said it saw a 97.5 percent increase in reports compared to 2019, and speculated that the increase was possibly spurred by the heightened risks to children who were isolated at home with abusers and online more than ever during the COVID-19 pandemic.

Facebook's numbers might seem astronomical, but it is by far the largest company on the list, with nearly 3 billion users globally. And because reporting is voluntary, the high number also indicates that Facebook is actively seeking out this content and reporting it transparently.


On Tuesday, ahead of the report's release, Facebook wrote in a company blog post that it had analyzed the content it reported to NCMEC in October and November and found that much of it was duplicated. The numbers reported don't necessarily reflect the number of children being exploited or abusers on the site, but rather things like shares and reposts.

"We found that more than 90 percent of this content was the same as or visually similar to previously reported content," Facebook's blog post states. "And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many." 

Facebook also said that it evaluated 150 accounts that it reported to NCMEC for uploading child exploitative content, and estimated that more than 75 percent of those users "did not exhibit malicious intent," but were sharing the content out of outrage or in an attempt at humor. The platform is testing two new tools to prevent child abuse on Facebook: a pop-up shown to users who search for terms associated with exploitation, and a safety alert, shown to those who share viral exploitative content, that explains the harm it causes and the legal consequences of reposting it. These measures are in addition to removing the material, according to Facebook.


Pornhub also said on its policy page that it reported 13,000 instances of potential child sexual abuse material to NCMEC in 2020, but noted that several thousand were duplicates, "with most reports submitted multiple times in an abundance of caution." The number of unique reports, it said, was 4,171.

The NCMEC report is especially pertinent to adult sites like Pornhub, following recent accusations of rampant child sexual abuse material on MindGeek's platforms. In December, as part of a revamp of its safety policies, Pornhub announced that it was working with NCMEC, voluntarily registering with its electronic service provider program to report every instance of child sexual abuse material, and adding the nonprofit as one of its "Trusted Flagger" partners, which alert the platform to content that might violate Pornhub's terms of service.

Days after Pornhub announced its new policies and suspended all unverified content, Mastercard, Visa, and Discover cut ties with the platform, citing unlawful content. The payment processors are still standing by that decision as of this month, despite its harmful impact on sex workers who produce consensual content.

"Eliminating illegal content and ridding the internet of child sexual abuse material is one of the most crucial issues facing online platforms today, and it requires the unwavering commitment and collective action of all parties," a spokesperson for Pornhub told Motherboard in a statement. "The safety of MindGeek’s communities is our number one priority and even one instance of child sexual abuse material is too many, which is why we have worked diligently to set the standard for safety and security among tech and social media platforms."

"We want more reports and for more companies to report and for those who do report to report more," Clark said. "Higher numbers of CyberTipline reports can be indicative of a variety of things including larger numbers of users on a platform or how robust an electronic service provider's efforts are to identify and remove abusive content from their platforms." 

The full list is available here.