Leaked Documents Show Facebook’s ‘Threshold’ for Deleting Pages and Groups

This piece is part of an ongoing Motherboard series on Facebook’s content moderation strategies. You can read the rest of the coverage here.

On Tuesday, representatives from Facebook, Twitter, and YouTube testified before lawmakers about their platforms’ content moderation strategies. In the hearing, as New York Times reporter Sheera Frenkel pointed out, Facebook repeatedly referenced what it described as a “threshold” that must be reached before the platform decides to ban a particular page for violating the site’s policies.

“No one has ever defined what the threshold is or who sets it,” Frenkel tweeted. Other news outlets also mentioned this threshold: “Facebook can’t decide when a page should be banned,” Engadget wrote. But Facebook has decided when pages should be banned—it just hasn’t discussed its guidelines publicly.

Motherboard has obtained internal Facebook documents laying out what this threshold is for several different types of content, including some instances of hate speech. The documents provide a more granular picture of how exactly Facebook decides to ban certain pages or groups, and come as the platform faces renewed criticism over its decision not to ban particular controversial pages, such as conspiracy site InfoWars.

One Facebook moderator training document for hate speech says that for Pages—Facebook’s feature for sections dedicated to, say, a band, organization, public figure, or business—the Page admin has to receive five “strikes” within 90 days for the Page itself to be deleted.

Alternatively, Facebook moderators are told to remove a Page if at least 30 percent of the content posted by other people within 90 days violates Facebook’s community standards. A similar 30-percent-or-over policy exists for Facebook Groups, according to the document.

In a similar vein, another hate speech document says that a profile should be taken down if there are five or more pieces of content from the user that indicate hate propaganda, photos of the user posing with another identifiable leader, or other related violations. Although the documents obtained by Motherboard were created recently, Facebook’s policies change regularly, so whether these exact parameters remain in force is unclear. By comparison, YouTube has a three-“strike” policy (although it may not necessarily be fair to compare two different platforms and features like-for-like).

Of course, this still depends on moderators identifying and labeling posts as violations in order to reach that threshold; there has been a fierce online debate about what qualifies as “hate speech” and “fake news” and whether InfoWars should be allowed on Facebook. Facebook tweeted last week that it “see[s] Pages on both the left and the right pumping out what they consider opinion or analysis—but others call fake news.” “We believe banning these Pages would be contrary to the basic principles of free speech,” the company added.

Another document, focused on sexual content, says moderators should unpublish Pages and Groups on the basis of sexual solicitation if more than two “elements”—such as the Page description, title, photo, or pinned post—either explicitly solicit nude imagery or, if the Page is more subtle, include a method of contact or a location. This slide reiterates the over-30-percent and five-admin-strikes rules found in the hate speech document.

However, that threshold varies depending on the type of content being shared. Facebook told Motherboard in an email that someone who shared child exploitation material, for example, would be banned immediately.

“The consequences for violating our Community Standards vary depending on the severity of the violation and a person’s history on the platform,” a spokesperson wrote in a statement. “For instance, we may warn someone for a first violation, but if they continue to violate our policies, we may restrict their ability to post on Facebook or disable their profile. We also may notify law enforcement when we believe there is a genuine risk of physical harm or a direct threat to public safety.”

During her testimony on Tuesday, Facebook’s Head of Global Policy Management, Monika Bickert, said, when discussing InfoWars’ Page, “if they posted sufficient content that violated our threshold, that page would come down. That threshold varies, depending on the severity of different types of violations.”