This piece is part of an ongoing Motherboard series on Facebook’s content moderation strategies. You can read the rest of the coverage here.
Over the past few weeks, Motherboard has been reporting on recent training material for Facebook moderators—the workers tasked with keeping terrorism, sexual abuse, and other offending content off the platform. Some of that material showed how Facebook had something of an internal reckoning around American hate speech in particular after the events in Charlottesville, where a white supremacist killed a counter-protester with a vehicle.
Now, to provide more context on how Facebook decides what sort of hate-related content is allowed on its platform, Motherboard is publishing an extended selection of recent training materials. These include additional details on how Facebook differentiates between white supremacy, separatism, and nationalism—classifications that some experts say are really the same thing—as well as on hate figures and hate speech more generally. The Guardian previously published documents related to hate speech moderation, some of which had a particular focus on immigrants.
Facebook has published a skeleton of its policies around hate speech, but these leaked documents include details at a much more granular level. To know whether Facebook is following its own policies, or even what Facebook considers a piece of offending hate speech, more information is required than what Facebook has published itself. To be clear, however, the documents do not necessarily present a full picture of how a moderator should handle every instance of hate speech: in training videos obtained by Motherboard, trainers reiterated certain points when asked questions by the audience.
Rather than publishing the original materials themselves, Motherboard has reconstructed the content of the slides to remove identifying details. The language remains intact. Additional context added by Motherboard is in square brackets.
“Our policies against organised hate groups and individuals are longstanding and explicit—we don’t allow these groups to maintain a presence on Facebook because we don’t want to be a platform for hate. Using a combination of technology and people we work aggressively to root out extremist content and hate organisations from our platform,” Facebook previously told Motherboard in a statement.
Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.