


How Facebook Trains Content Moderators to Put Out ‘PR Fires’ During Elections

Internal Facebook documents obtained by Motherboard show specific steps and strategies taken by the company to fight content moderation issues that may spike during an election season.
Image: Mark Zuckerberg (Shutterstock)

This piece is part of an ongoing Motherboard series on Facebook's content moderation strategies. You can read the rest of the coverage here.

Leading up to an election, Facebook becomes a battleground. Russian troll farms have flooded the platform with misinformation and memes designed to sow discord, and political campaigns are increasingly targeting voters on Facebook as well. Before the US midterms, Facebook CEO Mark Zuckerberg said the platform is doing more to fight hate speech, particularly during elections: “We will all need to continue improving and working together to stay ahead and protect our democracy,” he wrote in a lengthy blog post explaining the steps Facebook is taking.


But internal Facebook documents obtained by Motherboard show that beyond protecting democracy, there’s a second, clearly stated reason that Facebook is interested in hardening its platform: protecting its public image. Facebook specifically instructs its content moderators to look out for posts that could cause “PR fires” around “hi-risk events” in the lead-up to elections.

The internal documents Motherboard obtained from a source talk specifically about three separate elections held in 2018. A second source, when presented with sections of one of the documents, said that it was indicative of others that the company distributes before elections around the world. Motherboard granted sources in this story anonymity to discuss internal procedures by Facebook, and is not naming the countries associated with specific election documents to preserve that anonymity.

Sources told Motherboard that this sort of preparation has become the norm for Facebook: during an election season, the company tries to educate its moderators about hot-button political issues particular to different democracies around the world, with varying results. They said that preparations have been implemented for elections in a variety of countries and regions, including Canada, the Caribbean, the Middle East, North Africa, and the US midterms. Facebook told Motherboard other examples include elections in Nigeria and Mexico, as well as the Irish presidential and Georgian elections.


Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

Before the 2018 US midterms, Facebook invited many American reporters to see its 'election war room' to show how it was attempting to combat misinformation and other election issues. But the documents obtained by Motherboard show specific steps and strategies taken by the company to fight content moderation issues that may spike during an election season.

“PR fires,” according to a slide included in multiple documents, are “Any type of external event, trend, news, and also actions taken by reps internally that could be picked up by the media and have a negative impact on Facebook’s reputation or even put the company at legal risk.” Content moderators are even given a yes/no option on whether a piece of content may constitute a “PRFireRisk,” according to another slide. Instagram, which Facebook owns, also uses the term, Motherboard previously reported, citing internal Instagram documents.

Two sources said these documents were a recent development, although Facebook told Motherboard it has prepared content moderators for significant political events, including elections, for the past several years.

Facebook’s army of content moderators is tasked with removing all sorts of content from the site, from revenge porn to graphic violence to people praising crime. Facebook told Motherboard its safety and security staff now totals 30,000 people, and around half of those are content reviewers. During an election, the amount of politically focused or related content is likely to ramp up, even if it deals with the same broad types of content, such as hate speech. The documents Motherboard obtained were designed to brief moderators so they could, hopefully, make more informed decisions about what sorts of posts violate the social network’s terms of use and potentially remove them.


The slide decks typically start with some basic context on what the election is for, the number of seats available, the number needed for a party to win the vote, and details on the key parties involved. One presentation also lays out what Facebook sees as high risk events in the country, such as protests or controversial pieces of legislation that voters may be concerned about. Another includes important events from previous elections, including violence and threats towards leaders.

Two of the documents focused on elections include a list of further reading for the moderator, such as a list of banned organizations in the country. (Motherboard has previously obtained and reported on a similar document covering hate groups in the United States.) All of the presentations are in English.

Another section of one of the documents lists some of the “trends” Facebook expects to see during the election season. These include fake, troll, and impersonation accounts; users distributing revenge porn of political supporters; and a variety of hate speech and harassment.

Motherboard also obtained documents that focused more on identifying vulnerable people, such as particular politicians or journalists, to moderators before an election or other major political event. These were shorter than the other slide decks, typically including just a series of photos of each person, their name, and some brief context of the event.


One of the sources said that “just a few slides” is what moderators can expect for most elections, if the election is mentioned at all. The source added that Facebook sometimes updates moderators about particular candidates, such as Christine Hallquist, the first transgender nominee for governor in Vermont.

Some content moderators are also not familiar with more mainstream political figures such as Ted Cruz, according to the source.


“For Facebook it’s basically like a checklist where if a candidate’s campaign intersects with the policy in a notable way they tell us. Like if someone advocates views that are hate speech,” the source said. They also lamented that Facebook doesn’t specifically address some other elections that a significant number of users are discussing on the platform.

On Monday, Facebook’s Katie Harbath, Global Politics and Government Outreach Director, and Samidh Chakrabarti, Director of Product Management, Civic Engagement, announced new transparency measures in a blog post, particularly around political ads and the upcoming European Parliament election.

A Facebook spokesperson told Motherboard in an email that “We take our responsibility to prevent interference in elections extremely seriously, and we continue to advance our technology and improve our policies and planning in order to address both current threats as well as new risks.”

“Our preparations for an election begin many months before ballots are cast and involve more than a dozen teams at the company, ranging from threat intelligence to engineering to data scientists to country experts,” the spokesperson added.
