News

These Are Facebook's Internal Rules on Voter Suppression and Disinformation

Leaked moderation slides show that Facebook has a series of hyper-specific rules about how you're allowed to discourage people from voting.
Image: KYODO VIA AP IMAGES

In recent weeks, as the presidential campaigns intensified, Facebook and other social media companies have struggled to deal with the sheer volume of disinformation and voter suppression tactics posted to their platforms. 

In a bid to help its tens of thousands of moderators, who have to try to remove problematic content from the platform, Facebook has spelled out how its policies on voting and the election should be enforced.

Leaked copies of these documents show that Facebook's enforcement guidelines are causing more confusion for moderators: in some cases, the distinction between what's allowed and what's not is extremely fine.

“Unless you say things in a certain way, it won't be deleted,” a moderator who has worked for Facebook for three years told VICE News, adding: “Even for me, it's as confusing as hell.”

Take, for example, these two comments, which are used as examples in the internal documents shared with VICE News:

  • “Don't risk contracting COVID, avoid the polling station today.”
  • “If you want to guarantee catching COVID, go vote today!”

Both of these statements suggest that voting is dangerous, a tactic that voter suppression campaigns have used against Black communities across the U.S. in recent weeks.

But if you post both comments on Facebook, only the latter will be removed for breaching the company’s guidelines on voting misinformation, according to an internal Facebook document called "Known Questions," which is used as guidance for its moderators.

The internal documents also say that moderators should remove comments and posts that say, for example, "If you are [protected characteristic], your vote will not be counted." But they should leave up posts and comments that say "If you are [protected characteristic], don't bother voting, it won't count." 

Facebook did not respond to a question about what the difference is between these two comments.

The fine-line distinctions are nothing new: Leaked moderator documents obtained by Motherboard in 2018 show that Facebook has hundreds of hyper-specific rules like these, which try to make content moderation more like a foolproof, yes-or-no decision tree rather than something that requires human judgment. This, of course, is not how it works in the real world, because ultimately it is a judgment call to take a post down or leave it up. The rules themselves are also made by human policymakers, who must decide where the line between voter suppression and normal speech is.

“There’s 100,000 lines in the sand we just can’t see,” Sarah Roberts, a UCLA professor who studies content moderation, told Motherboard in 2018.

The "voter interference" document shows that comments like “don’t vote, it’s stupid” or “a vote for the Green Party won’t count in this election” are perfectly fine. As are “drop-off boxes aren't sanitized. I wouldn't risk using them” and “All elections are predetermined, your vote doesn't count.”

The documents were leaked by a current Facebook moderator who wanted to highlight what they felt was Facebook’s excess power to influence the outcome of an election.

“They've gotten so powerful and it's scary to think that they are capable and able to influence any elections,” the moderator told VICE News, which granted them anonymity because they were concerned they would be punished for sharing the documents.

Like all 35,000 content reviewers at Facebook, the moderator is employed by a third-party company and is not a full-time Facebook employee. In addition to misinformation, content moderators are responsible for policing the platform for hate speech and some of the more violent and graphic content seen online, a responsibility that has caused many moderators to suffer from PTSD.

Along with Known Questions, or KQ, the other document moderators rely on is the Implementation Standards, known by many of the moderators as “the Bible.” The moderator said that the Implementation Standards codify the rules, while the Known Questions document is relied on to “know how to interpret” the rules.

Both of these are living documents and are constantly updated to reflect the latest Facebook policy. The Known Questions document was last updated a month ago, while the Implementation Standards were updated just four days ago.

In a section of the Known Questions document that discusses voter interference, Facebook’s rules say that moderators have to remove a comment that says “Liberals vote on Tuesday, Conservatives on Wednesday.” However, an identical comment that appends “JKJK” at the end is allowable.

Facebook’s approach to combating hate speech on its platform is to look at attacks on people based on what it calls protected characteristics, such as race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, age, gender identity, and serious disability or disease.

Facebook and its CEO Mark Zuckerberg have spent the last four years trying to convince lawmakers and critics that they would not see a repeat of 2016 when Russian trolls weaponized the platform to spread disinformation and Cambridge Analytica helped the Trump campaign suppress voters.

But it has not been smooth sailing for Facebook in the lead-up to the 2020 election, and a day out from Election Day it is still unclear how well Facebook’s new policies have been implemented or whether they will work at all.

The moderator, who has been reviewing content on Facebook for three years, said that even for them, interpreting the policies was difficult.

“For us, it's often very confusing,” the moderator said, adding that at the end of the day, the only thing that matters “is that we take the same action as the full-time Facebook employees. Facebook policies are mysteriously confusing and I don't think that's accidental.”