Facebook allows users to live stream acts of self-harm, post videos of violent deaths, share images of certain kinds of child abuse, and even post a video of an abortion, provided it contains no nudity; if it does, Facebook's army of moderators will take it down.

These are just a fraction of the rules Facebook moderators use as they battle to strike a balance between freedom of speech and avoiding real-world harm. The rules have been revealed in more than "100 internal training manuals, spreadsheets and flowcharts" reviewed by the Guardian and published on the newspaper's website.
The so-called Facebook Files reveal for the first time the inner workings of the social network's moderation policy, something Facebook and its CEO Mark Zuckerberg have come under increasing pressure over in recent years.

Responding to the publication of the Facebook Files, Monika Bickert, the company's head of global policy management, told VICE News: "We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."

The manuals deal with issues such as violence, hate speech, terrorism, pornography, racism and self-harm. They are used by Facebook's 4,500 human moderators and work in conjunction with the site's automated content filters, which focus on removing the most extreme content, such as child pornography and terrorism.

Here is how Facebook deals with certain types of content:
- Child abuse: According to the manuals, certain non-sexual child abuse content, including physical abuse and bullying, is permitted on Facebook. Images of such content are not actioned, while videos are marked as disturbing. Content with a celebratory or sadistic tone is banned.
- Nudity: Facebook says that sharing nudity and sexual activity in "handmade" art is allowed, but digital art showing sexual activity is not.
- Abortion: Facebook allows videos of abortion, but only if there is no nudity.
- Self-harm: Facebook will allow users to live stream video of themselves self-harming because it "doesn't want to censor or punish people in distress who are attempting suicide." However, once it is deemed that the person can no longer be helped, the footage will be removed.
- Animal cruelty: Facebook's rules allow users to post photos of animal abuse, such as humans kicking or beating animals, with only extremely upsetting imagery marked as "disturbing." Facebook says the reason is to raise awareness of what is happening.
- Threats of violence: Threats of violence are only removed in certain cases. A threat made against a head of state, or even a candidate for head of state, is automatically removed. Also in this protected category are certain law enforcement officials, activists and journalists. However, should a user say "To snap a bitch's neck, make sure to apply all your pressure to the middle of her throat" or "fuck off and die," these are not removed, as they are not seen as credible threats.