Facebook just updated its community guidelines to clamp down on misconduct on the site, which means that scrolling through your news feed will be that much less interesting, but probably a little bit safer; it's a mixed bag of rules that are likely to be beneficial to users and restrictions that could ultimately censor them.

The new community guidelines were revealed today in a blog post, which stated that the revamped guidelines are meant to clarify Facebook's stance on longstanding restrictions—like those on hate speech, for example—and introduce new ones.
The site also released its transparency report for the second half of 2014, which outlines how many requests the site receives from governments around the world for user data or to restrict content. According to the report, government requests to restrict content are up.

So, how much of this applies to you, and should you be worried about any of the new restrictions on what you can post? Here are the highlights from Facebook's new guidelines.
Revenge Porn Is Finally Banned

Facebook is finally banning pornographic images posted without a user's consent, a vile practice commonly known as "revenge porn." Both Reddit and Twitter banned revenge porn on their platforms this year—in February and March, respectively. So Facebook is a little late to the game in banning something that's been a serious problem for a long time. You could also argue that its restrictions don't go far enough.

The rules still place the onus on victims to report images of themselves to Facebook only after they find them on the site. They will then have to wait for Facebook staff to review the report and decide whether to take the offending image down. Reddit took a similar approach, which was slammed by victims' rights advocates.

Supporting "Dangerous Organizations" Is Not Allowed

Openly supporting "dangerous groups," which Facebook defines as those involved in "terrorist activity" or organized crime, is now banned on the site. Praising, condoning, or otherwise supporting their leaders or their actions is also taboo.
"We welcome broad discussion and social commentary on these general subjects," the Facebook blog post says, "but ask that people show sensitivity towards victims of violence and discrimination."

Facebook has a history of taking down pages that support the Islamic State, for example, which makes this move unsurprising. The problem is that, as always, the definition of "terrorist activity" is up for debate. Under Canada's proposed Bill C-51, for example, the government would expand anti-terror legislation to cover any group targeting infrastructure or the country's economy. Recently uncovered documents revealed that Canadian law enforcement already sees environmental activists as carrying out "criminal extremism."

Since Facebook abides by the laws of the countries it operates in, it's entirely within the realm of possibility that a post supporting an aboriginal direct action protest group could be considered a violation of Facebook's community guidelines.

No Butts, Breasts, or CGI Nudity

Facebook has finally clarified its rules around posting nudity, and they're a little bit prudish. According to the new community guidelines, no photos of genitals, fully exposed butts, or breasts with the nipple exposed will be allowed on the site from here on out. Notable exceptions include sculptures, paintings, breastfeeding, and exposed breasts with mastectomy scars—a nice nod to moms and breast cancer survivors who may want to use the site to promote awareness.
But, in a bit of inexplicable weirdness, Facebook also unequivocally bans any kind of computer-generated nudity unless it's for humour, education, or satire. The artistic allowances made for non-digital representations of the human body apparently don't apply to design and animation on Facebook.

This rule pretty much validates user flight from Facebook to other, more liberal platforms like Ello. The upstart site made its name by placing precious few limitations on users regarding nudity and pornographic images. Then again, Ello seems to be marketed towards a hipper, more ostensibly radical crowd, while Facebook is likely where your mom and weird cousin hang out.

Graphic Images Shared for Your "Sadistic Pleasure" Will Be Removed

Photos depicting gory or violent scenes are also banned on Facebook, but the company now makes a distinction between images shared to raise awareness of human rights issues, for example, and those shared for "sadistic pleasure."

Presumably, this means that users can report people who share gore just for the lols and Facebook will act to remove the post. This rule seems pretty innocuous on the surface, but the absence of caveats for art or satire—which Facebook makes for nudity—is notable in the case of images depicting violent acts. Whether or not art can be made from violence is certainly up for debate, but Facebook appears to have closed the door on that possibility by default.
You Can't Publicly Shame People

The internet was supposed to be an empathy machine, but everybody seems to love a good public shaming these days. Mostly, this kind of thing happens when someone does, says, or posts something really terrible and the internet freaks out about it. "Shame culture" is even having its own backlash moment, with author Jon Ronson publishing a new book called So You've Been Publicly Shamed.

This restriction is explicitly meant to stop the creation of pages targeting "private individuals," which Facebook defines as "people who have neither gained news attention nor the interest of the public, by way of their actions or public profession."

The new rule appears to address the kinds of issues that Ronson covers in his book, like the case of Justine Sacco, a PR professional who lost her job (and quickly found a new one) after being shamed for tweeting that she wouldn't get AIDS in Africa because she's white. Public shame, a time-honoured tradition since the first stocks, will have to live somewhere other than Facebook from now on.

Governments Are Asking Facebook to Restrict More Content

All of these restrictions could have impacts at the level of government censorship. According to Facebook's transparency report, which was released at the same time as the new guidelines, the number of government requests for user data has remained fairly consistent, but instances of governments requesting that certain types of content be restricted have increased.
"The amount of content restricted for violating local law increased by 11 percent over the previous half, to 9,707 pieces of content restricted, up from 8,774," the blog post states. "We saw a rise in content restriction requests from countries like Turkey and Russia, and declines in places like Pakistan."

Pakistan's mention is notable, since the country is well known for its frequent requests for content removal on Facebook. India and Turkey, nations notorious for censoring social media content within their borders, still lead the pack.

This finding is concerning in light of the new community guidelines that prevent support for "dangerous organizations," in particular. That definition is extremely malleable, and could conceivably be used to silence protesters or critics, like in the aforementioned hypothetical case of Canada's Bill C-51.
Facebook has historically been reticent to release details on the kinds of takedown requests it receives and how it evaluates them. This hasn't changed. Although Facebook's blog post states that the company will "continue to scrutinize each government request and push back when we find deficiencies," it doesn't offer much more information on how requests are assessed.

The transparency report itself is little more than a list of countries and the number of requests they filed. Other companies, like Google and Twitter, publish the actual documents they receive from governments asking them to restrict content.

Will the new community guidelines, in concert with increased requests for content restrictions on the part of governments, result in more posts, pages, and comments being removed from the site? We'll just have to wait and see. If you don't want to bother with the whole mess, you could always use another site that won't place limits on what you have to say—but will you, really?