Facebook Needs to Fix Its Censorship Double Standards

As Facebook admits its algorithmic downfalls, hate speech still grows.
October 24, 2016, 2:00pm
Image: Shutterstock

Facebook has finally conceded that the algorithms it uses to decide whether or not a post is "acceptable" may not be very good, and has announced that over the next few weeks it will start allowing more items that people find newsworthy and significant onto the platform, even if they violate its community standards.

The about-face comes as the social network has in recent months come under fire for deleting posts such as the iconic Vietnam War image of a napalm-burned Kim Phúc and a Le Monde news feature that included an image of a cancer victim's mammogram.

It's understandable that dealing with the subjective nature of historically and culturally significant images or news stories is a complex task, but it has become clear that the task should not be left to computer algorithms at their current stage of intelligence. Facebook also has to deal with differing cultural norms and laws in countries around the world—another problem that is not yet best left to algorithms. On top of this, as I explained in September, Facebook must not overstep its role as a news aggregator to become a gatekeeper.

Read more: Tomorrow's Wars Will Be Livestreamed

"In the weeks ahead, we're going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards," wrote Facebook's vice president of public policy Joel Kaplan and global operations vice president Justin Osofsky. "We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them."

The duo said that Facebook has yet to work out exactly how it will do this, but presumably it will mean more human intervention in vetting images or posts flagged as inappropriate. Motherboard has asked Facebook for more information, but the company had nothing more to add at this time.

"Send them into the gas chambers"

But the news comes as Facebook enters an even trickier phase of its content strategy—one of broadcasting live wars, suicides, and, in today's case, the destruction of a refugee camp via Facebook Live.

I was brought to tears as the live comments and emojis rolled in while a camera panned across a grey, depressing scene at the Calais refugee camp in France, which is today being demolished. "Drone strike and done," commented one Facebook user. "All terrorist!" wrote another. The comment that pushed me over the edge was simply, "Send them into the gas chambers."

Is this kind of hate speech acceptable, but an image of a cancer victim's breast isn't? Facebook has a long way to go, and so does society.
