
Facebook’s Own Training Materials Fell for Fake News

This piece is part of an ongoing Motherboard series on Facebook’s content moderation strategies. You can read the rest of the coverage here.

From Alex Jones to Russian troll factories to Iranian influence operations, Facebook is facing increasing pressure to deal with its disinformation problem. But Facebook has fallen for ‘fake news’ itself. In an internal document obtained by Motherboard, the company falsely attributes an image to the recent Myanmar genocide, when the photo actually dates from an earthquake in another country years earlier.


The mistake, made in a document used to train some of Facebook’s content moderators and apparently originating from a years-old piece of fake news, is particularly ironic given that Facebook recently said it would remove misinformation on its platform that could lead to violence.

The image shows dozens of Buddhist monks, some wearing face masks and hair nets, surrounded by piles of dead, barely clothed bodies.

“The Body of Muslims slaughted [sic] By Buddhist,” a caption reads. Facebook’s own slide, which discusses content that may contain nudity or be disturbing, adds “This is a newsworthy exception for the victims of violence in Burma [Myanmar].” (Facebook sometimes allows images that would otherwise be flagged as sensitive or removed if they serve a broader public interest.)

In reality, the photo was taken by Getty Images photographer Guang Niu during the aftermath of the 2010 earthquake in Jiegu, China, Motherboard found through a reverse image search.

A screenshot of the misleading image included in a Facebook training document. Image: Screenshot
The original image, which shows the aftermath of an earthquake. Image: Getty

“The complexities (by its own design) of Facebook’s content moderation solution mean that the introduction of errors can happen at many points in the production chain,” Sarah T. Roberts, an assistant professor at UCLA who studies commercial content moderation, told Motherboard in an email.

Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, wrote in an email that “it could be an innocent mistake, it could also be serious bias. It definitely demands more transparency,” referring to potential biases from the people making the training manuals.

Over the years, numerous blogs and social media pages have shared the image, claiming it shows the massacre of Muslims in Myanmar. A 2012 article from Indian publication The Hindu explained that the photo went viral as ‘proof’ of Muslim persecution, and traced its correct origins to the earthquake. According to The Hindu, various local news organizations had incorrectly picked up the image in this context at the time. Motherboard confirmed with internet archives that a Pakistani news organization did host the image in the context of violence against Muslims in Myanmar. From there, the misleading use of the image spread to other publications, and led to emails and text messages calling for vengeance, The Hindu wrote.

Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

To be clear, this particular mistake or piece of misinformation does not undermine the very stark reality of the Myanmar military murdering thousands of Rohingya Muslims in the country, and displacing hundreds of thousands more from their homes. On Monday, Myanmar jailed two Reuters journalists for seven years; the pair had worked on a groundbreaking piece documenting the military’s role in a massacre. Facebook has been criticised for not acting quickly enough in stamping out hate speech on its platform in Myanmar, which arguably contributed to violence in the country.

But the earthquake image does still highlight the pervasiveness of fake news, and how it can penetrate even companies that are currently trying to deal with the issue themselves. Last month, Motherboard published an extensive investigation into Facebook’s content moderation strategies, relying on leaked documents and nearly two dozen interviews, which found that the company’s approaches are sophisticated but have significant holes. Many of these issues stem from Facebook’s gargantuan scale.

A Facebook spokesperson told Motherboard in a statement that “the image depicting victims of an earthquake that struck China several years ago was used incorrectly to train our reviewers on how to label content containing nudity. We are no longer using the slide and thank Vice for bringing it to our attention.”