Here Are All the Facebook Papers Stories You Need to Read

The stories about the massive trove of leaked Facebook documents just keep coming. Here’s a guide to help you navigate them.
Frances Haugen, a former Facebook employee, testifies during the Senate Commerce, Science and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security hearing titled Children's Online Safety on Tuesday, October 5, 2021. (Tom Williams/CQ-Roll Call, Inc via Getty Images)


Facebook has been in the news a lot recently, but over the past few days alone, more than a dozen news organizations published over 50 new stories based on leaked internal Facebook documents.


These stories depict a company in turmoil. Facebook is undermining democracy across the globe while putting its users at risk—and it knows it. Meanwhile, its employees are screaming at their bosses that problems on the platform are not being addressed fast enough, while its leaders continue to deny the scale of those problems.

That these stories are based on Facebook’s own words and reports is all the more damning. 

It can be overwhelming to even know where to begin tracking everything that’s been revealed, where the stories are coming from, who is behind them, and what they all mean. And the torrent of stories is unlikely to end any time soon. So, here’s a guide to help you navigate them.

What are the Facebook Papers?

The papers are a trove of 11 million documents consisting of internal Facebook files and reports, as well as criticism shared by the company’s employees with their colleagues on Facebook’s own internal social network, which is called Workplace.

Where did they come from?

The documents were collected by Frances Haugen while she was working as a product manager at the company. Haugen grew dismayed by Facebook’s failure to protect users and spent months gathering the damning evidence before leaving her job in May 2021.

Haugen provided the documents as part of a series of complaints she filed with the Securities and Exchange Commission this month.

Why were all the stories published at the same time?

Haugen first shared the trove of 11 million documents with the Wall Street Journal, which published them in a series of articles called the Facebook Files beginning in September, citing Haugen as an anonymous source.

Haugen then agreed to go public, appearing in an interview on 60 Minutes on October 3 before testifying in front of Congress two days later. Deciding that the Wall Street Journal had not published enough stories on the files, she then made them available to many more journalists.


Through a PR company run by a former aide to former President Barack Obama, Haugen invited 17 news outlets to form a consortium and gain access to the documents. 

The consortium, which added new members on Monday, has not yet been given access to all the documents. The files are first being redacted to remove the names of Facebook users and low-level employees before being handed to Congress, which is also investigating the leak. That means more stories are likely to trickle out for many weeks to come.

What are the big takeaways?

The rest of the world: The biggest story to emerge from the leaked internal documents—and Haugen’s stated aim in bringing them to light—is the fact that Facebook prioritizes users in the English-speaking world, and particularly in the U.S., to the detriment of everyone else. While this was already pretty well known, the scale of that failure is still mind-blowing. According to a 2020 report cited by the Washington Post, Facebook allocated 84% of its budget for fighting misinformation to the U.S., which represents less than 10% of its global user base. The rest of the world had to make do with the remaining 16%.

The results of this lack of investment in markets outside its home country have been catastrophic, according to the leaked internal documents. For example, in Ethiopia, the company knew its platform was being used to incite violence, but it did little to stop the spread of hate speech. In Vietnam, the company bowed to government pressure to censor posts critical of the government rather than risk losing an estimated $1 billion in revenue in one of its fastest-growing markets. Meanwhile in India, the company’s apps, in particular WhatsApp, have been used to spread inflammatory hate speech during times of real-world violence, a 2020 internal report found.


Employee anger: One of the most common themes across many of the stories published this week is that Facebook’s own employees knew about the problems facing the company and tried to voice their anger. “It is not normal for a large number of people in the ‘make the site safe’ team to leave saying, ‘hey, we're actively making the world worse FYI.’ Every time this gets raised it gets shrugged off with ‘hey people change jobs all the time’ but this is NOT normal,” one employee wrote on Workplace. 

Five months before the Jan. 6 Capitol riot, a departing employee wrote that the company was “willing to act only *after* things had spiraled into a dire state.” When Zuckerberg posted a message expressing sadness about what happened in Washington on Jan. 6, one employee responded: “We’re FB, not some naive startup. With the unprecedented resources we have, we should do better.” Politico has a good rundown of the angry comments posted by Facebook workers to its internal network.

Radicalization: Facebook knew that it had a problem with radicalization on the platform, and so in 2019 it hired an independent researcher to carry out tests on the platform. One of those tests involved setting up a new account that followed figures like former President Donald Trump and brands like Fox News. Within two days, Facebook’s recommendation engine was suggesting the account follow QAnon groups. The researcher concluded that Facebook’s algorithms were driving people down online “rabbit holes” and exposing them to “a barrage of extreme, conspiratorial, and graphic content.” And yet, Facebook didn’t begin banning QAnon groups and accounts until late 2020.


The insurrection: In the days after the Capitol insurrection, Facebook’s Chief Operating Officer Sheryl Sandberg attempted to downplay the company’s role in the event, saying it was primarily planned and organized elsewhere online. The leaked documents, however, suggest otherwise. CNN reported that an internal report revealed Facebook failed utterly to see what was happening on its own platform with the Stop the Steal movement, looking only at individual posts and groups rather than at the threat posed by the movement as a whole.

“Almost all of the fastest growing FB Groups were Stop the Steal during their peak growth,” the report’s authors wrote. "Because we were looking at each entity individually, rather than as a cohesive movement, we were only able to take down individual Groups and Pages once they exceeded a violation threshold. We were not able to act on simple objects like posts and comments because they individually tended not to violate, even if they were surrounded by hate, violence, and misinformation." Facebook did ultimately change its approach, but it was too late. Meanwhile, the New York Times highlighted the fact that the documents show that the social network’s employees “repeatedly raised red flags about the spread of misinformation and conspiracies before and after the contested November vote.” 

Where can I read more?

Here are all the stories published so far based on the Facebook Papers leak: