Facebook Knew It Was Fueling QAnon

A damning whistleblower report reveals how Facebook mishandled the rise of QAnon—and other militarized social movements.
A protester yells inside the Senate Chamber on January 6, 2021, in Washington, DC. (Win McNamee/Getty Images)
Disinfo Dispatch: Unraveling viral disinformation and explaining where it came from, the harm it's causing, and what we should do about it.

Facebook knew its recommendation algorithm was a problem, so it set up a study to test out just how bad the problem was.

In the study, entitled “Carol’s Journey to QAnon—A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” Facebook set up some brand-new accounts.

Each of these accounts followed a small number of high-quality or verified conservative-interest pages, such as Fox News, former President Donald Trump, and former first lady Melania Trump. The study found that “within just one day, Page recommendations had already devolved toward polarizing content.” 

Within two days, recommendations began to include conspiracy content. “It took less than 1 week to get a QAnon recommendation,” the report found.

The shocking revelations about how Facebook mishandled the rise of QAnon—as well as other militarized social movements—are detailed in one of eight whistleblower complaints filed by former Facebook product manager Frances Haugen with the Securities and Exchange Commission last week and published by CBS on Monday evening.

The revelation about how quickly new accounts can become radicalized is contained in a complaint focusing on Facebook’s role in the 2020 election and the Jan. 6 insurrection at the U.S. Capitol.

But the complaint reveals much more about how Facebook failed to recognize the threat posed by QAnon on its platform. It also reveals that employees were exasperated by the company’s continued failure to act on that threat.

For years, Facebook ignored the warnings of extremism researchers who flagged the potential threat posed by the rise of QAnon on mainstream platforms like Facebook and Instagram.

This escalated in 2020, as the pandemic saw a rapid uptick in the number of people being radicalized into QAnon conspiracies through Facebook. Yet Facebook only acted when it was too late, and in some cases it even reversed decisions that had helped limit the spread of QAnon.

According to Haugen’s complaint, Facebook knew its ability to connect people in huge numbers via its Groups was facilitating the rise of QAnon. 

“The QAnon community relied on minimally connected bulk group invites. One member sent over 377,000 group invites in less than 5 months,” according to an internal Facebook report entitled “Harmful Non-Violating Narratives.”

The report’s author noted that Facebook’s own engineers had found ways to slow this growth. One way was by limiting the number of invites a user could send out in a single day to 100, a measure introduced ahead of the U.S. election in October 2020.  

“[However] we have rolled back the pre-election rate limit of 100 Group invites/day due to it having significant regression on Group growth,” the report stated.

In the same report, the company admitted it had allowed QAnon to fester and grow on its platform and that its own policies allowed this to happen.

“Through most of 2020, we saw non-violating content promoting QAnon spreading through our platforms,” the report’s authors wrote. “Belief in the QAnon conspiracy took hold in multiple communities, and we saw multiple cases in which such belief motivated people to kill or conspire to kill perceived enemies.”

The report added that Facebook’s “policies don't fully cover harms” and that the company implements policies “for many of these areas that limit our ability to act.” It also flagged that high-profile accounts, such as lawmakers and celebrities, “were able to serially spread claims without crossing our falsifiable misinformation-based lines for enforcement.”

The report says the company’s own systems make it difficult to effectively tackle issues like QAnon. “We've often taken minimal action initially due to a combination [of] policy and product limitations making it extremely challenging to design, get approval for, and roll out new interventions quickly.”

While Facebook publicly stated that it did everything in its power to stop the spread of QAnon, its own employees knew differently.

One employee who left the company in late 2020 wrote a leaving statement—known internally as a “badge post”—that blasted Facebook’s executives for failing to act more quickly on the QAnon threat.

“I've seen promising interventions from integrity product teams, with strong research and data support, be prematurely stifled or severely constrained by key decision-makers—often based on fears of public and policy stakeholder responses. For example, we've known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” the employee wrote, according to Haugen’s complaint.

“While the Recommendations Integrity team has made impressive strides in cleaning up our recs, [Facebook] has been hesitant to outright ban/filter conspiracy groups like QAnon until just last week. In the meantime, this fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only after things had spiraled into a dire state.”