Facebook's QAnon Ban Is Too Little, Too Late

A new study found that 20% of Americans recognize and believe in at least one of four conspiracy theories that originated from QAnon.

Within hours of Facebook announcing it was going to ban QAnon from all its platforms, researchers who track these conspiracy movements were reporting that hundreds of groups, with hundreds of thousands of followers, had been wiped out almost instantly.

“Holy hell. Just had a look at my list of QAnon groups and pages on Facebook and it's a bloodbath out there,” Shayan Sardarizadeh, a disinformation researcher for BBC Monitoring, tweeted.

Those sentiments were echoed by researchers across the globe who had been warning the social network about the threat of QAnon for years.

While disinformation experts welcomed Facebook’s action, they also warned that the move to eradicate the movement from its platforms may have come too late: QAnon is no longer a single entity, but has mutated into a movement with multiple strands, ranging from the extreme belief that liberal elites are child-eating pedophiles to the softer #SavetheChildren campaigns, which make no reference to Q. Eradicating it from Facebook’s platforms won’t be as simple as banning groups that refer explicitly to QAnon.

After months of warnings from researchers and journalists, Facebook finally took action against QAnon on August 19, announcing that it would remove QAnon-related pages, groups, and accounts if they were found to be engaged in discussions about potential violence.

Facebook removed 1,500 QAnon groups and pages under this rule, but it was not enough.

So on Tuesday, it announced a much more severe ban, designating QAnon as a “militarized social movement.”

“Starting today, we will remove any Facebook Pages, Groups, and Instagram accounts representing QAnon, even if they contain no violent content,” the company said on its website.

Rather than relying on users’ reports to find these accounts and groups, Facebook will rely on its Dangerous Organizations Operations team, a specialist group within the company that tracks movements like QAnon and how they evolve to avoid detection.

But that’s not going to be easy.

“The Q ecosystem is built around coded, seemingly innocuous language, cross-platform shares of materials, and image-based memes and content like screenshots of tweets with no caption — so FB's action will need to account for this behavior and the likely reaction/attempts by Q communities to return to the platform using these methods,” Ciarán O’Connor, an analyst who tracks QAnon at the Institute for Strategic Dialogue (ISD), told VICE News.

Research published Tuesday, hours before Facebook’s announcement, backed up the theory that QAnon is merging with other conspiracies such as anti-mask, anti-5G, anti-vaxx, and anti-globalist communities.

The research from Brian Schaffner, a political scientist at Tufts University, found that, while they may never have heard of QAnon itself, 20% of Americans recognized and believed in at least one of four conspiracy theories that originated from it.

Still, Facebook’s actions have had an instant impact.

Researchers who are tracking QAnon on Facebook have uniformly reported that the majority of the accounts and groups they follow have vanished.

The biggest single impact of this will be to sharply curtail QAnon followers’ ability to recruit new members, many of whom become entangled in the conspiracy theory unwittingly through one of its less extreme versions.

While the initial takedown has been swift and widespread, Facebook and Instagram have so many users that it’s going to take time to find and eradicate all the content.

Unsurprisingly, people are still finding instances where the company’s ban has not been properly implemented.

“There are still plenty of Save The Children groups active, which formed as a direct result of QAnon trying to push the movement to a wider audience using diluted QAnon content,” said Aoife Gallagher, another analyst who tracks QAnon at ISD.

Experts are now calling on Facebook to follow through on its new policy, something the company has been criticized for failing to do in the past.

“I hope to see Facebook fulfill this new policy and stay abreast of the dog whistles and rebranding the QAnon community has become extraordinarily skilled at,” Nina Jankowicz, Disinformation Fellow at the Wilson Center, told VICE News.

Within hours of Facebook’s ban, other platforms were promoting themselves as new homes for Facebook’s lost QAnon believers.

One of those was Gab, a social network that positions itself as one of the only platforms that allows for truly free speech. It has become a haven for Nazis, the far-right, and most notoriously, it was the place where a man posted a violent manifesto before shooting 11 people at the Tree of Life Synagogue in Pittsburgh in October 2018.

“Gab is happy to announce that we will be welcoming all QAnon accounts across our social network, news, and encrypted chat platform,” the company’s CEO Andrew Torba wrote in an email sent to all users.

Gab and Parler do not have the reach or scale to replace Facebook and Instagram as recruitment platforms to grow the movement, but if QAnon followers move onto these more extreme sites, they could be radicalized even further.

“It's safe to say that a lot of moderate QAnon followers may not migrate to lesser-known platforms like Parler and Gab, which are certainly not as user friendly [but] because those platforms do not regulate hate speech and disinformation, people that do migrate to those platforms may be exposed to even more extremist material than what was being shared on Facebook,” Gallagher said.

Facebook’s former head of security Alex Stamos also drew a parallel between how ISIS operated online and how QAnon may now operate, suggesting that the most extreme members may migrate to a smaller home online, but continue to use Facebook as a recruiting tool.

But the most strident QAnon supporters are taking the Facebook ban in their stride, pointing out that this was all part of the plan.

Last month, the eponymous Q posted a message telling followers to “deploy camouflage” and avoid making any reference to QAnon on mainstream sites like Facebook and Twitter to avoid just such a ban.

Now that the ban is in place, QAnon supporters are using it as evidence that Q was right all along.

The QAnon accounts and groups that have survived on Facebook have flushed their pages of references to Q and are setting up backup accounts in case they, too, are purged.

QAnon followers on Twitter, rather than worrying that Twitter would follow Facebook’s lead and ban them, appeared too excited about the news that the Department of Justice and the FBI were holding a press conference Wednesday morning to worry about anything else.

Cover: In this May 14, 2020, photo, a person wears a vest supporting QAnon at a protest rally in Olympia, Wash., against Gov. Jay Inslee and Washington state stay-at-home orders made in efforts to prevent the spread of the coronavirus. QAnon is a wide-ranging conspiracy fiction spread largely through the internet, centered on the baseless belief that President Donald Trump is waging a secret campaign against enemies in the "deep state" and a child sex trafficking ring run by satanic pedophiles and cannibals. It is based on cryptic postings by the anonymous "Q," purportedly a government insider. (AP Photo/Ted S. Warren)