TikTok’s recommendation algorithm is prompting users to follow far-right extremist accounts, including those linked to the conspiracy movement QAnon and militias like the Oath Keepers and the Three Percenters.
The revelation comes from an investigation by the media watchdog group Media Matters for America, which found that TikTok’s algorithm was recommending content and accounts that the platform claims it has prohibited.
The research found that TikTok continues to promote content from QAnon, the Oath Keepers, the Three Percenters, and the pro-Trump Patriot Party movement in the “For You” section of the video-sharing app.
The For You section is where TikTok’s much-vaunted algorithm puts content it believes a user will like.
Media Matters’ investigation found, however, that if a user follows one of the extremist accounts, the algorithm quickly recommends similar accounts, and also accounts from other far-right groups.
These accounts shared threats of violence, COVID-19 misinformation, and far-right conspiracy theories. “While the four movements identified are different, they appear to be circulating similar content,” a Media Matters spokesperson said.
One of the Three Percenter accounts claimed that violence was coming on July 4 in the form of a patriot revolt. Another Three Percenter user joined a video chain, writing “say when” to signal that he was ready to join a revolt when the time came.
One QAnon account last week shared one of the conspiracy theories about the Ever Given ship blocking the Suez Canal that were circulating widely within QAnon circles.
Media Matters found that after a user followed a QAnon account from the “For You” page, TikTok recommended another QAnon account. After the user followed that second QAnon account, TikTok recommended a Three Percenter account.
In the wake of former President Donald Trump’s loss in the November election and President Joe Biden’s inauguration, many QAnon followers felt disillusioned, and extremism experts worried they were vulnerable to recruitment by groups like the Three Percenters and the Oath Keepers.
Now, it appears TikTok is facilitating that route to further radicalization.
In total, the investigation found six common scenarios that show how following specific accounts from the For You page shapes the type of extremist content recommended by the platform’s “suggested accounts” algorithm.
“This is uniquely harmful because it has the potential to further radicalize people interested in these far-right extremist movements, and it doesn’t even require users to seek them out; TikTok hand-delivers the extremist movements to its users, many of whom are 14 or younger,” the report said.
TikTok told VICE News that it “works aggressively to stop the spread of disinformation and violence.” The company also said it deleted all of the accounts referenced in Media Matters’ report. However, at least three of the accounts referenced in the report are still active. And VICE News separately easily identified several accounts clearly linked to these groups.
This is not the first time TikTok has faced criticism over extremist content on its platform.
Last year, the platform saw a massive spike in content related to the Boogaloo movement, with videos using the #boogaloo hashtag garnering millions of views. The videos were full of teens and 20-somethings dancing with guns and joking about civil war.
TikTok has also made several attempts to rid its platform of QAnon content, first banning QAnon hashtags and then banning accounts that shared QAnon-linked disinformation. But those efforts were not totally successful: months after TikTok banned QAnon hashtags, a Media Matters investigation in October 2020 found more than a dozen QAnon hashtags being shared widely, racking up almost 500 million views combined.