
Anti-Vaxxers Are Learning How To Game TikTok’s Algorithm – And They’re Going Viral

A VICE World News investigation led to TikTok removing over 50 videos, many of which had been on the platform for weeks and been watched millions of times.
Photo: TikTok

“I just wanted to show you guys – I’m NHS staff,” says an unnamed woman, as she twitches and rolls her eyes back. “I had my COVID jab on the 17th of March, that left me with seizures and I was paralysed for two weeks. But it’s left me with tic attacks. And I just wanted to show you what happens.”

This TikTok spreading COVID misinformation was seen 1.2 million times, and the user, @palestine_eye_of_tiger2, gained several thousand new followers in just a few days. It’s one of more than 50 videos the account posted over August and September featuring clips of people appearing to have seizures and describing their symptoms as an adverse effect of the COVID vaccine.


In other videos posted by the account, the people shaking and twitching make no such claim, but they carry the same text emblazoned across each video – “safe and effective Pfizer” or “safe and effective Moderna”. Adverse reaction conspiracy theories around the COVID vaccine have been rife this summer, such as a viral video from influencer Dominique De Silva which has been debunked by fact checkers. While rare, symptomatic seizures can be a sign of a blood clot affecting the brain; you are far more likely to get a blood clot from COVID-19 than from the vaccine, and there is no link between the vaccine and tic disorders.

VICE World News has been unable to verify the identity of the woman featured in the video with 1.2 million views, or that of the owner of the TikTok account that posted it.

When VICE World News reported the video in-app as COVID-19 misinformation, we received a notification a few hours later that it didn’t break community guidelines. It was one of three of @palestine_eye_of_tiger2’s videos which had been seen over a million times, while many of the others had between 10,000 and 100,000 views. When VICE World News showed the videos to TikTok, the account was deleted and they were removed – yet they had been visible on the platform for weeks and had amassed millions of views.

None of this content should be reaching users on TikTok, 2020’s most downloaded app around the world and a platform that claims it is doing everything it can to fight misinformation. So how did videos from @palestine_eye_of_tiger2 spill onto TikTok users’ For You Page and go viral – and why didn’t TikTok notice?

Photo: LIONEL BONAVENTURE/AFP via Getty Images


Ciarán O’Connor, an analyst at the Institute for Strategic Dialogue, thinks the account’s success was down to both the sheer critical mass of content and camouflage. “My guess is the videos come from sites like BitChute or, more likely, some devoted Telegram channel that offers a feed of ‘COVID vax reaction’ videos,” he said. TikTok is becoming “a go-to space to post off-platform videos, produced elsewhere and devoid of crucial context, but are nonetheless framed by conspiracy theorists and communities as alarming evidence of supposed adverse reactions and all serve to encourage fear among the viewers.”

What @palestine_eye_of_tiger2 managed to do is trick the app into thinking it wasn’t doing anything wrong. Firstly, the account avoided writing any text that explicitly refers to COVID-19 – something that would normally prompt TikTok’s AI to automatically attach a COVID-19 tag, which directs users to a fact-checked information hub in-app. Since August 17th, the account posted around 60 videos about vaccines and accumulated well over 3 million views, yet not a single one featured this COVID information tag – a feature TikTok promised in a blogpost in December 2020. “That’s a pretty big failing for me,” said O’Connor.

The user had, however, written “Moderna”, “Pfizer” and “AstraZeneca” in several videos, suggesting that TikTok hasn’t included the names of the vaccines themselves as triggers for the COVID tag.


“The content itself is of course misleading,” O’Connor said. One video shows Denmark footballer Christian Eriksen’s collapse during Euro 2020, which was not related to vaccines, and another shows footage that was debunked by the Associated Press last year. “It does show the account is trading in false content. This isn’t a user posting from their bedroom about their lived experience – they are producing a conveyor belt of COVID conspiracy videos.”

Every video had the same motif, the on-screen text “safe and effective”, which sat in stark contrast to the actual video content. By avoiding inflammatory or critical language about the vaccines – and actually including text that confusingly praised them – the account appeared to be camouflaging its videos from moderation.

The account also used TikTok Sounds, the in-app audio catalogue, to mask the original video’s audio. With the two playing at the same time, the fact that the audio appears to come from another user might also have helped the account evade action.


Screen grabs of the TikTok account before it was removed. Photo: TikTok

TikTok has tried to counter misinformation on its platform; it works with third-party fact-checking organisations, and its in-app information hub on coronavirus has had billions of views. But the fact-checkers aren’t making TikToks that go nearly as viral as those like @palestine_eye_of_tiger2’s. PolitiFact, which has been active for over a year on TikTok, still only has 5,000 followers – @palestine_eye_of_tiger2 was able to gain nearly three times that in just one month.


Nor is TikTok stopping new conspiracy theories from spiralling on the app; last week, Rolling Stone found that the TikTok algorithm was promoting videos that falsely touted ivermectin, the horse deworming medication, as a COVID cure.

O’Connor said he believes this isn’t @palestine_eye_of_tiger2’s first account; he found that a @palestine_eye_of_tiger1 used to exist but was banned for violating community guidelines. Considering there’s evidence remaining on TikTok that this earlier account posted similar videos to those of “tiger2”, he’s probably not wrong. Could it be that this really is the same person, freely spreading misinformation despite TikTok’s attempts to ban them?

When VICE World News approached TikTok with the videos, a spokesperson said: "We strive to promote an authentic TikTok experience by limiting the spread of misleading content, including audio, and promoting authoritative information about COVID-19 and vaccines across our app. This account violates our Community Guidelines and we have removed it from TikTok."

TikTok did not respond to our questioning over how the videos were amplified on the For You Page. “There’s such a lack of transparency in that algorithmic assistance,” said O’Connor. “As researchers we don’t understand how it’s being promoted or taking off. We don’t know how, or why, this is on TikTok’s For You Page.” 

In the first quarter of 2021, TikTok announced that it had removed over 30,000 videos for promoting COVID-19 misinformation, and that 61% of them were removed at zero views. But it’s not clear how many others, like @palestine_eye_of_tiger2’s, are managing to slip through the gaps – and getting millions of views in the process.