News

TikTok Is Pushing Incel and Suicide Videos to 13-Year-Olds

It only takes ten minutes on the platform for TikTok’s powerful algorithm to start suggesting extremist content to kids, a new report found.

Within minutes of TikTok’s youngest users signing up for a new account, the platform’s powerful algorithm is bombarding teens with extremist content, including videos promoting suicide and the virulently misogynistic incel subculture, according to new research published today by corporate accountability group Ekō and shared with VICE News.


Despite TikTok’s promises to crack down on this kind of content, it’s still easily discoverable by new users who want to seek it out. And TikTok’s recommendation algorithm is so advanced that it will begin pushing increasingly extreme content into the feeds of new users after they’ve used the app for just 10 minutes, the new research states.

The researchers set up nine new accounts on TikTok, which, with 150 million active users, has replaced Instagram and Facebook as the de facto social media platform for teenagers in America. They registered each account as belonging to a 13-year-old, the youngest age at which users can join the platform, and then mimicked users curious about topics like incel content.

The researchers found that after viewing just 10 videos related to these topics, the “For You” pages of the new accounts were filled with similar, and often more extreme, content.  

One test account was shown a post that included a clip of Jake Gyllenhaal, whose films have been popular amongst incels. The video shows the actor with a rifle in his mouth saying, “Shoot me. Shoot me in the fucking face,” alongside text that reads: “Get shot or see her with someone else?” 

The video, which has now been removed, racked up more than 2.1 million views, 440,000 likes, 7,200 comments, and 11,000 shares. Most commenters supported the suggested suicide, while other users shared commentary about their loneliness, many suggesting they “feel dead inside.” One commenter even suggested they would take their own life within the next four hours.


“Ten minutes and a few clicks on TikTok is all that is needed to fall into the rabbit hole of some of the darkest and most harmful content online,” Maen Hammad, Ekō campaigner and co-author of the report, said in an emailed statement to VICE News. “The algorithm forces you into a spiral of depression, hopelessness, and self-harm, and it’s terribly difficult to get out of that spiral once the algorithm thinks it knows what you want to see. And it’s extremely alarming to see how easy it is for children to fall into this spiral.”

In a statement issued to VICE News in response to Ekō’s report on Tuesday, TikTok said: “We work hard to prevent the surfacing of harmful content on our platform by removing violations.”

Ekō’s research is just the latest revelation about TikTok’s failure to tackle problematic content on its platform, including self-harm, terrorism, and disinformation. And despite repeated promises to do better, TikTok’s policies have so far failed to stem the spread of this content.

Incels, or “involuntary celibates,” are a large online community, overwhelmingly made up of young men, who have bonded over their lack of sexual success with women. The incel subculture is part of the “manosphere,” an umbrella term for the interconnected misogynistic communities typified by the content produced by accused rapist Andrew Tate. Incel communities began on forums such as Reddit and 4chan, but in recent years they have successfully built their networks on more mainstream social media platforms, such as YouTube and TikTok.


The report found that for 13-year-olds joining the platform, finding content related to incel culture was incredibly easy using dozens of different hashtags, particularly content related to Elliot Rodger, the 22-year-old who killed six people in Isla Vista, California, in May 2014 after posting a video to YouTube in which he complained about being a virgin and having never kissed a girl. He is viewed as a hero in the incel community.

That video has been banned from all major social media platforms, but Ekō researchers were able to find multiple copies of it circulating on TikTok.

One post celebrated the shooter with a version of the video overlaid with text that read: “To my fav Murderer.” Another video, in which he can be heard saying, “Tomorrow is the day of retribution… If I can’t have you, girls, I will destroy you,” has been viewed millions of times on TikTok.

TikTok has known about the spread of incel content on its platform for years. A report from the Institute of Strategic Dialogue (ISD), published in August 2021, highlighted the problem of incel content on TikTok, and noted that videos of the Isla Vista shooter in particular were being shared widely. In a statement to VICE World News at the time, TikTok promised to “work aggressively to combat hateful behavior by proactively removing accounts and content that violate our policies.”


After the 2021 report was published, the company did remove some of the videos highlighted in the report that violated its community guidelines, and TikTok has banned certain search terms, such as “incel” and “blackpill,” a term describing a belief, popular among incels, that looks are genetically determined and that women choose sexual partners based solely on their physical features. But VICE News was still able to find many more copies of the Isla Vista shooter’s video on the platform.

TikTok disputed the new report’s findings, claiming it does “not reflect genuine behavior that we see on TikTok.” But there is growing evidence that the misogynistic worldview of figures like Andrew Tate is taking hold among children as young as 12.

“What is clear from ongoing monitoring of hateful and extremist ideologies, content and creators on TikTok is that misogyny is still openly posted and promoted by users with apparent little resistance on the platform,” Ciaran O’Connor, a researcher at ISD who first flagged the problem of incel content on TikTok in 2021, told VICE News. 

O’Connor said that in the space of a few minutes he was able to find content praising the shooter, which suggests that the rudimentary steps TikTok has taken to tackle this content are just not good enough.


In addition to widely available incel videos, TikTok is rife with other problematic content. Last month a report showed the platform promoting self-harm content to teenage girls, while a report from ISD last week found that it is full of content glorifying the Christchurch massacre.

“It’s reflective of a wider failure on TikTok that users determined to post or promote hateful figures or ideologies are adept at using evasion tactics to bypass simple keywords bans on TikTok,” O’Connor said. “We have repeatedly called on TikTok to evolve its policies beyond narrow hashtag and keyword bans yet, as documented in other research related to terrorist content, it appears users are exploiting this enforcement gap with ease.”

One of content creators’ most widely used tactics for avoiding the blocks TikTok puts in place is to deliberately misspell search terms, while the use of more specific hashtags makes it very easy to find incel content on the platform.

“It took me about 2 minutes of searching for you to find content that praises Elliot Rodger, for example, by using the hashtag #incelcore,” Tim Squirrell, head of communications and editorial at ISD, told VICE News. Squirrell said that sending young users down these rabbit holes could have dire consequences.

“One of the side effects of having to search for more specific terms in order to access incel content is that you often find yourself accessing other kinds of content that people are trying to hide,” Squirrell said. “So for example, if you search for the name of a famous incel-adjacent 4chan board, you can very quickly find yourself watching Roblox reenactments of gore videos.”

But even if TikTok does tackle these issues, it is facing a possible ban in the US over its parent company ByteDance’s links to the Chinese government, and later this week its CEO, Shou Zi Chew, will appear before Congress for the first time.
