News

TikTok Is Flooding Vulnerable Teenage Girls With Self-Harm Content: Report

Research claims that the TikTok algorithm takes just 2.6 minutes to show vulnerable girls videos touting restrictive eating plans and self-harm content.
A teenager holds a smartphone with the TikTok logo on January 21, 2021 in Nantes, France. (Photo by LOIC VENANCE / AFP via Getty Images)

TikTok’s algorithm is inundating vulnerable children as young as 13 with self-harm and eating disorder videos within minutes of their joining the platform, and the company appears to be doing nothing to stop it, according to new research published Thursday.

The research, from the Center for Countering Digital Hate (CCDH), a British non-profit organization, states that TikTok’s powerful recommendation algorithm takes just 2.6 minutes to show vulnerable teenage girls content that includes explicit pro-thinness videos, restrictive eating plans, and self-harm content.


The researchers found that the accounts they set up featuring characteristics of vulnerable teenagers were served these kinds of videos 12 times more than standard accounts.

“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of CCDH, wrote in the report, titled Deadly by Design.

In the U.S., TikTok has overtaken Instagram and Facebook as the social media platform of choice for young people, with two-thirds of all teenagers using the platform for an average of 80 minutes every day. The company’s popularity has been driven largely by its powerful recommendation algorithm, which serves up content for users in their “For You” feed and is the primary way users interact with the app.

“It’s just a beautiful package, but absolutely lethal content,” Ahmed said. “It’s the speed and ferocity of the algorithm that makes it so dangerous.”

VICE News reviewed some of the footage that CCDH researchers found being served to the “For You” feeds of the teen girl accounts they set up. Several of the videos featured razor blades and discussions of self-harm, while one video, filmed from a hospital bed, featured the question: “You’re not even that fat, why are you so insecure?” and the user’s response: “I see things THAT NOBODY else sees.”


The most extreme content included videos filmed by members of the eating disorder community, covering topics like a teenage boy returning to school after attempting to die by suicide, teenage girls discussing methods of self-harm, and “thinspo,” or thinspiration, videos aimed at motivating weight loss. In one instance, a vulnerable teen account was shown three videos of other teens discussing suicide plans within a single minute of scrolling.

“It’s like being stuck in a hall of distorted mirrors, where you’re constantly being told you’re ugly, you’re not good enough, maybe why don’t you kill yourself. It is a really full-on experience for those kids, because it’s just rushing at them. And that’s the way the TikTok algorithm works,” Ahmed told reporters at a press briefing this week.

TikTok disputed the findings of the research, claiming that the methodology did not “reflect genuine behavior or viewing experiences of real people.” 

A spokesperson for the company said it “regularly consults with health experts, removes violations of our policies, and provides access to supportive resources for anyone in need.” TikTok added that it was in the process of removing the content flagged in this report that violated its policies.

CCDH has published a TikTok Parents Guide to help parents of teenagers affected by these issues spot troubling behavior and understand how the app works. But Ahmed believes lawmakers should take the lead, and just this week a bipartisan bill was introduced in Congress that, if passed, would ban the app completely in the U.S. over fears of Chinese government influence over TikTok.

“This is a classic moment for regulators to step in and say we need to have some sort of non-proliferation agreement, that we won’t just create algorithms that are more and more addictive and more and more lethal,” Ahmed said. “But we don’t have that. We have nothing, we have no guidance from the government or from regulators, from the FTC, from anywhere else.”