Users on the mega-popular children’s lip-synching app TikTok are sharing calls for violence against people of colour and Jews, as well as creating and sharing neo-Nazi propaganda, Motherboard has found.
Some account names read, verbatim, “kill all n*****,” “all jews must die,” and “killn******.” (The words are uncensored on the app, a sort of melding of Vine and Instagram that lets users create short videos synced to music.)
Motherboard found the content on the Chinese-made app, which is used by hundreds of millions of people, including many teenagers and children in the United States, within minutes of starting a basic search.
“We’ve never talked to Tik Tok, but clearly we need to,” Heidi Beirich, director of the intelligence project at the Southern Poverty Law Center (SPLC), told Motherboard in an email. “They need the site to be cleaned up—and now.”
The news signals social media platforms’ continued reckoning with hate speech. But even with Facebook mishandling its approach to white nationalism, and Discord providing a haven for serious neo-Nazi groups, TikTok is doing a particularly bad job at moderating white supremacists on its platform.
The hate speech material on TikTok is varied. Some accounts signaled support for Atomwaffen, a violent neo-Nazi group linked to multiple murders across the United States. One account Motherboard found was called “Race War Now.” Another account’s profile photo was an antisemitic caricature depicting a Jewish person as a greedy rat.
One video contained a succession of users making Nazi salutes. Another video included the message, “I have a solution; a final solution,” referring to the Holocaust.
Hashtags include 1488, a number significant to neo-Nazis that combines “14,” shorthand for the white supremacist “14 words” slogan, with “88,” code for “Heil Hitler,” as well as Sieg Heil, the infamous Nazi salute.
One TikTok video Motherboard found, which encourages viewers to read Siege, a book popular with neo-Nazis, included the hashtag #FreeDylannRoof. Roof was given nine consecutive life sentences for the massacre of nine African Americans at the historically Black Emanuel African Methodist Episcopal Church in Charleston, South Carolina in 2015.
“It’s just outrageous and dangerous, given how many young people, like Dylann Roof, have been radicalized online and then shifted to violence,” Beirich said.
TikTok merged with the app Musical.ly in August, after ByteDance purchased the latter in 2017. The app has garnered wide praise both from its army of users and media outlets; the New York Times recently described TikTok as “the only good app,” and the Verge called it “joyful.”
When contacted by Motherboard, TikTok did not provide a statement in time for publication.
Beirich said what Motherboard found “is horrifying. That is especially true since this service targets children and I can’t think of worse things to be putting in front of them.” Some of the people featured in and sharing the offensive videos appear to be children. Some of the accounts claim the posts are a “joke” or “ironic,” but as Motherboard has reported multiple times, these “jokes” can and do radicalize real people, and they harm already marginalized groups regardless.
Caroline Sinders, a research fellow with Digital HKS who has studied online extremism, told Motherboard in an online chat, “I don’t think it matters even if something is a humorous joke in meme culture, I think it’s important to center a platform’s policies on harassment and hate speech.”
“‘killallni****s’ isn’t a joke; I would argue it is a form of hate speech,” she added.
Some accounts complain about being reported by other users. One such user hosted a video of someone in a Ku Klux Klan-style cloak.
Motherboard has previously found other content moderation issues with TikTok. Earlier this month, Motherboard found people were soliciting nude images of young users on the platform.
ByteDance recently said it would increase the number of content moderators on TikTok from 6,000 to 10,000 people.
Correction: This article previously said TikTok merged with an app called Music.ly. The correct name is Musical.ly. Motherboard regrets the error.