News

TikTok Has an Incel Problem

Toxic masculinity and incel language like #redpill can be found all over the app.
Women suffer misogynist abuse on TikTok, which claims the language does not violate their terms of service. (Photo: Getty Images)

“This woman needs a beating,” says the TikTok comment that Rue Partida reported for breaking community guidelines. Less than 30 minutes later, she received a notification—according to TikTok, the comment didn’t violate any rules at all.

“What the fuck is wrong with TikTok for not taking this comment down, but everyone else gets banned left and right for nothing?” tweeted Partida, a 21-year-old in LA.

In the past six months, more and more TikTok users have been sharing their experiences of encountering misogynistic content on the app, to the extent that one user, Benito Thompson @_mindofmusic, has become known for singing the word “misogyny” to a violin and piano musical accompaniment in response to a number of videos on the app. “Users definitely tag me in misogynistic content all the time now,” he told VICE World News in an email.

Using the stitch feature, in which someone can include a clip of another TikTok user’s video in their own TikTok, women have been calling out behavior or jokes they consider to be demeaning. 

While Facebook, YouTube, and Twitter have taken very public criticism for their failure to protect users from hate speech, TikTok has largely slid under the radar. But there’s concern the platform isn’t doing enough to counter misogynistic comments and hate speech, some of which is associated with groups with a history of real-world violence against women, such as incels.

Back in August, the Institute of Strategic Dialogue released a report by analyst Ciarán O’Connor looking into hate speech on TikTok, which found numerous examples of misogyny. Accounts promoting Men Going Their Own Way (MGTOW), a social movement that is part of the wider manosphere marked by overt misogyny, were discovered alongside videos that combined misogyny with other forms of hate, such as white supremacy. 

O’Connor also found videos recorded by Elliot Rodger that had been uploaded by a variety of users, including one in which he laid out his plans to murder people, specifically women, prior to carrying out his attack. The 22-year-old self-described incel killed six people in 2014 in a shooting, stabbing, and vehicle attack in California, then killed himself.

Several months later, VICE World News has been able to find historic video clips of Rodger speaking disparagingly about women, simply by searching his name, including one where he says “you girls have starved me of sex, and enjoyment, and pleasure for my entire youth.” It’s been seen over 123,000 times. 

While searching for “Elliot Rodger” brings up the app’s banner telling the user “this phrase may be associated with hateful behavior,” we were able to find videos using associated terms, including: “elliot rodge”, “elliot rodgers footage” and “elliot rodgers edit.” Users are also able to use the hashtags #elliot and #rodgers next to each other. 

After VICE World News raised the videos with TikTok, the platform deleted them. 

Similarly, VICE World News found that while #MGTOW was a banned hashtag, users were allowed to have MGTOW in their usernames, such as @mgtowistheway and @mgtowmemes. The latter has published videos like “3 worst type of cockblockers,” in which a 3D character tries to insert a stick into a woman’s mouth while laughing, and another in which a man jokes that he is surprising his “favorite person” and enters her home before revealing that she doesn’t know he exists. 

A TikTok spokesperson said: "Hate has no place on TikTok, and we do not tolerate any content or accounts that attack, incite violence against or otherwise dehumanise people on the basis of their gender. We work aggressively to combat hateful behavior by proactively removing accounts and content that violate our policies and redirecting searches for hateful ideologies to our Community Guidelines.”

Going by its banned hashtags, TikTok seems to be aware of a possible incel problem, although the company has never mentioned it publicly. #incel is banned, as are #femoid, a derogatory term for women, and #blackpill, the belief that looks are genetically determined, that women choose partners purely on the basis of those looks, and that there is no point in trying to alter this. 

But the bans have been uneven. #redpill, whose adherents believe their prospects with women can be improved by picking up dating tips or changing how they look, is flourishing at 942.9 million views. Much of the content might not be outright hate speech, but among the most-viewed videos under the hashtag are TikToks titled “Do you think make-up is a form of lying in a relationship?”, “If you are interested in a girl but she says: ‘Let’s just be friends’, tell her ‘NO!’”, and “Women be like: I deserve a $9,000 ring, $50,000 wedding...and all she has to offer is pre-owned pu**y.”

#Hypergamy, a hashtag plenty of women as well as men post under, has over 100 million views. It’s a word used in the incel community for their belief that women will always go for men they see as superior in looks and status. 

“Regarding why some incel-adjacent lexicon is allowed, and why others aren't, it would be encouraging if TikTok were to share publicly, or in a limited capacity with researchers/media to help us understand, what definitions it uses for terms that have a clear link to hateful ideologies,” said O’Connor. “Essentially—greater transparency.”

While O’Connor is unsure why #redpill would be permitted while #blackpill is not, he says it highlights “how TikTok’s efforts to limit the use of offensive phrases or terms linked to extremist activity or communities are quite narrow.”

He added that TikTok routinely fails to address alternative spellings of banned hashtags, allowing users to easily evade them.

“I think something that online companies need to see is that, while incels and the Blackpill are an extreme form of misogyny, the same ideologies and the same attitudes towards gender roles are still present across multiple Red Pill communities, and even in some communities which would not necessarily identify as being ‘red-pilled,’” said Frazer Heritage, who specializes in gender and sexuality at the School of Social Sciences at Birmingham City University.  

“One of the issues online platforms face is that swaths of society are misogynistic—or at least built on misogynistic ideas. So, there is a difficulty in drawing a ‘line in the sand,’” he said. 

Heritage added that this becomes even harder if content moderators are low-paid and have to deal with thousands of images every day; discerning a joke between friends, irony, and full-frontal abuse becomes impossible.

Not all misogynistic insults would necessarily be recognizable to users themselves, never mind the app or its moderators. User @loriloo007 made a video talking about a user who had commented on her video on menopause saying: “post wall women are invisible to me.” She had no idea what it meant. A search on Urban Dictionary told her it was “a woman who has passed her fertility window and has become very unattractive to men. After women hit 35, they are basically a post wall woman. It’s like they hit a wall at full speed and their face got all messed up.”

In terms of content creation, it’s in the “red pill”–adjacent spaces where life and dating advice laced with misogyny run rampant. User @MahdiTidjani gained traction for his controversial views around women, and his videos, now deleted, still survive in the TikTok graveyard of stitched videos. “I lead, you follow. If you don’t like it, tough shit,” he says in one about women. 

“When you’re a single mother, you’re on your own, your sexual market value has deteriorated dramatically,” he says, in a video another user has uploaded of him. In another video, in which a woman has used the duet function to respond, rolling her eyes, he says: “If you’re a high-flying woman and have a successful career...we don’t care. Your master’s, your Ph.D., your CEO title, we don’t give a damn. We don’t value it. If the market of men doesn’t value it, don’t bring it to the relationship table expecting brownie points.” 

“You opened your legs to him,” he writes in another video, during which he says straight to camera: “The sheer lack of accountability some of you women have is astounding.” 

Tidjani’s videos eventually got him permanently banned. But plenty more remain. And male users in his vein are increasingly being called out beyond TikTok, on other social media platforms. 

YouTuber Kuncan Dastner made a video in September about “The TikTok Incel in an Alpha Male Costume,” @heartbreaknino617, and includes clips of him saying things like “I’m a man, I’m a leader, I’m the ‘toxic asshole guy’, the ‘tool’ right, because when we go out, you expect me to take care of you...but when I tell you to do something, do it.” He also posted a clip in which he makes fun of women asking for equality. 

In a YouTube video with over 3 million views, Kurtis Connor criticizes men on TikTok who give alpha-male dating advice he considers toxically masculine. One of their videos gives advice on how to get girls to “chase” you, recommending leaving women on read (not replying). “Especially if she says something stupid. Let her question what she did wrong and let the hamster wheel work,” says the TikTok user in one of the clips. Connor calls it “textbook manipulation.” 

Abbie Richards, a disinformation expert who’s prolific on TikTok, thinks users rightfully find some types of this masculinity content troubling. As part of her bank of research, she sets up burner accounts that view different kinds of content, to see what other videos the TikTok algorithm recommends.

Her “masculinity” search, she says, quickly escalated “further into radical, evil ideologies.” She would begin watching videos featuring men doing workouts or joining the Marines, but was soon led down a rabbit hole to videos like “how you should never settle for any woman”. 

TikTok is by no means the only social media platform dealing with an algorithm that refers people to more-extreme content. YouTube has been criticized for hosting incel content and for not taking other hateful content down quickly, such as videos made by far-right activists like Tommy Robinson. Facebook’s own internal research found that its recommendation system is capable of pushing users to extremes. 

But many users expect TikTok to learn from the mistakes of other platforms, as opposed to repeating them. 

It’s certainly been the experience of 22-year-old TikTok user Dagmawi Demeke, who is based in Ethiopia. Last week he tweeted: “5 days on tiktok and it's convinced I'm an incel redditor.” 

Demeke thinks that TikTok directs dating advice-style videos to men who support “alpha male” ideology. “The content is often about being assertive, dominant, and not a simp.” 

While Demeke has watched content he thinks could have similar audiences on YouTube, such as Jordan Peterson, a psychology professor known for his criticisms of feminism and a “backlash against masculinity,” he said that he didn’t intentionally seek it out on TikTok. 

“It’s odd that I was getting that content a few days into creating my TikTok account,” he says. 

Richards agrees that a lot of the content is incel–adjacent, if not blatantly advocating for violence against women. “‘Three facts. Number 1,’” she read to VICE World News from one of the videos she encountered in her “masculinity” search. “Being fat is not OK. Number 2. If you work relentlessly to your goals, they will happen. And 3, there’s no such thing as a guy best friend.”  

But going back to the comment that Rue tried to report, for many female users on the app, their experience of misogyny is not a creator spouting negative dating advice; it’s the comments they read on their own accounts or the accounts of other women. 

“I can say that I’ve made several similar reports over harassment toward different minority women (Islamic, Trans, Jewish, Fat, Disabled, Black and Native American) who are all targeted by similar threats of hate speech,” she told VICE World News over Twitter DMs.

Carolina Are, an online moderation and AI bias researcher and pole dancer, has had individuals respond to TikToks of her dancing with comments like “fatherless behavior,” “never having a daughter 😳” and “women are not women anymore because of women like this.”

She thinks that moderating language like this is challenging. “It’s just very tricky because while they’re not necessarily using slurs, they’re implying stuff about my sense of morality, my worth or value as a person. So you’re not going to be able to regulate those with an algorithm, with an automated moderation process,” she said.

She shows a long list of words she’s had to filter out of her comments, including “raped,” “whore,” and “asking for it.” “But that puts the onus on me to protect myself. It’s exhausting to go over and over this.”

Are sees more misogyny when her content seems to get amplified on the For You Page. Her account’s following will grow, but at a price. 

“It’s a trade-off between visibility and being targeted by these people,” she says.

For women who talk about gender and women’s rights, it can feel like misogynists are easily directed to their content if it hits the For You Page. 

Sophie Darling is a 27-year-old Twitch streamer from Bangor, Northern Ireland, who tweeted last month about the ugly comments she’s witnessed on TikTok. 

In a video she made about the challenges of content creation as a woman, a user wrote: “Hard work? Wear clothes that cover you, and you won’t be sexually harassed.” 

Another said: “Streamer… maybe get yourself a real job and get your arse in the kitchen.” 

Darling said she saw a slew of hateful comments after the murder of Sarah Everard in the UK, where a trend emerged of women attempting to demonstrate that no matter what they wear, they risk deadly male violence. 

“Some of the comments make me genuinely sad and fearful for the state of humanity. This was a staged performance, but there was still an excessive amount of victim-blaming for what the girl on screen was wearing, that the scenario of her hanging out with men makes her deserve to die.”

TikTok says they’re constantly training their moderation teams to be aware of and detect hateful behavior, and that partner organisations like Galop, Glitch, Stonewall, and Mama work with them on this. 

They also say they’re in touch with experts who help them regularly evaluate their policies, including updating lists of keywords and hashtags on which they take action.

But for many users, these updates don’t seem to be comprehensive or rapid enough. 

While Richards is troubled by the way TikTok can lead people down rabbit holes, she also loves watching videos from Benito Thompson or @drewafualo, who roasts men who post content complaining about women. 

“In this grey area where it’s tough to police things, is there anything in there that technically violates standards? No,” said Richards. “But do they have every right to be roasted? Absolutely.”