

We Analyzed More Than 1 Million Comments on 4chan. Hate Speech There Has Spiked by 40% Since 2015.

Violent threats against minorities have also proliferated on the anonymous message board.

On 4chan you'll find anime, porn, and sports chatter. You'll also find an endless stream of racist threats, stomach-churning memes, and misogynistic vitriol — and it’s getting worse, according to a VICE analysis of more than 1 million comments on one of the site’s most popular message boards.

On the heavily trafficked “politically incorrect” board, slurs against racial, ethnic, religious, or sexual or gender minorities have increased by 40% since 2015, while neo-Nazi propaganda has proliferated. And users on the forum are increasingly making violent threats against minority groups: Comments that include both hate speech and violent language have increased by 25% over the same period.


After a wave of recent attacks by white nationalists across the world, social media platforms have begun cracking down on hate speech. But anonymous online forums like 4chan — a comment board designed to facilitate discussions between users posting threads of text, images, and memes — have remained a toxic mixture of hate, bigotry, and misogyny, and have given violent extremists a kind of digital safe space.

Perpetrators of recent far-right terrorist attacks have been connected to similar forums. The shooter in the March massacre in Christchurch, New Zealand, had spent time on 8chan, one of 4chan's even more vitriolic offshoots. In his racist manifesto, he used coded white nationalist language found on both platforms. A man who attacked a synagogue in Poway, California, in April, also spent time on 8chan. Both wrote on message boards about their deadly plans and were encouraged by users there to carry them out.

Read more: Decoding the racist memes the New Zealand shooter used to communicate.

4chan is still the most popular English-language imageboard, attracting about 20 million monthly visitors — comparable to the traffic of a major news site. Users developed 8chan and neinchan, more radical variants of the original, in response to perceived censorship. They’re smaller, less moderated, and even more extreme communities.


But bigotry is rampant on 4chan as well. On that platform, hate speech is present in one out of every 15 comments, exceeding the volume on other, openly racist websites. Users frequently celebrate violence against minorities and far-right terrorists. Former users reported that exposure to the language, memes, and videos on the site desensitized them to hateful language and bloodshed, even in their offline lives.

The findings raise questions about free speech, terrorism, and the policing of online extremism. While mainstream platforms have made some efforts to kick white supremacists off their websites, those users may simply be driven further underground, where radicalization and threats are even harder to monitor.

Read more: Military revenge porn is thriving on anonymous servers and image boards.

8chan, for example, doesn’t keep complete archives, in part because of the child pornography present on many of its pages. But researchers believe that 8chan and neinchan contain even more extreme white supremacist content than 4chan. None of the chan sites can be easily shut down: they operate under uncertain ownership and can easily switch servers if a company refuses to host their content. No government has reportedly attempted to do so, though there has been some debate over whether shutting down the chans would drive extremists even further off the grid.


A teenage programmer launched 4chan in 2003 as a venue to nerd out on anime, but it quickly spiraled out of control. Within a few years, racism and misogyny were pervasive on at least one forum, and visitors there were collaborating on frightening pranks — reportedly including calling in bomb threats and posting swastikas on a children’s site.

These days, 4chan is still home to mundane discussions about comics and news. But it is also a cesspool of hate.

4chan boards only keep 15 pages worth of comments at a time, but another website, 4plebs, archives 4chan’s “/pol/” message board content back to 2014. VICE News collected about a million randomly selected comments from 4plebs, approximately 10 percent of the total number of comments, and searched them for common racial, homophobic, and ethnic slurs, as well as white supremacist slogans and violent words like “shoot,” “kill,” “stab,” and “bomb.”

Since a low point in 2016, the use of six slurs targeting racial and ethnic minorities or LGBTQ people has increased by more than 40% on 4chan’s “politically incorrect” (“/pol/”) board, billed as a forum for news and politics discussion and the second most popular board on the site. One in every 15 posts on /pol/ so far in 2019 has included some term of hate speech. That’s 30% more slurs than were found in the comments on the Daily Stormer, an openly neo-Nazi website, according to a VICE News analysis of that site.


Former 4chan users reported frequently seeing such speech even on boards less toxic than /pol/, such as the “random” (“/b/”) board and the sports boards.

Because 4chan users often substitute alternative terms for racial slurs, these numbers almost certainly underestimate the true frequency of hate speech on the site. Posters use a variety of misspellings, dog whistles, and coded language to refer to certain races or ethnicities. Users often likened various minority groups to animals in ways that were clearly racist, even if they didn’t explicitly invoke a slur. According to a former user, in one case the moderators of the /b/ board used software to change all instances of the n-word in comments to the phrase “peanut butter.”

Since 2015, about one out of every 10 posts on /pol/ using hate speech also used violent language. Sometimes the text included a direct threat of violence toward a minority group; in other cases, the writer implied that a racial minority was committing or would commit violence against white people, a common white supremacist trope. Dozens of comments exhorted other 4channers to “kill themselves,” often punctuating the phrase with a racial or homophobic slur. Sometimes posters made threats indirectly, speculating about what groups would next be terrorized, bombed, or shot.


The proportion of comments containing both hateful and violent speech has increased in the last three years as well, rising by about 25%. “I think they are modeling the type of behavior they want others to engage in,” said Oren Segal, the director of the Anti-Defamation League’s Center on Extremism.

Other comments are more ideological, regurgitating Nazi slogans or making glowing references to white supremacist groups. The use of half a dozen white supremacist watchwords — including “heil Hitler,” “sieg heil,” and “white power” — peaked in early 2017 at around triple the average volume in the previous two years.

Read more: How a small Budapest publishing house is quietly fueling far-right extremism.

The rise in that language began in the summer of 2016, increasing in tandem with the presidential election and only beginning to abate in 2018. Comments mentioning now-President Donald Trump or his catchphrase “Make America Great Again” were about 10% more likely to also contain a Nazi slogan. The total volume of Nazi watchwords has since declined but is still about 40% higher than before the 2016 election. They appear in about one in every 100 comments.

Even as 4chan has grown more extreme, overall web traffic on the site has more than doubled in the last decade. Visits to the site peaked in early 2018 and have fallen significantly since, but 4chan remains one of the thousand most popular sites worldwide. It draws visitors from around the world, and it’s even more popular in the United Kingdom and Canada than it is in the U.S.


The real-world effect of exposure to the large amounts of violent, hateful, and white supremacist speech on 4chan and its sister sites is largely unknown. Research does show that exposure to frequent use of hate speech desensitizes people and can increase prejudice. And anecdotally, former 4chan users described this kind of language becoming “normalized,” potentially “bleeding into their offline lives.”

One wrote in an email that he had become “desensitized to things that I would have maybe found appalling before, the biggest example being ISIS beheading/torture vide[os] & gore in general.” Another former participant wrote: “4chan seems to push people towards the darkest side of humanity.”

The increase in hate and violent speech on 4chan “is likely due to multiple overlapping dynamics,” said Patrick James, a researcher at the National Consortium for the Study of Terrorism and Responses to Terrorism (START). Paradoxically, the crackdown on this kind of content on more mainstream sites like Reddit and Twitter may have led users to congregate on 4chan. “As these actors are pushed out of other sites, 4chan and 8chan have increasingly become sites where these people can have a community and share extremist ideas without fear of it being shut down,” James said.


And even as some social networks have censored or blocked extremist content, some mainstream political leaders have at times embraced white supremacist language. Studies suggest that the use of dehumanizing language by political leaders, such as Donald Trump’s statements about Mexican immigrants being rapists, helps fuel racist comments on the internet, which then become concentrated on sites like 4chan. Other leaders worldwide have echoed similar language in the last few years: in the United Kingdom, Nigel Farage called President Obama “a loathsome creature,” and hate crimes in the U.K. surged after the Brexit vote Farage campaigned for. In Brazil, newly elected President Jair Bolsonaro made threats of violence against gay men and racist remarks against Indigenous people and citizens of African descent.

The white nationalist shooters in New Zealand, California, and Pennsylvania described explicit connections to 8chan in manifestos explaining the motivations behind their attacks. “In each one of those cases, these folks signaled back again to these communities letting them know what they were about to do. And in at least two cases, Poway and New Zealand, [other users] were encouraging them and hoping that they would join them on this journey,” Segal said.

Researchers said many aspects of chan culture do predispose their vast user bases to extremism. But despite the overt connections between some terrorists and these communities, it’s unclear whether exposure to imageboard extremism merely correlates with violence or causes it.


“There is quite a lot of research showing how people are radicalized or what leads them down radicalization pathways, but the truth is, for all types of extremism, we really don’t know what leads one person to become violent and another person to not make that violent choice,” explained Cynthia Miller-Idriss, a professor at American University and expert on far-right extremism.

Read more: The U.S. doesn't prosecute far-right extremists. Here's how it could.

Attacks by individuals with extreme ideologies similar to those featured prominently on 4chan are increasing in the United States. The proportion of terrorist attacks carried out by far-right extremists has nearly tripled in the last five years compared to the previous five years, according to Erin Miller, the principal investigator at the University of Maryland’s Global Terrorism Database.

The number of fatalities from these attacks increased more than seven-fold, even excluding the deadly attacks in Las Vegas and Parkland, where it was unclear whether to classify the incidents as terrorist violence. The FBI announced in May that there has been a significant increase in threats from domestic white supremacist terrorists over the last year.

Imageboard sites host a great deal of material unrelated to white nationalism, which sometimes puts hate speech or Nazi slogans a click away from sports or anime discussions, a phenomenon Miller-Idriss described as “gateway content.”


“If you were to go onto 4chan looking for debates on the hockey playoffs, you’re much more likely to come across some sort of neo-Nazi meme than you are on Facebook,” James said. Much of the other material is lighthearted and humorous, which former 4channers said helped bring them there.

“If the same vessel that provide[s] you the most laughs also validates your most hateful thoughts, then, yeah, that’s incredibly awful,” said one former user in an email.

James and Segal said members of organized white extremist and neo-Nazi groups are often active on 4chan. James said there isn’t a lot of formal recruiting on the platform, but groups use it as a “testbed” for messaging. Many popular white supremacist memes such as Pepe the Frog originated there and spread to other sites, in some cases being adopted into more mainstream internet discourse.

The tone of many memes exhorts 4chan users to act immediately to produce revolutionary change. One of the most popular tropes is the idea of “white genocide,” the false conspiracy theory that Jews are organizing a mass replacement of white citizens by non-white people in the United States and other countries. (The use of this term on 4chan has more than doubled in the last three years.) According to Miller-Idriss, dire, apocalyptic visions like this may inspire radicalization by giving potential terrorists a way to feel heroic. By performing violent attacks, they may believe they are tearing down an imaginary system of oppression against whites.

Read more: How police departments are dealing with officers spewing hate online.

And 4chan users shower such terrorists with praise. Past attackers from New Zealand to Norway are mentioned hundreds of times in comments, with users sometimes proclaiming their innocence and often calling them heroes or bestowing titles like “commander” on them. The level of praise was directly correlated with the degree of violence achieved by the assailant — higher body counts led to more positive mentions, including specific compliments on the efficiency of an attacker.

Experts agreed that this type of post-event validation could give future terrorists a reason to commit similarly violent attacks. “The role of these platforms to not only radicalize but to inspire the next attack — I think we’re starting to see that play out in the recent attacks that we’ve seen,” Segal said.

Cover: The alt-right Kekistan flag is flown at an Act for America rally to protest sharia law on June 10, 2017 in Foley Square in New York City. Members of the Oath Keepers and the Proud Boys, right wing Trump supporting groups that are willing to directly confront and engage left-wing anti-Trump protestors, attended the event. (Photo by Andrew Lichtenstein/ Corbis via Getty Images)