In 2007, researchers at the Southern Poverty Law Center drew attention to what was, at the time, a new problem. Neo-Nazis were filming their rallies in a way that thrust viewers into the center of the action, conveniently excluding counter-protesters from the frame. They wanted to control their presentation, cast themselves as winners whose hate is tolerated by the public, and entice potential members with an attractive vision of group identity and inclusion. According to the SPLC analysis, YouTube was making it easier for neo-Nazis to distribute materials and recruit new members.
White supremacy and hate speech have always been a problem on YouTube and other social media platforms. But over the past 11 years, our understanding of YouTube’s role in spreading hate has shifted. The central issue isn’t simply that bigots can easily distribute materials among themselves. A report by Becca Lewis, published last week by the nonprofit research group Data & Society and titled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” argues that YouTube’s problem with hate speech stems largely from video collaborations—specifically, collaborations between influencers with big audiences, like Joe Rogan, and fringe far-right figures of the “Intellectual Dark Web,” whose personal brands are built upon racist, misogynist, and anti-LGBT hate speech. These collaborations benefit both parties, who each get the opportunity to expand their own audience into the other’s fan base, Lewis argues.
For viewers, these collaborations can open a path of radicalization: someone might begin as a fan of Rogan, for example, and end up a consumer of far-right hate speech.
“The main point of entry I focus on in the report is conservative and libertarian influencers with mainstream appeal—like Joe Rogan, like [Dave] Rubin,” Lewis told Motherboard in a phone call. “When they host other members of the Intellectual Dark Web, it’s easy to get drawn into that world.”
"Any attempts to respond to it that rely on mainstream media sources of trust simply aren’t going to work"
The collaborative network—which Lewis calls the Alternative Influence Network (AIN)—is not simply an attempt to game YouTube’s recommendation algorithm. Rather, the AIN builds a culture in which people in the network are legitimized by personal relatability and authenticity, not factual veracity or institutional support. Rather than building trust through fact-gathering and rigorously sourced information, the AIN relies on the persona of a down-to-earth, countercultural underdog. In other words, the AIN feeds on the precise cocktail that forges virtual communities and propels YouTubers to microcelebrity status.
Lewis said that without an in-depth understanding of the dynamics that drive the AIN, any hope to combat the problem is destined to fail.
“Really, it is this fully fledged different media system with a totally different logic than mainstream media,” Lewis said. “So I think that any attempts to respond to it that rely on mainstream media sources of trust simply aren’t going to work.”
In order to explain exactly how the AIN functions, Lewis constructed a visualization. Each square represents a YouTube user, and each line represents at least one instance in which the two connected content creators appeared in a video together between January 1, 2017 and April 1, 2018. The darker and more central the square, the more connected the YouTube user. The bigger the square, the more likely the person is to connect one content creator with another through a mutual collaboration. Lewis studied 65 content creators across 85 YouTube channels.
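The structure Lewis visualizes is an undirected collaboration graph: creators are nodes, co-appearances are edges, connectedness corresponds to a node's degree, and the "bigger squares" correspond to creators who bridge otherwise unconnected peers. A minimal, self-contained Python sketch of those two measures, using placeholder creator names rather than any data from Lewis's actual dataset:

```python
from collections import defaultdict

# Hypothetical co-appearance data: each pair means the two creators
# appeared together in at least one video. Names are placeholders,
# not drawn from the report.
collaborations = [
    ("HostA", "GuestX"),
    ("HostA", "GuestY"),
    ("HostB", "GuestX"),
    ("GuestX", "GuestZ"),
]

# Build an undirected adjacency map.
graph = defaultdict(set)
for a, b in collaborations:
    graph[a].add(b)
    graph[b].add(a)

# Degree: how many distinct collaborators a creator has
# (the report's "more connected" squares).
degree = {node: len(neighbors) for node, neighbors in graph.items()}

# Crude brokerage score: count pairs of a node's neighbors that are
# not themselves connected -- pairs this node bridges (a rough stand-in
# for the "bigger square" idea; a full analysis would use betweenness
# centrality).
def brokerage(node):
    nbrs = list(graph[node])
    return sum(
        1
        for i in range(len(nbrs))
        for j in range(i + 1, len(nbrs))
        if nbrs[j] not in graph[nbrs[i]]
    )

bridging = {node: brokerage(node) for node in graph}
```

In this toy data, GuestX has the highest degree and also bridges the most pairs, which is the pattern Lewis's chart highlights: the creators who collaborate widely are often the same ones who connect otherwise separate audiences.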
“It’s not comprehensive, there’s others that we didn't include just because there are so many people involved with this,” Lewis told Motherboard.
Talk show hosts like Rubin and Rogan depend on a large roster of guests and organically connect with other content creators, which is why they are included in this network. Even though they aren’t at the very center of the AIN, with the greatest number of connections, they often serve as a gateway to more extreme users, especially because both have hosted guests from the far right.
“The YouTube collaborative network really functions differently than an interview on a mainstream network would,” Lewis told Motherboard. “A lot of times, they signify social ties as much as formal interviews… It is really messy. The defense that people like Rubin use is that they’re journalists, they’re going to interview people whose opinions they disagree with.”
In essence, this strategy makes their guest stars more palatable to their audience as human beings, building the human-to-human trust that fuels the AIN.
Rubin tends to engage with these individuals’ ideas in direct discussion, but he rarely challenges them on their views. Meanwhile, Rogan tends to feed his guests easy questions and ignore their more controversial views. Users may notice that Rubin engages with these controversial ideas while Rogan doesn’t (it’s worth noting that Rubin tours with Intellectual Dark Web hero Jordan Peterson). However, it’s unclear whether this distinction matters. Users are still likely to look up more extreme fringe content, or have YouTube recommend it, if a fringe figure is featured as a guest. Interestingly, while many YouTubers run interview shows like journalists, when they are criticized their fans often defend them by saying “they’re not journalists” and thus have no obligation to ask hard questions.
“Even though people like Rubin and Rogan are the most formalized journalistic formats of any of the people in the network, they still draw a lot from YouTube culture and they still have this element of hanging out,” Lewis said. “A lot of Rubin’s guests he will introduce as his friends. They will advertise the channels of the people that are hosting. So a lot of times, if they aren’t explicitly pushing back or are critical of the people that they have on the shows, they can act as an implicit endorsement or even advertisement for other people’s content.”
While it’s very possible that these collaborations encourage YouTube’s algorithm to recommend videos from the guest star, that’s not necessarily the point. “It’s not just the algorithm,” Lewis said. “Even if that algorithm were not in the picture, if someone sees someone guesting on a channel they watch, then they are more likely to check out the channel of that guest.”
Still, the recommendation algorithm can direct users toward more extreme content. On Rogan’s YouTube page, one of the “Related Channels” is the official channel for Steven Crowder, a deeply conservative YouTube entertainer known for unintentionally starting the “Change My Mind” meme during a college demonstration designed to “own the libs.”
However, Lewis noted that talk shows aren’t the only point of entry to the Intellectual Dark Web. After all, a diverse set of communities cultivates a sense of social cohesion, and some of those communities also have a habit of collaborating with more fringe YouTube users. In her research, Lewis identifies figures in the gaming and streaming community as another possible point of entry to fringe YouTube. Carl Benjamin, an anti-feminist YouTube personality, made a name for himself by capitalizing on GamerGate at its height in 2014. Users already embedded in the gaming community may have come across Benjamin organically, without the need for Joe Rogan to serve as a middleman.
According to Lewis, attacking the benefits of collaboration in the AIN might be one of the most viable ways to control the network.
“Some other scholars have really widely pointed to the radicalizing potential of the YouTube algorithm and the need to readdress content recommendation on the algorithm,” Lewis said. “I think that’s really important, but I also think it’s not going to solve this issue by simply looking at the algorithm. It’s also important for YouTube to reassess its monetization structures, to reassess who they are rewarding for building followings, for them to assess their content moderation practices.”
Rebuilding YouTube’s algorithm could be a long-term solution, but the fix would be slow. Reassessing YouTube’s monetization structures could be a more immediate one. YouTube has booted certain users, such as Logan Paul, from its premium advertising program, which offers higher and quicker profits to more popular users, after those users misbehaved on its platform (in Paul’s case, he posted a video showing the body of a man who had killed himself). However, even without YouTube’s premium advertising program, users can still profit from advertisements, create sponsored content, or sell merchandise to support their brands.
A more extreme way to attack the network would be deplatforming. YouTube recently booted Alex Jones from its platform—not for hate speech or for accusing victims of the Sandy Hook massacre of being crisis actors, but for posting “graphic content.” For Jones, who relied heavily on YouTube, being deplatformed has been a critical blow to InfoWars. However, Jones had been violating the site’s content policy for years, and YouTube was clearly hesitant to use this nuclear option. It’s also not evident that deplatforming every member of the AIN—especially more mainstream figures like Rogan and Rubin—is the right way to go.
Lewis told Motherboard that she recognizes there is no single solution, and that the main purpose of her research was to shed light on the existence and dynamics of the AIN as a whole.
“I don’t pretend to have all of the answers; in fact, I think it’s really important that we investigate a lot of different options,” Lewis said. “But what I would say is that it is a multi-pronged problem, and it’s going to require a multi-pronged solution.”