YouTube’s business model is powering a network of far-right accounts by channeling large audiences from mainstream conservative commentators toward radical white nationalists, according to a new report published Tuesday.
The report by the Data & Society Research Institute, an independent nonprofit, contends that YouTube not only allows this more extreme content to spread, but actively encourages it.
“In many ways, YouTube is built to incentivize the behavior of these political influencers,” the report’s author, Becca Lewis, said. “YouTube monetizes influence for everyone, regardless of how harmful their belief systems are.”
“The platform, and its parent company, have allowed racist, misogynist, and harassing content to remain online — and in many cases, to generate advertising revenue — as long as it does not explicitly include slurs,” Lewis added.
Data & Society’s researchers tracked a group of 65 influencers before arriving at their conclusion. The field included media pundits and internet celebrities who promote a full range of right-wing political positions, from mainstream conservatism to overt white nationalism.
At the more mainstream end of the spectrum are figures like Jordan Peterson, Dave Rubin and Ben Shapiro, notes the report, while white nationalist Richard Spencer and alt-right vlogger Colin Robertson (known online as Millennial Woes) occupy the radical end of the scale.
Though there are no direct links between Peterson and Spencer, both have collaborated with Carl Benjamin (known as Sargon of Akkad), a British vlogger who gained internet fame during GamerGate by posting content critical of feminist game critics and academics. It is through these peer-to-peer collaborations, Lewis says, that radical ideas are spreading on YouTube, picking up larger audiences along the way.
“By connecting to and interacting with one another through YouTube videos, influencers with mainstream audiences lend their credibility to openly white nationalist and other extremist content creators,” Lewis says.
Lewis says these “microcelebrities” have used a number of tactics to gain followers on YouTube, including search engine optimization, gaming keywords, and staging “strategic controversy,” or stunts.
YouTube’s business model encourages these vloggers to publish more content through lucrative ad-sharing deals. The YouTube Partner Program (YPP) gives accounts with over 1,000 subscribers and 4,000 watch hours over the past year the chance to pocket a percentage of the advertising run over their videos.
And fame on the vast video platform can also be converted into hard cash on other platforms, such as Patreon.
YouTube says that as long as channels stay within its Community Guidelines — which the company says it enforces “rigorously” — there is little it can do. The company also says it has “made updates over the past year to tighten our monetization policies and improve our enforcement against hate speech.”
But Lewis’ research suggests those steps are having little impact on the spread of radical far-right beliefs on the platform. “Much extremist content is happening front and center, easily accessible on platforms like YouTube,” Lewis said. “Publicly endorsed by well-resourced individuals, and interfacing directly with mainstream culture.”
Cover image: Silhouettes of mobile users are seen next to a screen projection of the YouTube logo in this picture illustration taken March 28, 2018. REUTERS/Dado Ruvic/Illustration