
A Popular YouTuber Read the Christchurch ‘Manifesto’ to Half a Million Subscribers

YouTube left the video online for over two days, allowing it to generate tens of thousands of views and spread to other channels.
Image: Sean Gallup/Getty Images

A popular farming, agriculture, and welding DIY YouTuber with more than 600,000 subscribers suddenly began posting openly white nationalist content on his channel and said he has been trying to subtly "red pill" his audience over the course of months. The YouTuber posted a video of himself reading the Christchurch shooter's "manifesto" and a second video in which he describes himself as a white ethno-nationalist. The hard pivot comes after the YouTuber spent a decade posting videos about welding troubleshooting, tractor repair, and "planting oats."


The new uploads come months after an attacker killed 51 people at two mosques in Christchurch, New Zealand.

As Motherboard previously reported, an internal Google email said moderating the so-called manifesto of the Christchurch attacker would be "particularly challenging," and told moderators to flag all material related to the attack as "Terrorist Content." But YouTube left this particular video online for over two days, allowing it to rake in tens of thousands of views; the company also first demonetized the video and put it behind a content warning, but did not immediately delete it.

Because the video, and a follow-up that was also eventually removed, were allowed to accumulate so many views, the YouTuber's pivot to far-right content has become a topic of conversation among other far-right YouTubers, who are praising him. Motherboard is not naming the specific YouTuber who posted the videos in order to avoid directing more people to the person's channel.

The news shows not only YouTube's failure to keep clearly offending content off its platform, even when users have already reported it for violating the site's policies, but also how popular YouTubers are well positioned to spread messages of hate if they choose to.

Do you work on Google content moderation, or used to? We'd love to hear from you. You can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.


The YouTuber in question spent a decade posting 1,600 videos about welding and agricultural machinery, some of which have more than a million views.

While the YouTuber said that he doesn't "necessarily" support all of the attacker's actions, he veered off at several points in the video to add his own thoughts. He said he printed out the manifesto on the day of the attack, and read it aloud so that others might more easily find an "audiobook" version of the document.

After the Christchurch attacks, YouTube, Facebook, and Twitter faced a wave of users trying to upload sections of or links to the full manifesto. Over a month afterwards, some videos of the attack were still available on social media sites. Google previously said in an email that sharing the manifesto outside an educational, documentary, scientific, or artistic (EDSA) context is against the company's Community Guidelines. This means that if a video is not EDSA in context, it should be marked as terrorist content and likely removed.

But YouTube originally flagged the reading video as "inappropriate or offensive to some audiences," seemingly after a number of user reports against the clip. On Monday, days after the upload, YouTube removed the clip for violating its Terms of Service.

In a second video uploaded shortly after the reading, the YouTuber made his views much more explicit, and defended his earlier upload.


"I've been a far-right, ethno-nationalist since about 2014 or so," he said, as well as a series of other hateful statements that violate YouTube's policies. While writing this piece, the second video was still online, with the disclaimer that "The following content has been identified by the YouTube community as inappropriate or offensive to some audiences." On Monday, YouTube removed the clip, and replaced it with the text "This video has been removed for violating YouTube's policy on hate speech," and a link to the site's policies.

"This example shows yet again how influencers play a crucial role in spreading white supremacist propaganda. This YouTuber already had hundreds of thousands of subscribers, so he was in a position of broadcasting power to spread ideas to his fans," Becca Lewis, a research affiliate at Data & Society who has researched YouTube's role in spreading such material, said in an online chat.

"Since he already had a sizeable fanbase, this creator could also point his audience to other platforms for viewing the content after YouTube removed it. When platforms are slow to respond in their content moderation practices, creators can take advantage of their cross-platform influence," she added.

The video of the reading and the subsequent fallout have spread across YouTube and other sites, including Reddit. Other YouTube channels with a more explicit focus on white supremacy and hate are discussing and aggregating the video, with some including snippets from the original YouTuber's video itself. One far-right YouTuber suggested that the man "may have just jumped started the white awakening."

After a Motherboard investigation showed Facebook banned white supremacy while allowing white nationalism, the tech giant decided to ban support of the latter. Although this particular video of the Christchurch manifesto did violate YouTube's policies, the company previously refused to commit to banning white nationalism in general.

Google did not respond to multiple requests for comment.

Update: This piece has been updated to include comment from Becca Lewis.
