The Christchurch mass shooter was not radicalized on obscure fringe extremist websites like 4Chan and 8kun, but on YouTube and Facebook.
That is the conclusion of an almost 800-page report into the atrocity that took place on March 15, 2019, when Brenton Tarrant murdered 51 people and injured 40 at two mosques in the New Zealand city of Christchurch.
“We have no doubt that he visited right-wing internet forums, subscribed to right-wing channels on YouTube and read a great deal about immigration, far-right political theories, and historical struggles between Christianity and Islam,” the report states.
But it was YouTube where the shooter says he found his “inspiration,” rather than the more extreme websites.
He was “not a frequent commenter on extreme right-wing sites” he told investigators, adding that “YouTube was, for him, a far more significant source of information and inspiration.”
New Zealand’s prime minister Jacinda Ardern flagged this part of the report in her speech responding to its publication, and warned that “this is a point I plan to make directly to the leadership of YouTube.”
Critics of YouTube had highlighted the dangers the company’s algorithm posed, pointing out that it recommended ever more extreme videos, a process that helped radicalize people, particularly those on the far right.
YouTube initially denied there was a problem, and only after the Christchurch shooting, and investigations detailing the link between YouTube and far-right radicalization, did the company alter its algorithm to try to address the issue.
The report also revealed that the shooter donated money to the YouTube channel of far-right white nationalist Stefan Molyneux, and to the U.S.-based National Policy Institute, a white nationalist think tank run by Richard Spencer. And, he sent money to the Daily Stormer and the Right Stuff, two of the biggest white supremacist websites in the U.S., a month after the Unite the Right rally in Charlottesville in 2017.
Then there’s the role Facebook played in shaping the shooter’s mindset in the years before he carried out the attack, which he streamed on Facebook Live.
He was a member of several high-profile far-right extremist Facebook groups based in Australia, including the United Patriots Front, True Blue Crew, and the Lads Society. Facebook banned some of these groups, but members quickly reorganized on the platform under new names.
The report flagged a number of comments the shooter made in one of these groups, in which he discussed Mein Kampf, Adolf Hitler’s autobiographical manifesto, framing it in terms of a perceived threat from Muslim immigrants.
Within the relative safety of these private Facebook groups, the shooter also discussed specific Muslim individuals and groups that he believed were replacing Catholics in New Zealand.
The report points out that the shooter was well aware of Facebook’s rules and knew how to stop short of openly endorsing violence, thereby ensuring his account was not blocked.
But his carefully crafted online persona didn’t fool everyone. In a Facebook message, his own mother called him a “neo-Nazi” because of his far-right rhetoric and shaved head. He scolded her for the comment, not because he objected to being labeled a neo-Nazi, but because he was worried Facebook would detect the comment and ban him.
YouTube and Facebook did not immediately respond to requests for comment about the report’s findings.