How YouTube Drives Shane Dawson and Other Creators to Conspiracy Theories

YouTube structurally prioritizes audience feedback, which can drive content-makers to promote conspiracy theories or hate.
Image: Illuminati eye and YouTube logo over fire.

At the end of last month, YouTuber Shane Dawson, who has over 20.5 million subscribers, posted a video titled “Conspiracy Theories with Shane Dawson.” The video promoted harmful conspiracy theories, including several about the California wildfires that killed more than 80 people and destroyed more than 11,000 homes.

In the nearly two-hour video, Dawson presented a series of false narratives claiming the wildfires were caused by directed energy weapons, kitchen microwave explosions directed by electric companies, or people burning down their own houses for insurance money. He sprinkled the video with disclaimers that these were “theories, not facts.” Dawson posted a follow-up, “Investigating Conspiracy Theories with Shane Dawson,” yesterday. In it, he explores the conspiracy theory that Chuck E. Cheese's re-serves leftover pizza to new customers (Chuck E. Cheese’s has since unequivocally denied this) and interviews a friend who alleges that she and her son were nearly victims of human trafficking.

The theories about the California wildfires were presented using recycled footage and narratives from fringe, conspiracy-driven corners of YouTube, made by creators who have also been known to promote anti-Semitic conspiracy theories. Dawson also tried to argue that iPhones secretly record everything you say, a claim that has been conclusively debunked, and promoted a conspiracy theory that the app Zepeto records its users; according to the app’s iOS permission settings and its privacy policy, Zepeto does not use or store microphone data.

When Motherboard looked at other conspiracy videos promoting “directed energy weapons” in California, one of them already had three comments from viewers saying they had sought out the video because of Dawson. The same account has posted videos claiming that the Clinton family practices demonic occultism.

Dawson also placed these California wildfire theories in the same video as verifiably true but unrelated things: the existence of deepfakes (which Motherboard has covered extensively), the fact that supermarket layouts are designed to make you buy more items, and the fact that violence and suicide are common in children’s cartoons. These are not “conspiracy theories.” They’re just creepy. But placing them alongside false conspiracy theories can make the false theories seem more legitimate by association. Dawson did not respond to Motherboard’s request for comment.

Dawson’s recent video was actually demonetized by YouTube a few hours after it was posted, but only for a day. As The Verge explained, Dawson’s video exists in a grey area: the conspiracy theories are presented merely as possibilities. By framing them that way, he avoided serious penalties from YouTube, and his videos have been rewarded by his audience. The first video has more than 30 million views at the time of publication; the second has nearly 8 million views a day after it was posted.

Many YouTubers, including Dawson, have realized that fringe content, like conspiracy theories and far-right political commentary, is successful on the site. Because such videos are rewarded with engagement and views, YouTubers are incentivized to create content that edges further and further toward the extreme. This phenomenon has no easy fix because it’s built into the structure and model of YouTube as a platform.

Becca Lewis, a researcher who studies political subcultures at the internet research group Data & Society, told Motherboard that across the platform, there are incentives for YouTubers to create more extreme content.

“It can be almost impossible to parse out any one, individual influencer’s motivations,” Lewis said. “But I think it’s important to note that there are some trends on YouTube where you can see that inflammatory, or political, or conspiratorial content often drives viewership. And so there can be a direct financial motivation for content creators to make this kind of content.”

In her report “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” Lewis explained that not only can YouTubers radicalize their audience, but the reverse can be true: audiences can radicalize creators and drive them to make more extreme content.

“The easy feedback systems on YouTube lead to discursive loops, in which influencers build audiences that ask for, or reward, certain types of content,” Lewis wrote in her report. “For many of the political influencers in the AIN [Alternative Influence Network], the more extremist content they make, the more of an extremist and dedicated audience they build.”

A creator like Dawson may risk demonetization with conspiracy videos, but he has retained sponsorship deals with companies like Honey and SeatGeek, which he advertises in the videos themselves. Dawson’s conspiracy videos also get a very strong response from his audience, which can help build or reinforce its loyalty. Dawson has been making videos on YouTube for almost a decade, and his content has included vlogs, creator “challenges” like food tastings or product testing, and more recently, feature-length videos exploring particular topics or people. Yet two of his ten most-viewed videos of all time are titled “MIND BLOWING CONSPIRACY THEORIES.”

Dawson isn’t the only YouTuber whose shift toward more extreme content was arguably driven, at least in part, by audience demand.

There’s also the case of Blaire White, a far-right political commentator who critiques feminism and nonbinary gender identities from the perspective of a transgender woman. As described in Lewis’s Alternative Influence report, White began her channel with comparatively balanced political videos that conceded certain feminist points (like the existence of male privilege). But as she received comments suggesting that she collaborate with the established far-right YouTuber Carl Benjamin (Sargon of Akkad), her videos took a hard right turn toward the political extreme.

Interestingly, White posted a video in September titled “Why I've Changed (The Truth)” in which she claimed she would stop making far-right political content, not because her beliefs had changed, but because of harassment and a sense that people within the far-right YouTube community were faking their beliefs for viewership.

“An amount of them that I’ve met that have told me either directly or indirectly or in a roundabout way that’s very clear that they don’t believe everything that they say that they believe when they’re on camera,” White said. “A lot of them just don’t believe. They’re just actors.”

Similarly, YouTuber Moe Othman started his channel about six years ago making personality-driven vlogs. Then, two years ago, he started making conspiracy theory videos. His conspiracy videos would average between 50,000 and 80,000 views, while his non-conspiracy videos would rarely crack 10,000. He eventually pivoted to exclusively conspiracy theory content, some of which has received hundreds of thousands of views. (Some of these videos promoted the false “directed energy weapon” conspiracy theory about the California wildfires.)

Earlier this year, Othman posted a video in which he said he would be pivoting away from conspiracy theory content. Why? He claimed it was making him sad.

“Making all of those Illuminati videos that I did, it caused a lot of sadness in me,” Othman said. “And it’s weird because I already knew about all of that. I know worse than the stuff that I made in my videos. So why did making Illuminati videos make me sad? It made me lose a lot of enthusiasm, and a lot of optimism.”

Despite posting that video, Othman has continued to make conspiracy theory-related content.

According to Lewis, this pivoting back and forth may reflect a delicate tension that YouTubers are trying to manage: how do you brand yourself toward the extreme to maximize viewership without alienating too many people, while still remaining authentic?

“I think that you see people in these political spaces negotiating that tension over time,” Lewis said. “Particularly with political content, it’s easy to get shoehorned into a specific grouping. So a lot of them are flirting with these extremist ideas. But at a certain point, some want wider viewership than they would get if they become known as full-on white nationalist or full-on white supremacist. So you see a lot of them negotiating these boundaries and flirting with certain ideas, or posting certain kinds of content, and later deleting that content, and going on to different kind of content.”

Lewis said that when YouTubers build up an extreme audience, they get locked into the demands of that audience. After all, if a YouTuber completely de-radicalizes, their audience might get bored and unsubscribe. At that point, the YouTuber would have to build an audience all over again.

She pointed to Andy Warski as an example of this risk. Warski built a reputation for hosting “debates” about the merits of white nationalism with YouTuber Jean-Francois Gariépy, but after Warski departed from these collaborations and was accused of not being an “authentic” enough white nationalist, he has struggled to build an audience of his own (even though he is still, undoubtedly, a far-right figure).

Of course, it’s very possible that for some people, the drive to create conspiracy-related videos comes from a place of genuine interest and belief. Dawson, for instance, has created several conspiracy theory-related videos over the past couple of years, and he claimed in a tweet that YouTube deleted several of them. It’s noteworthy that this tweet was a reply to InfoWars’s Paul Joseph Watson, who had tweeted about Dawson’s feature-length conspiracy video. “Love ur stuff btw!!” Dawson added.

Dawson also hosted far-right YouTuber Blaire White on his podcast “Shane and Friends” not once, but twice. On YouTube, a collaboration is often functionally equivalent to an endorsement, as Lewis explained in Alternative Influence.

YouTube announced that it would take steps toward improving recommendations on the site a few days after a BuzzFeed News investigation showed how the site leads people down conspiracy theory rabbit holes, a problem that has persisted for years. But the forces driving YouTubers to create extreme content won’t be easily fixed.

After all, Lewis’s Alternative Influence report explains that audience feedback metrics like likes, dislikes, and comments are “directly built into YouTube’s interface.” Those metrics are central to YouTube, and those metrics are subject to the politics of YouTube.

“YouTube the company has always positioned itself as this alternative to mainstream news, mainstream entertainment, [and] really the mainstream media as a whole,” Lewis said. “And so, you get these people on the far right that are explicitly anti-mainstream media because they think it has a liberal bias, and are using YouTube as a place to promote what they see as alternative narratives, and often those end up having a conspiratorial bend.”