News

How YouTube’s algorithm prioritizes conspiracy theories

A former YouTube engineer is speaking out against the company’s recommendation algorithm, saying it is designed in a way that can end up promoting conspiracy theories.

Guillaume Chaslot, who holds a Ph.D. in computer science and worked on YouTube’s recommendation algorithm in 2010, says that at the time, the company tuned the algorithm to optimize for one key metric: keeping viewers on the site as long as possible to maximize “watch time.”
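To make that objective concrete, here is a minimal, hypothetical sketch of how a recommender might rank candidate videos when optimizing for watch time alone. The sample videos, the predicted_watch_time field, and the ranking function are illustrative assumptions, not YouTube’s actual code or data.

```python
# Hypothetical sketch: ranking candidates purely by predicted watch time.
# This does not reflect YouTube's actual implementation; it only
# illustrates the single-metric objective Chaslot describes.

candidates = [
    {"title": "Flat earth 'proof' compilation", "predicted_watch_time": 14.2},
    {"title": "How vaccines are tested",        "predicted_watch_time": 6.5},
    {"title": "Ten-minute cooking tutorial",    "predicted_watch_time": 4.1},
]

def rank_by_watch_time(videos):
    """Order videos by predicted minutes watched, highest first."""
    return sorted(videos, key=lambda v: v["predicted_watch_time"], reverse=True)

for video in rank_by_watch_time(candidates):
    print(f'{video["predicted_watch_time"]:5.1f} min  {video["title"]}')
```

Under this objective, whatever a viewer is predicted to watch longest rises to the top, regardless of the video’s accuracy.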


Chaslot says that conspiracy videos, like those claiming the earth is flat or linking autism to vaccines, were more likely to be recommended because of that focus on watch time. In a statement to VICE News, a YouTube spokesperson said the company still considers watch time in its algorithm but now also factors in another metric: user satisfaction.
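The updated objective the spokesperson describes could be pictured as a weighted blend of the two metrics. The weights and the satisfaction signal below are invented for illustration; YouTube has not published how the metrics are actually combined.

```python
# Hypothetical sketch of a blended objective: watch time plus a user
# satisfaction signal (e.g., survey responses). The 0.6/0.4 weights are
# invented; YouTube has not disclosed its actual scoring formula.

def blended_score(predicted_watch_time, satisfaction,
                  watch_weight=0.6, satisfaction_weight=0.4):
    """Combine normalized watch time (0-1) and satisfaction (0-1)."""
    return watch_weight * predicted_watch_time + satisfaction_weight * satisfaction

# A long-watch, low-satisfaction video can now rank below a shorter,
# well-received one: the shift the spokesperson describes.
print(blended_score(predicted_watch_time=0.9, satisfaction=0.2))  # 0.62
print(blended_score(predicted_watch_time=0.6, satisfaction=0.9))  # 0.72
```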

Chaslot has since created a tool, dubbed AlgoTransparency, that he says shows conspiracy videos on YouTube are still among the most likely to be recommended. VICE News met with Chaslot to discuss how YouTube’s algorithm works, and how he plans to build similar tools for the Facebook and Twitter algorithms.

This segment originally aired on March 5, 2018, on VICE News Tonight on HBO.