VICE News

YouTube thinks Wikipedia can solve its fake news problem. Wikipedia isn't so sure.

“Wikipedia is an encyclopedia, not a newspaper.”

by David Gilbert
Mar 14 2018, 11:37am

YouTube announced Tuesday that links to Wikipedia articles will soon appear below select videos on its platform in a bid to stop the spread of fake news.

The company said the new feature would roll out “in the coming months” but offered little detail about the type of videos that would be targeted, apart from one example: “NASA’s moon landing was a hoax.”

“We will show as a companion unit next to the video information from Wikipedia showing information about the event,” Susan Wojcicki, YouTube’s CEO, said on stage at the SXSW conference in Austin.

Wojcicki said the platform is “always exploring new ways to battle misinformation on YouTube” and that Wikipedia would be just one of the third-party sources featured in the companion unit.

However, Wikipedia’s own breaking news page noted that the site is not well-equipped to handle this type of situation. “Wikipedia is an encyclopedia, not a newspaper. Our processes and principles are designed to work well with the usually contemplative process of building an encyclopedia, not sorting out the oft-conflicting and mistaken reporting common during disaster and other breaking news events,” the article said.

Even Katherine Maher, executive director of the Wikimedia Foundation, the non-profit entity that owns Wikipedia, voiced concern, saying the partnership was agreed to without her organization’s involvement.

YouTube has come under significant pressure to clean up its platform in recent months, not only from regulators and lawmakers but also advertisers who don’t want their brands associated with fake news.

Last month YouTube was hammered after an Infowars video claiming some of the survivors of the Parkland school shooting were crisis actors hit the number one position on the site’s trending section days after the Feb. 14 massacre.

While YouTube says it has rolled out several updates to eliminate such videos from its search results, problems remain.

No Explanation

The company said Tuesday it couldn’t explain why another Infowars video claiming, without evidence, that Antifa was the “prime suspect” in the mysterious Austin package bombings appeared at the top of its search results.

Also Tuesday, the company failed to give a credible explanation to Britain’s Home Affairs Committee about why it had not deleted a video of a white supremacist speech from its platform. U.K. lawmaker Yvette Cooper called YouTube’s evidence “shockingly weak.”

Wojcicki was asked Tuesday how the company decides what counts as a credible and trustworthy source. She refused to say what factors go into the decision, saying only that they “are usually complicated algorithms.” She did admit that factors such as the “number of journalistic awards” and the “amount of traffic” a source has are considered.

The news that YouTube was working with Wikipedia immediately raised questions about the trustworthiness of the openly editable online encyclopedia.

Others noted that the partnership could lead to mass exploitation of Wikipedia given that anyone can edit the pages.

Cover image: A finger touching the logo of the YouTube video sharing website on a computer screen. (Sergei Konkov/TASS via Getty Images)