This article originally appeared on Motherboard in the US
On Monday afternoon, as people around the world started tuning in to YouTube live-streams of a massive fire that’s destroying the Notre Dame cathedral in Paris, some viewers noticed something odd. Beneath some of the YouTube videos, the platform inserted an Encyclopedia Britannica link and preview for the September 11, 2001 terror attacks.
At least two YouTube live streams from two major media outlets—CBS and NBC—included this link.
This is YouTube’s fake news prevention feature at work—but instead of fact-checking a conspiracy theory, it’s making a connection that could spread more conspiracy theories.
YouTube’s “information panel,” launched last year, is meant to provide further context on videos that deal with controversial topics. According to YouTube’s description of the information panel (accessed by clicking “Why am I seeing this?” on the panel’s options), panels are meant to address “a small number of well-established historical and scientific topics that have often been subject to misinformation online.”
While it’s not yet clear what started the fire, only a few hours after it began burning, officials told BBC News that it could be linked to ongoing construction on the historic structure. No officials have suggested this was a terror attack, or that arson was involved, but YouTube’s algorithm made that association anyway.
I reached out to YouTube to clarify how this information panel came to populate alongside news coverage of the fire. A spokesperson said:
“We are deeply saddened by the ongoing fire at the Notre Dame cathedral. Last year, we launched information panels with links to third party sources like Encyclopedia Britannica and Wikipedia for subjects subject to misinformation. These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.”
If the algorithm relies on machine-vision matching or keyword matching, it’s possible it associated large, burning national landmarks with 9/11.
As Bob West, assistant professor at the Swiss Federal Institute of Technology and a researcher who has studied how “bullshit” enters platforms like Wikipedia, told me in March 2018 when YouTube announced the information panel: There are “so few of us” and so much garbage circulating online, and algorithms are employed to help. But when they go wrong, the results look like this. “You don’t want [algorithms] to become self-referential... If you take the human out of the loop completely the algorithm reproduces its own actions, and bites its own tail,” he said.
In this failure of the information panel to inform anyone of anything useful—instead suggesting a misleading cause for the fire—we have an example of YouTube biting its own tail, in real time.