
YouTube and Facebook Began Tracking Misinfo Immediately After Trump's COVID Diagnosis

Both platforms have struggled to handle QAnon and COVID-19 conspiracy theories. Now the President is sick.

Hours after President Donald Trump confirmed early Friday morning that he had contracted the coronavirus, QAnon followers were already crafting conspiracy theories to explain away the news. YouTube and Facebook, two of the largest disseminators of conspiracy theories and misinformation, both say they immediately began tracking the spread of Trump-coronavirus conspiracy theories after his announcement.


In a statement to Motherboard, YouTube spokesperson Alex Joseph said, “We wish the President and First Lady a speedy recovery. Within minutes of their diagnosis being made public, our systems began surfacing authoritative news sources on our homepage, as well as in search results and watch next panels regarding the President and COVID-19.”

Facebook said it immediately began tracking conspiracy theories and that it will work with fact checkers to label and remove disinformation. YouTube added that in addition to amplifying videos from trusted sources, it is attempting to reduce the spread of content that comes close to breaking—but does not break—its official community guidelines.

Despite these rules and efforts, of course, the sheer scale of these social media sites means that conspiracy theories persist, often gain wide traction, and ultimately trickle out to less extreme media outlets and posters. For years, YouTube, Facebook, and Twitter have struggled to tackle QAnon, white nationalists, and reactionary right-wing networks in general.

Perhaps no social media platform has played a bigger role in empowering QAnon than Facebook. After widespread criticism over its failure to contain the cult, Facebook has promised to take serious steps to fight its growth. In September, the New York Times reported that Facebook had not only failed to stem the tide of new QAnon groups and followers, but that its recommendation engine "pushed users toward the very groups that were discussing QAnon conspiracies." A review by The Associated Press found the company was failing to enforce "even the limited restrictions" it had put in place to stop the spread of QAnon propaganda.

Facebook told Motherboard that it was closely monitoring the situation for possible content violations and opportunities for fact-checking. Given the magnitude and momentum of the problem, those measures seem to lend weight to complaints and findings that Facebook is doing too little to contain the exponential spread of this material across its networks.

YouTube, which has also struggled with QAnon conspiracies, has likewise tried to fight rampant COVID-19 misinformation on its platform. In May, a study found that a quarter of the most popular videos about the virus were misleading. A BBC investigation found that of the 41 complaints the UK's Center for Countering Digital Hate filed between July and August about anti-vaccination posts that violated YouTube’s guidelines, YouTube responded to none. And while YouTube has begun to excise various conspiracy theorist networks thriving on its platform, as one former Google engineer told Wired for a story about YouTube’s recent crackdown: “The time to do this was years ago.”