Facebook Stopped the 'Plandemic' Sequel But Not the Lies Behind It

In April, when the pandemic was peaking around the world, Facebook served up 460 million views to coronavirus misinformation sites.

On Tuesday, Facebook demonstrated just how effective it can be at stopping the spread of misinformation when it blocked the sequel to viral coronavirus conspiracy theory video “Plandemic” from spreading online.

But hours later, a damning report revealed the true scale of health misinformation being shared on Facebook, leading the researchers behind the report to label Facebook “a danger to public health.”

The conspiracy video — officially titled “Plandemic: Indoctornation” — was released on Tuesday and contained the usual mix of anti-vaxx conspiracy theories and false claims, such as the assertion that Bill Gates knew about the coronavirus before the outbreak began. It repeated much of the misinformation seen in the original video, which spread virally on Facebook and other platforms when it was released in May.

This time around, however, the makers of the video announced its release in advance, which gave social networks time to prepare.

“Given the previous Plandemic video violated our COVID misinformation policies, we blocked access to that domain from our services,” Facebook spokesman Andy Stone told VICE News.

“This latest video contains COVID-19 claims that our fact-checking partners have repeatedly rated false, so we have reduced its distribution and added a warning label showing their findings to anyone who sees it.”

Twitter added a warning label to links to the video, while YouTube is removing the relatively few uploads it is seeing. LinkedIn even blocked the account of the group publishing the video.

The result is that the video is not easily accessible on mainstream platforms.

READ: The star of 'Plandemic' spent years flooding the vaccine court system with bad science

There are still some links to the video being shared on Reddit, but they’re seeing little engagement from users. On underground message board 4chan, users are sharing links to copies of the video hosted on BitChute, a video-hosting service known for accommodating far-right individuals and conspiracy theorists.

On BitChute, the video has been viewed over 30,000 times, but compared to the original, which racked up over 10 million views, the sequel has so far failed to gain much traction online.

Facebook’s success at stopping the Plandemic sequel from going viral hides a much bigger problem of health misinformation on the platform. According to a nine-month study by the nonprofit activist group Avaaz, health misinformation on Facebook was viewed over 3.8 billion times in the past year, peaking during the COVID-19 crisis.

In April alone — when the pandemic was peaking around the world — Facebook served up 460 million views to these misinformation sites.

While hundreds of groups and publishers pushed health disinformation on Facebook, just 42 Facebook pages, followed by a combined 28 million people, alone accounted for 800 million of those views.

These pages include RealFarmacy, one of the biggest websites spreading health misinformation, and GreenMedInfo, a site that presents health misinformation as science.

READ: Facebook’s algorithm is ‘actively promoting’ Holocaust denial content

The report also found that the top 10 misinformation sites garnered four times more views than equivalent content from the websites of 10 leading health authorities, such as the WHO and the CDC.

Just 16 percent of the health misinformation Avaaz analyzed carried a warning label from Facebook, despite the content having been fact-checked. “The other 84 percent of articles and posts sampled in this report remain online without warnings,” the report said.

"Facebook’s algorithm is a major threat to public health," Fadi Quran, campaign director at Avaaz, said in a statement. “Mark Zuckerberg promised to provide reliable information during the pandemic, but his algorithm is sabotaging those efforts by driving many of Facebook’s 2.7 billion users to health misinformation-spreading networks.”

Facebook, along with the other social media platforms, was quick to publicly acknowledge the threat facing users from COVID-19 hoaxes and misinformation — but the Avaaz report suggests that the steps it took to stop the threat have not succeeded.

Facebook said it “shares Avaaz's goal of limiting misinformation” and pointed out that its global network of fact-checkers applied warning labels to 98 million pieces of COVID-19 misinformation and removed 7 million pieces of content “that could lead to imminent harm.”

Facebook also highlighted the resources from health authorities that it has made available to its more than 2 billion users.

But it’s clear that the scale of the problem facing Facebook is overwhelming.

Last week, the American Journal of Tropical Medicine and Hygiene published a study showing that online misinformation about a coronavirus “cure” — ingesting highly concentrated alcohol — led to the deaths of at least 800 people worldwide in the first three months of the year, and the hospitalization of thousands more.

And now experts are worried that unless Facebook takes action to improve how it handles health misinformation, the platform could become the primary vector for campaigns to discourage people from taking a COVID-19 vaccine.

“Facebook is rife with medical misinformation. In the middle of a pandemic, that makes the platform a public health threat,” Dr. João Miguel Grenho, secretary-general of the European Union of Medical Specialists, said in a statement. “Mark Zuckerberg must take immediate action to stand with us to stop this infodemic; otherwise the number of people poisoned against taking a vaccine will be too high for us to beat this pandemic.”

Cover: In this Feb. 28, 2011, file photo, director of research Judy Mikovits talks to a graduate student and research associate in the lab at the Whittemore Peterson Institute for Neuro-Immune Disease, in Reno, Nev. Tech companies scrambled to take down a 26-minute documentary-style video called “Plandemic” of Mikovits promoting a string of questionable, false and potentially dangerous coronavirus theories. (David Calvert for AP Images, File)