
Hairdryers, Incense, and Garlic Water: Facebook Is Still Letting People Spread Fake Coronavirus ‘Cures’

Facebook said it would fix its dangerous coronavirus misinformation problem, but a new report reveals how badly it's failing.


Have you heard the good news? Coronavirus is destroyed by chlorine dioxide. And black people are resistant to it. Have you got a hair dryer? Good, because those can also be used to prevent coronavirus.

And if none of the above works, you can always cure the virus with one bowl of freshly boiled garlic water.

This is just a tiny sample of the coronavirus-related misinformation that has flooded Facebook in recent weeks, and despite the company’s claims that it's responding quickly and aggressively to these issues, a new report lays out the huge shortcomings in Facebook’s ability to tackle this problem.


“Facebook sits at the epicenter of the misinformation crisis,” Fadi Quran, a campaign director with nonprofit rights group Avaaz, said in an emailed statement. “Its current anti-misinformation efforts remain slow and insufficient in limiting the spread of coronavirus misinformation, even when the content is flagged by the platform’s very own fact-checking partners.”

Between Jan. 21 and April 7, Avaaz conducted a survey of over 100 pieces of misinformation that had already been debunked by independent fact-checkers, many of them partners in Facebook’s own fact-checking program.

Despite being proven false, these posts were shared 1.7 million times and were viewed an estimated 117 million times across six languages.

READ: The Chinese government has convinced its citizens that the U.S. Army brought coronavirus to Wuhan

The revelations come weeks after Facebook said it would “remove COVID-19-related misinformation that could contribute to imminent physical harm,” and that “once a post is rated false by a fact-checker, we reduce its distribution so fewer people see it, and we show strong warning labels and notifications to people who still come across it, try to share it, or already have.”

But Avaaz’s report shows that Facebook is simply not living up to these promises.

At the time of the report’s publication on Thursday, 41% of the misinformation content Avaaz monitored remained on the platform without any warning labels, even though two-thirds of those posts had been debunked by partners in Facebook’s fact-checking program.


The problem is even greater in non-English-speaking markets. Over half (51%) of non-English misinformation content had no warning labels. In Spain and Italy, the two European countries worst-hit by the pandemic, that figure rises to 68% and 70%, respectively.

READ: These videos show people burning down 5G cell phone towers over coronavirus conspiracy theories

Facebook also takes a long time to flag some content as misinformation. The researchers found that it can take up to three weeks to downgrade and issue warning labels on false content — and that’s after Facebook’s own fact-checking partners, as well as the World Health Organization and local health authorities, highlight the problems and issue corrections.

“The scale of this ‘infodemic’ along with Facebook's reluctance to retroactively notify and provide corrections to every user exposed to harmful misinformation about the coronavirus is threatening efforts to ‘flatten the curve’ across the world and could potentially put lives at risk,” the report’s authors said.

But Facebook says that Avaaz’s report is “not representative of the community on Facebook and their findings don't reflect the work we've done.”

The company said that 350 million people had viewed its COVID-19 Information Center, though it has alerted more than 2 billion people to the resource.

A Facebook spokesperson added that it had applied its strongest warning labels — False or Partly False — to about 40 million posts on Facebook, based on around 4,000 articles by its independent fact-checking partners. In about 95% of those cases, users did not go on to view the original content, the company said.


But even when Facebook does act, it’s often too late.

One harmful misinformation post that claimed people can rid their bodies of the virus by drinking a lot of water and gargling with water, salt, or vinegar was shared over 31,000 times before eventually being taken down after Avaaz flagged it to Facebook.

But there were already 2,611 clones of that false and misleading post on the platform with over 92,246 interactions. Most of these cloned posts remain online and have no warning labels from Facebook.

Avaaz has for some time been pushing Facebook and other social media companies to implement a “Correct the Record” policy that would see the companies retroactively alert users who had already interacted with misinformation.

A new academic study by researchers from George Washington University and the Ohio State University, commissioned by Avaaz, shows that issuing corrections from independent fact-checkers every time a Facebook user is exposed to misinformation can cut belief in that disinformation nearly in half on average (49.4%), and by as much as 61%.

On Thursday, Facebook announced that, as a result of Avaaz’s advocacy, it would begin implementing a new system.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” Guy Rosen, Facebook’s head of integrity, told VICE News in an emailed statement.


But an image of the new system seen by VICE News does not appear to match the version envisioned by Avaaz. Below is Facebook’s version on the left and Avaaz’s version on the right.

Facebook’s notification (left) vs. Avaaz’s proposed correction (right). Image: Facebook/Avaaz

Rather than highlighting a specific piece of misinformation, Facebook’s alert looks like a generic message at the top of a user's News Feed, similar to the alerts the company has rolled out already.

Facebook would not say where the new measures would be rolled out, or when, beyond “the coming weeks.” Typically, the company has launched new features in English-speaking markets before rolling them out to the rest of the world.

But as Avaaz’s report highlights, Facebook’s misinformation problems are worst in non-English-speaking countries.

One stark example is an Italian video in which a man who claims to be a medical doctor says that staying home is a useless defense against the coronavirus, and that our natural bacteria, combined with burning incense, are enough to fend off infection.

The video had racked up over 1 million views on Facebook as of April 7, according to Avaaz. At the time of publication, the video is still live on Facebook’s platform.


Cover: Garlic cloves (Allium sativum) can be seen at the weekly market in Langenhagen, Germany, 13 June 2017. Photo by: Holger Hollemann/picture-alliance/dpa/AP Images