Thirteen months ago, Facebook announced its big plans to combat election misinformation.
In an October 2019 post, Facebook’s top executives confidently laid out the myriad ways the platform was going to prevent a repeat of what happened in 2016, when Russian government agents infiltrated the platform to sow confusion and chaos.
No, this time around it was going to be different. Facebook was going to be more transparent, limit the spread of misinformation before it went viral, and ensure no foreign government could interfere in the outcome of the 2020 election.
But reading that blog post now, two weeks after the election, it is clear the company failed to adequately account for one major factor: President Donald Trump, whose posts are now among the most popular on the entire platform.
At pretty much every turn, Facebook has failed to stop or even limit the president's ability to spread baseless conspiracy theories and misinformation about the election results. The measures Facebook did put in place had little to no impact on Trump's ability to stoke fear and anger among Americans who believe the election was stolen.
Facebook’s own internal data shows that the labels the company placed on Trump’s posts failed to have any meaningful effect on their ability to be shared, according to internal discussions leaked to BuzzFeed News.
“We have evidence that applying these [labels] to posts decreases their reshares by ~8%,” a data scientist posted on Facebook’s internal social network, known as Workplace.
“However, given that Trump has SO many shares on any given post, the decrease is not going to change shares by orders of magnitude.”
Unlike Twitter, which has labeled Trump’s posts as inaccurate and removed the ability for them to be retweeted, liked, or commented on, Facebook has simply placed an informational label on posts from the president and all other candidates. The label does not indicate that what Trump has been saying is false and it doesn’t stop users from sharing it.
The result is that Trump’s posts — which in recent days have included a conspiracy theory about voting machines that came directly from the QAnon community — have become some of the most popular posts on all of Facebook.
According to disinformation researcher Natalie Martinez, who has been tracking Trump’s messages on Facebook since election day, Trump’s page holds the largest share of the top 2,000 posts on the platform over that period, and by far the largest share of total interactions earned on top posts.
Martinez points out that more than two-thirds (67 of 97) of Trump’s top posts contain election-related misinformation, and many of them have since been labeled as such.
Facebook spokesperson Andy Stone responded by saying the posts Martinez highlighted had been labeled.
But Martinez pointed out that she was referring specifically to misinformation labeling that Facebook has used on other content and which hides the content of a post from users unless they click through to view it.
“I would call Facebook's strategy for combatting post-election misinformation a colossal failure if the sheer stupidity of that strategy did not clearly suggest they had no intention of succeeding in the first place,” Martinez tweeted Monday.
But how could Facebook have ever predicted that Trump would react in such a childish and dangerous manner to losing an election? Well, it could have listened to its own employees.
In June, after Trump’s incendiary “when the looting starts the shooting starts” post about the Black Lives Matter protests, a group of employees from Facebook’s Civic Integrity Group approached management, according to sources speaking to the technology-focused news site The Information.
They wanted CEO Mark Zuckerberg and his team to implement a new strategy that would impose much stricter penalties on politicians and other public figures who breached the company’s policies.
Among the proposed sanctions was limiting a post’s ability to go viral by removing the options to share or comment on it.
After considering the proposals for a couple of weeks, Zuckerberg decided to veto the suggestion, opting instead for the softer approach we have seen rolled out in recent weeks.
And we all know how well that’s working out.