Acknowledging what many have long suspected, Facebook admitted Thursday that nations have tried to use the platform as a propaganda machine, directing covert campaigns to influence public opinion by spreading misleading information and manipulating users.
But Facebook now understands how these “information operations” work, and it’s ready to fight back.
In a white paper penned by the social media giant’s security team, Facebook outlined just how subtle such operations can be — they include more than just spreading so-called “fake news,” a term the company cautions against using, since it’s become a meaningless catch-all. Not only have countries and non-state groups set up fake accounts dedicated to spreading hacked or false information, but they’ve also used these accounts to cajole Facebook users into believing specific narratives and thus further certain political or ideological outcomes.
Such operations have occurred, the company admitted, during the French and American presidential elections. Facebook also revealed that it had “taken action” against more than 30,000 fake accounts — or “false amplifiers,” which are often operated not by bots but by humans — in France.
And these false amplifiers do more than just spit out posts: They also interact with real users. “In several instances, we identified malicious actors on Facebook who, via inauthentic accounts, actively engaged across the political spectrum with the apparent intent of increasing tensions between supporters of these groups and fracturing their supportive base,” the paper explained.
Facebook also revealed that it had monitored “several situations” during the U.S. presidential election, including one in which groups had stolen data from outside of Facebook and then exploited the platform, sharing the data and pushing specific narratives “with the intent of harming the reputation of specific political targets.” That sounds pretty similar to when Clinton campaign chairman John Podesta’s email was hacked and his information published by WikiLeaks.
Facebook didn’t directly name who the instigator of these information operations might be, though the company did note that “our data does not contradict” a U.S. Director of National Intelligence report on Russian interference in the election.
However, the company also warned that Russia’s influence was likely not as widespread or influential as one might think, as “the reach of known operations during the U.S. election of 2016 was statistically very small compared to overall engagement on political issues.”
While Facebook outlined several measures it will use to combat these propaganda campaigns in the future — such as curbing the spread of false news, strengthening Facebook’s security against hackers, and continuing to monitor and shut down false amplifiers — the paper also pointed out that the company can’t do this alone: “In the end, societies will only be able to resist external information operations if all citizens have the necessary media literacy to distinguish true news from misinformation,” the paper reads, “and are able to take into account the motivations of those who would seek to publish false news, flood online forums with manipulated talking points, or selectively leak and spin stolen data.”