


Facebook Can’t Fix Political Divides With an Algorithm

Online platforms inadvertently helped create chaos in US politics, but Harvard researchers believe the solution must come from elsewhere.

This article originally appeared on Motherboard.

If you're concerned that American political discourse is seriously broken, there's bad news: You're right, and the technological forces that shaped it can't be the ones to substantively provide a fix.

A new report on disinformation, partisanship, and online media released on Wednesday by Harvard's Berkman Klein Center for Internet & Society takes a grim view of the state of political discourse.


"Our observations suggest that fixing the American public sphere may be much harder than we would like," the report's authors write, explaining that the problem is not simply that platforms like Facebook and Twitter have amplified false news. Rather, they say, many Americans, particularly on the conservative end of the political spectrum, have opted in to political communities and followed media outlets that reinforce beliefs they already hold and which tend to refer, for the most part, to an echo chamber of outlets that share their views. This creates a serious problem if you want to find a fix to the problem of disinformation online, because doing so both technologically and democratically seems basically impossible:

[T]he efforts to find a technological fix—through changes to the Facebook algorithm or a fact-checking app—are much less likely to be either effective or normatively justifiable if they mean intentional disruption of a class of political communication desired by its recipients and intended to forge a powerful political connection within a substantial wing of the American public.

This problem hinges on the difference between explicitly fake news—the type of stories created by Macedonian teens and others purely trying to make some money—and disinformation, which tends to involve interpreting facts misleadingly and allowing consumers to infer false conclusions rather than offering outright falsehoods. Analyzing approximately 2 million stories that circulated during the election, mapping which stories and outlets linked to one another, and studying where and how often stories were shared on Facebook and Twitter, the report's authors found that disinformation played a much more significant role than outright fake news, especially on the political right.


The report found that news consumers on the right aren't the only ones who seek to have their views validated. But during the 2016 election season, right-wing media organizations like Breitbart and Fox News generally reinforced their audiences' tendency to seek out belief-confirming coverage, while more centrist and left-leaning news organizations tended to moderate that tendency by linking to a more ideologically diverse set of sources.

"[T]he issue is even though people are forming false beliefs, they probably want to form those false beliefs because these stories are essential to constructing their political identities," Yochai Benkler, a professor of law at Harvard Law School and one of the authors of the report, told me in a phone interview.

"Then it becomes much harder to say, 'Oh Facebook and Google know what's true and Breitbart and Fox News don't, and so Facebook should get in the way of Breitbart getting to it's users.' I am not comfortable living in a constitutional system where that is the recommendation."

There's no legal issue with Facebook and other platforms trying to stanch the spread of misinformation, since companies aren't constrained by the First Amendment. However, Benkler would be concerned by Facebook trying to constrain "communications between willing speakers and willing listeners." While partisan news sites and highly trafficked Facebook pages have proliferated over the past couple of years, Benkler sees the problem as one that's more fundamental to political conversation, particularly on the right. "That's not something that I think the American polity needs to outsource, or can afford to outsource, to Facebook and Google," he said.

"Yes they are the platforms on which it is happening," Beckler said. "But having them take a leading role in reversing the trend is highly problematic." This, Beckler said, is due to the scale of companies like Facebook and Google, as well as the fact that they're private entities without accountability to the broader public.