A few days after the election—in remarks that he now almost certainly regrets, because they get cited constantly to illustrate his naivete—Mark Zuckerberg characterized the idea that Facebook was responsible for Trump’s election as “pretty crazy.” It was not that crazy. In short order, it became clear that Facebook had an incalculably large mess on its hands. Facebook’s multifaceted automated systems for sorting, parsing, and recommending content to each individual user had been abused to flood the platform with unreliable information and distorted, emotionally slanted news content. Content farms, like the ones run by teenagers in Macedonia, expertly worked over Facebook’s algorithmic preferences to build up large, loyal audiences who consumed articles about events that had never happened, or that had not happened the way the articles described. Crucially, there was, and is, a far larger audience for these stories on the right side of the political spectrum than on the left.
At the heart of the problem, anyway, is not the motivations of the hoaxers but the structure of social media itself. Tens of millions of people, invigorated by insurgent outsider candidates and anger at perceived political enemies, were served up or shared emotionally charged news stories about the candidates, because Facebook’s sorting algorithm understood from experience that they were seeking such stories. Many of those stories were lies, or “parodies,” but their appearance and placement in a news feed were no different from those of any publisher with a commitment to, you know, not lying.
Elsewhere, Facebook has dragged its heels on meaningful platform changes, stalled by a persistent fear of conservative backlash. The Washington Post reported earlier this year that, in the weeks following the election, Facebook launched an effort to root out propaganda on its platform. The effort hit a snag when Joel Kaplan, who runs Facebook’s Washington, D.C. office and served in the second Bush administration, opposed it because it would “disproportionately affect conservatives.” This theme comes up repeatedly in recent reporting on Facebook policy decisions. In May, the Wall Street Journal reported that a 2018 internal report found the platform exacerbated polarization, but that efforts to mitigate it had been shelved over concerns, expressed by Kaplan and others, that the efforts would be seen as biased against conservatives.
Years before Facebook’s creation, conservative media outlets and pundits laid the groundwork that makes Facebook so effective.
Around the same time, in the late ’80s, following the elimination of the Fairness Doctrine, Rush Limbaugh skyrocketed to fame. “Rush Limbaugh doesn't write up a script, and then just read it straight up. He might have an outline of what he wants to talk about that day, but he riffs. And then he'll get callers that call in and ask him questions, and he'll banter back and forth with them,” Bauer notes. “There's something about that banter, and that being able to speak off the cuff, that lends itself to a sense of authenticity, of credibility, that is rooted in the values and nature of the person himself. Not in that person's knowledge or ability to be an expert in something.” In other words, the mere act of being able to speak about these issues at all, at length, regardless of accuracy, is seen as a strength of right-wing hosts like Limbaugh.

Now, take a second to look at the vocabulary we’ve been using to discuss the changes in the news media over the second half of the 20th century. As conservative media grew, to get around “gatekeepers,” it brought with it content that was “emotional” and “authentic.” News anchors brought their “personal” experiences into it.

Eerily similar language is used by social media executives to describe their own platforms. In 2016, for the launch of Facebook’s live-video product, Mark Zuckerberg told BuzzFeed that “We built this big technology platform so we can go and support whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on” (emphasis added). Facebook demands authenticity, with policies as high-level as requiring that users provide their real names and a photo. It roots out “coordinated inauthentic behavior.” It has stood behind its targeted-ad product, claiming it as a boon for small businesses that otherwise might not be able to afford ad campaigns or might be kept out by gatekeepers.
The content that goes the most viral on Facebook is not intellectually stimulating but emotionally stimulating—content designed to get you to share, whether it be a heartwarming video or a rage-inducing hot take. And like conservative media, which cast itself as an insurgent battling established, supposedly liberal outlets, Facebook is a similarly styled insurgent, taking on old-media standards of quality, format, and distribution.
The only type of content that seems to survive each algorithm shift is conservative media
UPenn professor and media historian Brian Rosenwald has spent a lot of time studying Limbaugh. The radio host, according to Rosenwald, “has developed what I call like a plausible deniability style, where you'll notice the people jump on him, and he'll say, ‘Well, I never said that.’ And so you go back and look at the transcript, and he didn't actually ever say it. He said, ‘Well, someone handed me this story’ or ‘I got this from this newsletter.’ And he's just passing it along. He's not saying he believes it, necessarily. But he's sharing something.” Facebook lets everyone be their own Rush Limbaugh, 24 hours a day, seven days a week, with next to no oversight.

It’s vitally important to sketch out these links and similarities between conservative media’s style, aesthetic, and rhetoric and what Facebook wants from its users, because Facebook itself is unwilling to do so. Part of the power of right-wing media comes from its effectiveness at “working the refs”—invoking technicalities to push arguments into the mainstream or skirt censure—whether those refs are the established news networks of old or the dominant social networks of the new millennium. This has proven particularly useful on Facebook. Working the refs requires understanding the rules, understanding where the line is, and sprinting up to it without going over. It also helps to have powerful allies in Congress, ones who can credibly threaten regulation while deploying similar strategies. For all the worries about outright fake news and misinformation, a lot of conservative media, across a variety of formats, operates in a vague “just asking questions” mode that falls just short of defamation and offers plausible deniability.
This approach, developed over the last 70 years, has effectively neutered any action Facebook might take to improve the overall experience on its platform—to make its information more reliable and its interactions less contentious.

A common tech maxim goes, “It’s not a bug, it’s a feature.” When Facebook concludes that certain platform changes, like ones reducing the reach of conspiracy theories and misinformation, might “disproportionately” affect conservatives, that’s not a bug. It’s an indication that one political party—the right wing—traffics more in these formats and styles. The problems Facebook is grappling with are not unique to Facebook: since the 1950s, conservative media has been very good at circumventing or infiltrating politically neutral territory and using it to expand its influence. Facebook is using poorly paid, overworked content moderators to try to combat seven decades of media patterns in a politically neutral way. It’s a fool’s errand, and if Zuckerberg ever cops to it, he will have done so far too late.

Rosenwald offered up this analogy for Facebook’s fixation on party-neutral enforcement: “It's like referees in a football game saying they want to call the same number of penalties on each team, and before the game, they decided they're going to do an experiment: they tell one team, ‘We're going to call the same number of penalties on these two teams.’ The team they let know is then incentivized to commit penalties on every play, because they know they're not going to get called for more penalties.”

“So much of what the tech industry is based upon is that they're kind of like neutral technocrats,” Bauer said. “That's just not the way out of this particular crisis.”

“This is the problem with this cyber-utopian or tech-utopian rhetoric that emerges from Silicon Valley, and its emphasis on the conduits of communication—the infrastructure, the technical—is it's always a technical problem,” Peck added.
“But what that does is it dismisses questions of culture, and specifically political culture.”

“Most of the discourse and arguments around things like Facebook don't take into account that we tend to point to these issues in our politics as though they are technological problems when they are social and political problems,” Hemmer says, “that not only aren't caused by these new technologies, but can't be fixed with technological solutions.”

The first step toward a workable solution, then, is to acknowledge the political reality—and the accumulated years of knowledge and experience that would easily identify the underlying meaning of certain platform expressions—and to justify changes to how Facebook addresses partisan media accordingly. For a company so focused on examining trends, this is one that it has willfully ignored.