How We Can Make Social Media Less Blindingly Awful in 2017

We have the tools to make our online public spaces better, but will we use them?

I don't think there's a better way to understand the internet than to read Balk's Laws, a set of depressingly accurate pronouncements from the Awl's resident misanthrope Alex Balk. One is, "The worst thing is knowing what everyone thinks about anything." Another: "If you think the internet is terrible now, just wait a while."

To be on social media right now is to experience deep waves of cynicism and pessimism. Lindy West, a prominent feminist writer and heavy Twitter user, quit the platform this week, calling it unusable "for anyone but trolls, robots, and dictators." President-Elect Donald Trump last month tweeted things about a union leader that resulted in harassment from third parties, leading to some serious questions about whether Twitter should just ban its most famous user. Facebook is filled with fake news stories and hoaxes, and its latest feature, Facebook Live, most recently captured a horrific alleged hate crime. Nearly everything you read on social media—down to even a "liberal tears" mug that was promoted all over Twitter last month—is somehow a lie.

All this is on top of the everyday badness that usually goes unremarked upon because we're so used to it: the hateful invective that especially swamps people of color and women who have opinions in public, the way social media acts as an opinion-affirming cocoon where likeminded people tell one another that they're right and the other side is wrong, the sheer pointlessness of 99.9 percent of what gets tossed out in those little text boxes. 

It's enough to make you go offline, but not really (lol) because we are all addicted to the tiny little endorphin spikes of affirmation and interaction. So instead of logging off, I emailed Adrienne Russell, an associate professor at the University of Denver who specializes in how journalism and activism have evolved online, and talked with her about how toxic the relationship between the media and the public has become, and why she's optimistic despite everything.

VICE: There has been some talk about how Twitter should ban Trump because he is encouraging abuse or spreading hate speech. What do you think of that argument—is it silly on its face, considering he's about to be president?
Adrienne Russell: I don't think there is anything silly about this argument, but it is complicated. Twitter should hold all of its users, including Trump, to the same standards of conduct. Facebook should do the same. Trump's Facebook posts about banning Muslims from the US, among others, clearly violate Facebook rules against hate speech. Mark Zuckerberg claims that, because such speech is coming from a major political candidate, it is mainstream political discourse and thus acceptable. This is a morally bankrupt approach to running the platform that hosts the world's largest exchange of ideas and information. But this raises the very good question of whether we want Twitter and Facebook defining what can and cannot be said. Hardline freedom of speech advocates say absolutely not and that this could be more of a threat to public discourse than Trump and others spewing hate speech and misinformation online. 


Who do you think bears responsibility for fact-checking false statements made by politicians (a la Trump) or organizations? Should Twitter and Facebook work to make sure that prominent posts are, if not "true," then not straight-up misinformation?
I think it's the responsibility of journalists to fact-check statements made by politicians, but fake news is about more than lying politicians. I think social media platforms have the responsibility to curb the spread of fake news, and users have the responsibility to develop the skills to be able to identify fake news, so they are not fooled into believing and sharing false information. I'm sure you've seen the recent Pew survey that found nearly a quarter of Americans say they have shared a fake news story.

And I should say, too, that although it is true that social media platforms speed the circulation of rumors and lies, they did not create cynical publics who have so little faith in media and government that they make no distinction between journalism and fake news or between legitimate political leaders and populist opportunists. Treating the rise of Trump as the result of technological change ignores decades of conservative movement efforts to undermine trust in government, journalism, science, and "facts" more generally. We should be thinking about that whenever we're talking about politics and the media.

If Twitter et al do take a hard line on "hate speech," a la banning far-right hatemongers like Richard Spencer and Milo Yiannopoulos and their followers, does that run the risk of just partitioning social media so the "alt-right" congregate on their own sites and there's no properly public space on the web, just competing ideological clubhouses?
I think the competing ideological clubhouses you are describing already exist on Facebook, and perhaps to a lesser extent on Twitter. Facebook algorithms, for example, create ideological bubbles, and this (as we have heard over and over again) is part of why the left was so shocked first at Trump's rise and then at his victory. But, yes, what you describe sounds plausible. If white supremacists get kicked off social media platforms, they will probably start their own platform (and already have, if you count certain subreddits). I don't think we can or should give up on the idea of an online public space where productive political discourse can take place. I think we do have the tools to facilitate such a space, even if we don't have the skills to populate it.

If you were in charge of a Facebook-like platform, call it Russellpedia or something, what guidelines would you put in place to facilitate the sharing of ideas while making sure abuse and harassment weren't widespread? Would you attempt to police hate speech?
"Russellpedia"? Ha. I think not! 

I'm no entrepreneur, media executive, or free-speech attorney, but I guess I just frankly don't think it's that difficult to combat abuse and harassment and hate speech. You just have to work at doing it and doing it well, right? There are large gray areas, but it's not an impossible task. It's an ongoing task. And don't we do that to some degree in public spaces all the time? Media outlets have learned how to combat bad communication behavior in comments threads. It's not easy. It takes commitment and the willingness to pay people to work at it. But if you're Facebook, why can't you prioritize that and pay well-trained employees and lawyers to work at it, and take on the responsibility and whatever liability comes with it? To me, it just seems like one of the costs of doing business as the world's largest media platform. 

If we don't have the skills or tools to create a public space online where political discourse is productive, what are those skills and tools? Is it a matter of technology, or is it more a matter of people learning how to speak to one another? Or to put it another way, is there any way to make the public less cynical?
The public is cynical because it has been taught to be cynical. We learn our politics overwhelmingly through the news and talk-radio media. So what is the aim of the political information most Americans are consuming? Is it to serve the public interest? Is it to move people to act and work together as citizens? Is our news, on the most basic level, about encouraging shared responsibility, or an obligation to work toward empathy in the service of greater justice, genuine liberty, and more equal opportunity? After the years-long election season we have all just endured, it could not be clearer that TV news is driven by profit, and that the product it creates is shaped very specifically by that motive. By the measure of how well it serves the public interest, including how well it informs the citizenry, it is an inadequate, even toxic product.


So what if the operating instructions for our media were different? The US will never see true public media on the scale of the BBC, but we might use the internet in new ways. What if there were a space for journalism where reporters really were rewarded for being activists on behalf of the truth—paid reporters but also all those social media users who want to fight propaganda and disinformation? That's an antidote in my opinion to cynicism. I think it's a sign of hope, for example, that young people are turned off by Facebook. I think it's hopeful that an influential youth movement is driven by DIY culture and aesthetics. Isn't it really way past time that the hive-mind created a public information space that isn't driven by profit? 

I think we do have the skills and tools to create a public space that fosters civil and productive debate through reliable, accurate, easily accessible information—in fact, I think that's true now more than ever before in history! I know that sounds absurd given the context. But we have the skills and tools today; we just lack the motivation. Indeed, powerful forces are driving us away from producing that kind of space, and that is what we have to battle. It's not about inventing some new magical platform. That's already been done. The tools are incredible, and they're all around us, advancing every day. We have to focus on battling the forces that promote ignorance and apathy.

I want to underline that I think there are reasons for a lot of hope. The activist groups I have studied for 20 years have used media in innovative and democratic ways to promote media literacy and civic literacy. That's been at the heart of the missions, the causes, they have taken up. I have seen that these groups embrace and promote a much larger range of values and identities and ideas about social relations than the range being promoted by corporate media. Education scholar Henry Giroux is great on this stuff. He says we have to work at unlearning the experience of rudeness and spectatorship and the wafer-thin approach to public life and social connection that is dominating American life today. I think that's obviously true. 

Facebook isn't really the problem. It's how we've learned to use it, or maybe how we are rewarded for using it in certain ways instead of others. Any medium is only as good as the people using it; their skills make it good or bad. Fox News watchers are really bad news consumers. They've learned some terribly retrograde media habits, but media habits and use are based on other skills, other literacies. Still, beware: your social media feed is, on some level, teaching you how to use media, and your literacies are only ever increasing or decreasing, never standing still!

This interview has been edited and condensed for the sake of clarity.

Follow Harry Cheadle on Twitter.