How We Can Stamp Out Trolls and Make the Internet a Happier, Safer Place
To some unfortunate users, the internet is a minefield of harassment and hatred. But there are steps we can take to make it a lot friendlier.
"I feel pretty lucky that the worst I've encountered has been, 'You're a fat, unfunny cunt and you have bad hair.' It stings, but I've been able to move on. However, I'm a white, cis woman so I've got a lot of privilege in these spaces in terms of how I'm viewed and how people choose to speak to me."
Christina McDermott is head of social for the BPP Group, and a freelance writer. She enjoys video games, and regularly uses social media, making her just like millions of women in the world, right now, in both respects. But almost exclusively because she is a woman, she has been on the receiving end of what she considers, incredibly, merely mild abuse.
"I always feel on edge whenever I have a piece published. It ties into a lot of my own anxiety issues: 'What if someone disagrees with this? What will they do or say to me? Will they find an old tweet of mine where I've said something stupid and use it to discredit my point?' I know that I'm not the only woman to feel like this, and I feel exceptionally lucky that I have a great support network on the social spaces I am on to help out if it all gets too much."
That Christina feels she has the necessary support to get her through periods when "it all gets too much" is great. But surely, by now – so many years since the first email was sent (officially in 1971), and with social media so completely invading our lives that people now buy mobile phones based on their compatibility with the most popular social networks rather than the basics of being able to make a call – we should have reached a point where communication through technology isn't marred by regular stories of discrimination, harassment, bullying, or worse.
You needn't Google far to find one: here's a recent report from local press in Hull confirming that the local police are struggling to deal with "cyber sex pests". And when IGN's Lucy O'Brien posted her review of Uncharted 4: A Thief's End on May 5, scoring the game 8.8/10 (a very reasonable result, and very far from a negative mark), her mentions on Twitter, and those directed at IGN, immediately became batshit insane.
People pulled out the classic "feminazi" calls, and inundated her with what is basically a skipload of sexist bullcrap, because she gave a (marginally) higher score to Rise of the Tomb Raider, a video game featuring a female lead. The (over 10,000) comments on the piece itself have been heavily moderated – a good thing – but even now there are some that are entirely spiteful. Case in point: "Lucy O Brien is just an attention wh*re, this is not the first time she underscores excellent games for attention. She probably touches herself to all of this at the moment... and yes, she doesn't shave her armpits." Good one, "SausagePiano". Some incredibly boring dork even set up one of those online petitions to encourage IGN to never let her review a PlayStation game again. It's bombing, naturally, but what's the major malfunction of anyone who thinks that setting up such a thing is A Good Idea?
"Women are constantly having their lived experiences and traumas minimized on social media because hate speech is 'just a joke' or 'harmless trolling'," is Christina's view of the social media landscape, today; one borne out by what's happening to O'Brien. "But imagine having to deal with that day after day." With that in mind, I spoke to Dr Ellen Helsper, of the London School of Economics and Political Science. She's long researched new media audiences, conducting studies into digital inclusivity and penning a 2007 thesis titled "Internet use amongst teenagers: Social inclusion, self-confidence and group identity". I wanted to know how she feels we can improve this sorry state of affairs, where bullying would appear to be rampant across online communication and social media, and what the next moves are in order to make the internet a happier place for everyone.
"One thing that's clear, to begin with, is that we cannot see online and offline lives as two separate things – these things are now mashed," she tells me. "Hurting someone online has the same physiological effects, like stress, as 'real-life' bullying. Seeing a partner send a sext can be every bit as hurtful as seeing them flirt with someone else in a bar. It's not meaningless, because the separation between our online and offline lives is so hard to make, because it's just not there."
"Women tend to not act antisocially, whereas men do use online platforms in a more antisocial manner." Dr Ellen Helsper
Dr Helsper proceeds to tell me that it's rare for internet users to adopt a wholly different persona online to how they act in a physical context, and also that "most of the bullying we see online is actually by people that the victims see in their offline lives". She adds, though, that this transparency of identity is platform dependent, and also that men and women generally act differently when afforded a degree of online anonymity. "Women tend to not act antisocially, whereas men do use online platforms in a more antisocial manner. And that's in the sense of bad behaviour and bullying, trolling, things like that, but also in the sense of using those platforms to stand up for rights that they would otherwise be unable to stand up for. 'Going against the system', is the way I'd put it."
Dr Martin Graff, head of research in psychology at the University of South Wales, has conducted studies in the field of online interaction, including looking at how relationships both form and dissolve in digital spaces. When I ask him about the root causes of the online toxicity we see reported on, the harassment and the misogyny, he replies that "it's suggested that people are less inhibited online", and in terms of abuse received through online gaming, adds: "Some computer-based games can have the effect of desensitising people in terms of what they see as aggressive. They might also create the impression that the world is a more aggressive place than it actually is." Interacting with strangers on the other side of the world through an online video game is just as fraught with the risk of receiving abuse as posting a could-be-controversial tweet; miss a chance in Rocket League and, my god, the reactions from teammates and opposition alike can be caustic, to put it politely.
Dr Graff calls Twitter et al, text-based communication, "lean media", and says that it's when using these channels that people become less careful with the things they say – which isn't the case when speaking to someone in person, using "richer communication". To return to Dr Helsper's point about our online and offline lives being fully intertwined at this point, she sees the conveying of consequences as essential to reducing antisocial, antagonistic behaviour on the internet.
"(Some computer games) might also create the impression that the world is a more aggressive place than it actually is." Dr Martin Graff
"Trolls, generally, are not hiding – they just somehow feel their actions are acceptable," she says. "Until someone really confronts them with the consequences, or brings what they're doing closer to their own experiences, or their own friends and family, they won't change.
"A good intervention programme in bullying, at work or at school or any organisation, is never just a case of going up to the bully and telling them to shut up. It's sitting together and explaining: this is what it does to the victim. In relation to the online environment, there's been a massive shift where when bullying occurs, because it's often people you know – or think you know – there needs to be this really clear identifying of who the intermediary should be, and how they can intervene.
"If it's a kid at school, bullying you through WhatsApp and not the playground, schools are now set up, in the UK at least, to pick up on that, as well as parents. WhatsApp can't do much about it themselves because they do not have the context. They can't look at a message and know that it's bullying, because a lot of what we say to one another depends on the relationship we have, and without that context you can't interpret it. So Facebook has been trying to do something along those lines. If you report something as being harassment, or bullying, they give you the option of getting someone involved to mediate. From what we know of traditional bullying and harassment, that's what works. If it's clearly illegal – like child pornography, grooming, human trafficking – then the platform has to jump in. That means there needs to be an outside intervention. But most of the harassment that people are worried about is not actually illegal, not something you can be jailed for. When words are hurtful, the best solution is to sit people together, and bring some context – although that's not always possible."
But speaking out against abuse collectively is a sure-fire step in the right direction. Dr Helsper has observed that the most vulnerable offline are also those less equipped to handle abuse when online. "If you're more psychologically vulnerable than the average, and you get picked on, then that's going to have a bigger effect on you," she says. "You might not have the tools, the energy, the capacity to counter that. And then those things have quite heavy consequences.
"The people who are most vulnerable are also the least likely to have these support networks, to have an environment in which they can go to someone and say: 'This is happening to me, I don't like it, what can I do?' If you take the example of a kid who's being bullied for some reason, it's usually by someone more popular, stronger, or who the victim would like to be like. And it's really hard for them to go to their parents, or to the school, because you don't want to be the kid who's not liked by the other kids. You want to be popular. And that's the same with adults. There are people who don't have a support network of friends, anyone who can come and help them. When you see bullying and harassment in organisations, through social media, unless they're set up for that kind of behaviour to be reported and dealt with, then those vulnerable people are going to continue suffering."
"If people feel that something inappropriate is happening, or something harmful is being said about someone, but they don't react, that is a silent endorsement." Dr Ellen Helsper
The solution here is to avoid what Dr Helsper calls "the spiral of silence", where "certain voices that don't comply to the general norm [in an online space] get silenced". She continues: "If people feel that something inappropriate is happening, or something harmful is being said about someone, but they don't react, that is a silent endorsement. And it might be that nobody reacts because they worry about receiving the same treatment that they're acknowledging as wrong, even though it might be that most people on that platform also feel the same way. And if people would speak up, then the trolls will get pushed off those platforms."
Day in, day out, I see this kind of collective, proactive dismantling of inappropriate interactions on Twitter – to go back to the Lucy O'Brien petition, I've seen several people with sizeable Twitter followings post about how it's entirely ludicrous, and how the person who started it should, maybe, not have done so. But that has to happen more often, and not just on a platform like Twitter, but messageboards both large and small, across the internet. Simply deleting comments is not effective in the long term, says Dr Helsper, adding that it's better to "collectively create a different atmosphere", although that can be difficult when patterns of interactions are long established on particular platforms. "When people first come into a discussion forum, they try to figure out what the norms are, behaviour and etiquette wise. If interactions begin with negative behaviour, which is not countered by a moderator or the existing community, that gets taken as a kind of silent approval."
Online literacy therefore becomes incredibly important, in order to stamp out that negativity before it flourishes – and it's a kind of training that has to begin young. "We've focused a lot on teaching people the practical stuff – how to use these buttons, how to install drivers, how to make a spreadsheet," Dr Helsper says. "And that includes safety settings, and privacy settings. But now the big shift in the area I work in, in a professional and educational context, is about incorporating what we call social and content creation literacy. And that's a form of 'netiquette', essentially. How do we interact when we're using all these different platforms? What is appropriate?
"Before, it was just: here is a keyboard, this is how it works, and this is how you scan for a virus. In social media, we know how to block people, but we need better social skills than that. You can get carried away by the fact that technology changes so rapidly; but really, these skills are ones that we've always had to learn, in interacting with others. Now there's this added digital layer – but it's not like it was okay before to call someone something very nasty. Now with technology, somehow all of a sudden that's okay? So we need to make that link between the offline and online when training people, and explain this digital world. These things are not disconnected – what you do online has consequences offline."
It could be that, 20 years from now, when today's kids are grown up and running the show on the internet, and digital communication has progressed through what is still something of a teething stage, we'll all be able to enjoy a happier, friendlier social media landscape. But what's vital in achieving that is acknowledging that technology isn't something that happens alongside everything else in our lives – it's completely absorbed into them, and correspondingly envelops so much of our time, money, attention and emotions. "If we leave the digital stuff as a separate space, then this generation will not grow up to properly conduct themselves online," says Dr Helsper. "You need to know the meaning of things – so this needs to be explicitly talked about, and it shouldn't be assumed that things will change with the next generation, that we just have to wait until all these old people die, and then everything will be okay. I've heard that said a few times, and it's not as simple as that."
"I don't think that banning comments sections or prosecuting abusive tweeters is the answer to the issues that we're facing with social media and online spaces today," Christina tells me, echoing both Dr Helsper's thoughts and my own. "It's as though these platforms are entering adolescence, and we're trying to control them in ways which won't benefit these spaces or just make people more wary of using them. But this requires change from everyone involved – users and developers alike. Otherwise, we're just going to keep making the same mistakes."