It's a struggle to keep video game communities civil when so much of games culture is toxic on its better days. Players should be able to seek out spaces where they can be accepted without fear of discrimination based on gender, sexuality, race, physical and mental ability, class, and more.
But increasingly, it is community managers who are responsible for keeping the peace online—for finding a balance between different kinds of gamers and pushing harmful behaviour out. With concerns about online abuse and harassment growing, community managers need to step back and reevaluate their practices, and it seems they're finally starting to figure out how.
The Community Management track was a new addition to this year's Game Developers Conference (GDC), held at the beginning of March in San Francisco. It kicked off the week of panels with a one-day summit on developing, maintaining, and growing video game communities.
I manage a community of mobile and social gamers and was eager to learn how I could make my community more fun while also moderating and preventing inappropriate or abusive behaviour. I expected I'd be chugging coffee to stay awake through buzzword-filled talks. But then something unexpected happened: community managers were addressing toxicity in games spaces head-on.
In a talk titled "Unconventional Wisdom on Creating a Passionate Community," Ubisoft community manager Gabriel Graziani addressed the trials and tribulations that come with managing the massive Assassin's Creed community, which currently comprises over 10 million fans on the brand's Facebook page alone. Graziani discussed the hours of hard work the game's fans had put in—from creating and editing YouTube Let's Play videos to designing immaculate cosplay—work that Ubisoft had featured on the page.
But soon Graziani's talk turned into what I would later describe to my peers as "stealth activism," where the basic rights and needs of traditionally marginalized groups are given a platform where they would otherwise be ignored.
Graziani said that paying fans with "exposure" troubled him, and told the story of video creator and YouTube personality Andrien Gbinigie. Gbinigie's Assassin's Creed gameplay videos led to a partnership with the multi-channel gaming and media network Machinima, and he was later hired to work with Graziani's team.
He also acknowledged that women who submitted photos of themselves dressed in cosplay were at a higher risk of receiving hateful and inappropriate comments from other community members. When professional cosplayer Jessica Nigri teamed up with Ubisoft for regular media appearances dressed as Assassin's Creed characters, Graziani responded by heavily moderating comments on cosplay submissions. It was his way of making sure fans knew that women were needed and respected in online spaces.
He ended the panel with a fantastic statement: "Don't let trolls dominate the narrative, defend your community not to win an argument but to set an example."
Be still, my feminist heart.
During another panel titled "Managing Communities within the Culture Wars," game developers Gordon Walton and Rich Vogel, alongside designer and author Raph Koster, tiptoed around "the controversy that shall not be named"—the infamous hashtag created last summer to harass, doxx, and threaten women developers and critics, causing a disturbing rift in the game industry.
Anonymity and pseudonymity—while crucial for many marginalized groups, who are often on the receiving end of online abuse—are also part of what makes harassment so accessible. It's as easy for a developer of sex-positive games to set up a Twitter account as it is for a person who wants to verbally abuse her from across the globe.
Though Twitter attempted to tackle abuse with improvements to its blocking feature in 2014, it has received ongoing criticism of its safety policies since, leading to its CEO Dick Costolo's admission that more must be done. On other social networks such as Reddit, anonymous users form mobs to downvote content they dislike—a common silencing tactic.
As a result, community managers have started to provide private, secure channels where users who are being threatened can file thorough reports, the panelists said. Another way to prevent users from threatening others, they said, is to incentivize and reward "good behaviour," and to use positive language when dealing with "bad behaviour"—rather than calling out and singling out the offending users.
Thankfully, social media services are taking small steps towards better dealing with, and even preventing, online abuse too. A week after this year's GDC, Twitter banned doxxing and revenge porn, following in the footsteps of Reddit's similar ban. And Facebook's decision not to add a "dislike" feature is a conscious choice to make downvoting attacks less accessible, according to Koster.
But the most difficult part of managing toxic spaces comes down to three words I often cringe at saying: hearing both sides. As an activist, I voice my concerns unabashedly when I see people being hurt, and that frequently requires singling abusers out. But as a community manager, I have to work with abusers to deter future harassment or threats, and encourage them to do better.
Balancing knee-jerk banning against productive coaxing is a genuine conflict. But when looking at these issues "dispassionately"—a term I was glad to hear the panelists use instead of "objectively"—clearer solutions emerge for changing toxic spaces for the better.
Nevertheless, talks such as these are always bittersweet. No one willingly wants to place themselves in the middle of abuse. But knowing that managers are thinking of ways to end abuse is a step in the right direction. For the victims who need online protection most, silence is too often the response.