Revenge Porn Is the Hydra of the Internet

More social networks are banning revenge porn, but it's not enough.

Mar 12 2015, 6:25pm

Photo: Markus Wi/Flickr

Twitter is taking a belated stand against one of the abusive uses of its social network. Last night, without an official announcement or fanfare, the company updated its rules for all Twitter users to include new language specifically prohibiting revenge porn—that is, Twitter will no longer allow "intimate photos or videos that were taken or distributed without the subject's consent."

Twitter's ban on revenge porn follows other recent efforts by the company to counter abuse, including updates making it easier and faster for users to report abusive tweets and accounts, even if they're not the victims of abuse themselves. It also comes just a few weeks after Reddit said it would begin taking down revenge porn, so long as the victims find out and email "contact@reddit.com."

So, is this a trend toward better policing of "involuntary porn" on social networks? Unfortunately, changing a website's content policies is only the first—and by far the easiest—part.

Like the rest of life, the internet is full of people treating other people heinously. But the semi-anonymity afforded by the internet has made abusive behavior seemingly more widespread and more unaccountable. As two large social media companies with stated commitments to free speech and anonymity, Twitter and Reddit have been hotbeds for this type of activity.

Many of the participants in the nonsensical misogynistic "GamerGate" movement that peaked last year took to Reddit and Twitter when attacking women and others who spoke out against sexism in video games and the video games industry, making graphic rape threats and death threats and posting names, home addresses and other personal information of the targets of their abuse.

Reddit and Twitter were also both used as primary platforms for disseminating dozens of nude photos of mostly female celebrities that had been hacked and stolen off mobile and online accounts, an event better-known as "Celebgate," or "The Fappening" (the latter a crude reference to "fap," Reddit slang for masturbation).

While Twitter has long prohibited "direct, specific threats of violence against others" in its rules, critics have complained that the company didn't do enough to patrol its platform, and Twitter CEO Dick Costolo even admitted "we suck at dealing with abuse" in a memo issued earlier this year. Reddit, for its part, previously took a more aggressively laissez-faire attitude toward the content and links that users posted on its forums, initially even resisting removing alleged child pornography material until stories about its r/jailbait community became mainstream news in America.


The most recent efforts by both social media companies to purge their flagship websites of revenge porn and other forms of abusive content may indeed be progress, but whether they work in practice ultimately comes down to enforcement.

As victims' advocates previously pointed out to Motherboard in the case of Reddit, the responsibility of flagging and reporting revenge porn has been left entirely up to the victims. Twitter appears to be following the exact same route, telling BuzzFeed: "We will ask a reporting user to verify that he or she is the individual in question in content alleged to be violating our policy and to confirm that the photo or video in question was posted without consent."

That means if someone posts nude images or video of you to either website without your permission, it's your job to A) find out about it somehow, B) contact Twitter or Reddit through their official channels and complain about it, and C) jump through whatever hoops they ask of you in order to prove you are who you say you are, and that you're depicted in the content.

Far from being proactive, the method for policing content on both sites is reactive: Twitter and Reddit have said they're going to act after an abuse has already been committed, by which point the damage may have already been done. Even if Twitter or Reddit removes revenge porn or other abusive content, someone else could have copied that content and posted it on other websites. There are even whole sites dedicated to archiving Reddit's posts so they can be accessed later if deleted on the main site.

More to the point, even if Twitter and Reddit successfully remove all the revenge porn posted by their users, that doesn't necessarily mean that content is actually off the internet. Twitter and Reddit are similar in that they consist primarily of text and links out to other websites where the actual content is hosted. Twitter has hosted user images for a couple of years now, and recently added the ability to post videos directly in tweets, but I can say anecdotally from my own use that most multimedia content on Twitter originates and appears to be hosted on outside sites.

Reddit doesn't allow any multimedia hosting, saying clearly in its FAQ: "reddit is not a content host. You therefore need to host the content outside of reddit, and then submit a link to that content. You'll notice that most images are hosted on imgur.com, but any image hosting service can be used."

In other words, even if Reddit or Twitter takes down a link to an offending image or video, the content will still live on at the hosting website.

And what of those hosting sites? Imgur itself has long had a policy very similar to Reddit's and Twitter's new ones, saying clearly in its terms of service: "If someone else might own the copyright to it, don't upload it," and "If you see anything on our site that shouldn't be there because it violates our policies or for any other reason, please let us know by emailing us at abuse@imgur.com." But the terms also absolve the site of any legal liability for harm: "Imgur has no duty to monitor any content on its site," they note.

Photobucket, another popular image hosting website, is much the same, with terms stating: "Photobucket may moderate Content. However, we are not responsible for what you have uploaded and we are under no obligation to modify or remove any inappropriate Content. We have provided some tools on the Site, such as links to 'flag as inappropriate,' so you can let us know if you find Content offensive." Both Imgur and Photobucket also allow private albums with unique share URLs, making it easy to hide content that you don't want getting out to the wider public. However, these private URLs can be, and often are, exposed through guesswork, and private nude images circulate far and wide on a variety of online forums and in comments threads.

Given these realities, Reddit's and Twitter's bans on revenge porn and other forms of online abuse seem like they will make life only slightly more difficult for perpetrators. It's laudable that both companies are making bigger strides to curb abusive content on their own websites, but their case-by-case approach seems like a "Whac-A-Mole" solution to the problem—taking down offending material only when a victim raises the alarm. But how else can they differentiate between consensual amateur porn and non-consensual content? Some of Reddit's supposedly consensual adult sections, the popular r/gonewild for instance, ask users to upload "verification" photos showing the date, but even that's an imperfect solution, since photos can be easily manipulated.

Short of banning all types of adult content—consensual or not—the case-by-case approach to policing content that Reddit and Twitter are taking may indeed be the best solution for now. After all, many other big social websites have struggled with how to go about both protecting user privacy and freedom of expression.

Facebook, for example, includes a blanket ban on hate speech, speech that incites violence, and all types of nudity and pornography, which it vigorously enforces through the use of algorithms and armies of human moderators. Its subsidiary Instagram has been notorious for its sporadic enforcement of an anti-nudity policy, specifically targeting women's bodies, which Chelsea Handler has protested as sexism. Google has a comprehensive list of ways to report abusive and offensive content on Google Plus, where all content that "contains nudity, graphic sex acts, or sexually explicit material" is prohibited. Still, a number of accounts on Google Plus contain content that would fit under that definition. Tumblr has long been the province of pornography blogs and explicit GIFs, with founder David Karp telling Stephen Colbert that adult content is "something we don't want to police." Yet Tumblr found itself embroiled in controversy after it was acquired by Yahoo and briefly made its adult-themed blogs unsearchable, before backtracking and regrouping them all under the label "NSFW."

All of that is to say that social media companies must walk a fine line when it comes to patrolling user content: too much enforcement reads as censorship and too little leads to abuse. But at least now, more of them are trying.