The ACLU Is Fighting to Keep Revenge Porn Safe and Legal for Pervs

Your vagina has less of a right to privacy than your credit card numbers or medical records. That's objectively fucked-up.

Photo via Wikimedia Commons

A federal bill that would criminalize revenge porn is currently on the table, with Representative Jackie Speier (D-CA) likely set to introduce it by the year's end. But in order to make it through the crucible of free-speech and liability concerns—the ACLU and other free-speech crusaders will fight the bill tooth and nail—the law will have to be both cautious and very, very specific.

As it stands, 15 states have laws on the books that criminalize nonconsensual porn—an emerging category of material that includes hacked selfies as well as consensually shot nudes later distributed without the subject's consent. But most of those laws have what the End Revenge Porn campaign refers to as "narrow applicability and/or constitutional infirmities."

That's basically what's happening with two laws—one criminal, one civil—in California right now.

On September 30, Governor Jerry Brown signed two important measures that would ostensibly protect people from being victimized by photo thieves looking to turn other people's privates into everyone else's business. The first thing Brown did was add selfies to a state penal code section (colloquially referred to as the Revenge Porn 2.0 Act) introduced last year by Senator Anthony Cannella (R-Ceres), criminalizing revenge porn regardless of who created the image. (The fact that California's criminal revenge-porn law was lacking such an obvious inclusion and needed amending only a year into its existence is indicative of how quickly nonconsensual porn is outpacing attempts to legislate.)

The same day that Brown amended California's criminal law, he also signed the first civil revenge-porn law in the US. It has its pluses and minuses.

The pros:
A plaintiff can use a pseudonym in court and have court records redacted as well, protecting privacy. Another good thing is its inclusion of oral sex. Most revenge-porn and sex-surveillance laws only cover photos in which the victim is nude or partially nude, which wouldn't do shit to stop your ex-boyfriend from disseminating a picture of you blowing him if you're wearing a shirt. But this new law includes sex acts with or without nudity.

The cons:
The law contains some weird language that effectively strips it of any potential to protect anyone. For example, it says that there's no liability for sharing sexualized photos under certain circumstances, like if the photo is a matter of "public concern," if it was shot in a public place where the subject had "no reasonable expectation of privacy," or if the photo was previously distributed by another person. That would mean that the law doesn't cover hacked celebrity nudes, creepshots, or sexts. So it's like: Then what's the fucking point?

The latter is especially problematic because it opens the door to legal arguments that anyone who sexted a photo to a partner was the "original distributor"—even if they never intended anyone else to see it. And it utterly erases the concept of downstream liability just where victims' advocates say it is most needed. After all, a revenge-porn victim can sue her ex-boyfriend all she wants, but that does nothing about the thousands of other people sharing and reposting the offending images.

When is a nude photo a "matter of public concern"? When it's of a celebrity, like Jennifer Lawrence? And adding an exception about photographs taken in public invites a quagmire of excuses for upskirting, since the "reasonable expectation of privacy" loophole has already been cited to exonerate cell lurkers like creepshot photographer Christopher Cleveland, whose case was thrown out by frustrated prosecutors in DC in October.

While advocates applauded the move to target nonconsensual porn in both criminal and civil court—thus expanding victims' legal options—it quickly became clear that the civil law, too, is flawed. Almost to the point of impotence.

"The language kind of guts the law of its utility," internet privacy attorney Carrie Goldberg told VICE. Goldberg said that the liability exception regarding matters "of public concern" was likely aimed at things like the Abu Ghraib torture photos, protecting journalists who publish them as evidence.

"Any good lawyer for the [revenge porn] defendant is absolutely going to argue that it's a matter of public concern, but let's hope that's a losing argument," Goldberg said, before adding, "It would depend on the judge for sure."

Why include liability exceptions in a law designed to make "fappeners" and other pervs criminally liable? Because: free speech.

David Greene, senior staff attorney at the Electronic Frontier Foundation, told VICE that such exceptions are required to make any law restricting free speech constitutional.

"Under First Amendment law, someone who publishes truthful information that is a matter of public concern must be protected," Greene told VICE. "I'm always very skeptical of laws that restrict free speech. That's regardless of whether you have a really good reason to restrict that speech."

But the US has plenty of laws that do restrict speech and protect private information. Federal HIPAA rules protect people's sensitive medical information from being leaked by medical professionals, and FERPA protects the privacy of educational records. The list of laws protecting consumer credit information is so long it's almost ridiculous.

So why does the spate of emerging revenge-porn laws keep colliding with concerns about free speech?

"It's absurd that there are not laws that protect us from having [pictures of] our genitals released," Goldberg told VICE. "Besides HIPAA and credit card laws, there are also laws against obscenity and hate speech, and laws against sexual harassment. [Those] were opposed in the beginning."

Even child pornography laws have traditionally been opposed by free speech advocates.

In 2002, the ACLU released a list of online privacy-related state bills it was actively opposing in court; those included an Illinois bill criminalizing the act of posting a child's name and contact information on a porn site, a Rhode Island bill that banned online transmission of a child's information for the purpose of engaging in "unlawful sexual conduct," and a multitude of laws that criminalized the electronic transmission of child pornography.

Why would anyone oppose child-porn laws designed to protect kids from predators? Well, for starters, some of those laws contained hidden sections and attachments that expanded the criminalization to vague crimes like "obscene speech" or held internet service providers directly liable for things like the "dissemination of obscene material" and "pornographic material harmful to youth." Such slight adjustments in language made some laws applicable to a wide swath of electronic communications that could have nothing to do with children.

It should be stated that various ACLU chapters have taken a stance against child pornography as well as revenge porn. But the organization's primary mandate is to protect free speech and civil liberties; if a child porn or revenge porn law even slightly oversteps into unrelated territory or threatens the privacy of online communications in general, ACLU lawyers can be expected to attack it.

Christopher Soghoian, principal technologist at the ACLU's Speech, Privacy and Technology project, told VICE that he couldn't comment on the laws and legislation regarding revenge porn. But from a technical perspective, he stressed the importance of keeping the government and third-party internet service providers from having policing power over the way we talk, post, and click online.

"If you want Google and Twitter to police what people are posting, you're creating the infrastructure for a surveillance state," Soghoian told VICE. "Once Google has the ability to recognize and remove any image on the internet, these are not the only requests that they receive. You're going to have the government coming along and saying that they want something removed."

Besides, explained Soghoian, it's technically impossible for companies like Twitter and Google to create the kind of automatic takedown mechanisms of which revenge porn victims' advocates dream. Such an ideal system might be able to recognize copies of an illegal image and instantly block them. But for programmers, it's not so simple.

"There's no magic way to prevent a particular photo from being subtly altered and reposted: You can slightly change the color of the image and it won't be picked up," Soghoian said. "There are some companies that have employed technical tools that look for similarities in content. But it's so easy to circumvent that it isn't effective. There are tools that scan email attachments looking for child porn, but it's sometimes as simple as placing it in a zip file and it will not be detected."
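The gap Soghoian describes can be illustrated with a toy sketch (hypothetical code, not any company's actual matching system). An exact cryptographic fingerprint breaks under the slightest change to an image, while even a crude perceptual "average hash" survives a small brightness tweak—though, as he notes, heavier edits like cropping or recompression defeat these tools too. The eight-value "image" below is a stand-in for real pixel data:

```python
import hashlib

def crypto_hash(pixels):
    # Exact-match fingerprint: changing any pixel at all yields a
    # completely different digest, so reposts evade detection trivially.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Crude perceptual fingerprint: one bit per pixel, set when the pixel
    # is brighter than the image's mean. Small uniform tweaks shift the
    # mean and the pixels together, so most bits stay the same.
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Tiny stand-in for a grayscale image (values 0-255).
original = [200, 190, 180, 40, 30, 20, 210, 25]
altered = [p + 2 for p in original]  # subtle brightness shift, visually identical

print(crypto_hash(original) == crypto_hash(altered))    # False: exact match fails
print(average_hash(original) == average_hash(altered))  # True: perceptual match survives
```

Real systems use far more robust perceptual hashes than this, but the underlying trade-off is the same: the fuzzier the match, the more false positives—and determined reposters can still alter images enough to slip through.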

Soghoian has suggested ways that third parties could strengthen security and privacy features, such as instituting a private photo mode similar to the private-browsing modes that ship with web browsers.

But between the technical difficulties inherent in policing online material, and in light of the potential affronts to civil liberties that arise when such policing is employed, it seems that the best way to shut down the whole nonconsensual porn thing is to legislate—and carefully.

Follow Mary Emily O'Hara on Twitter.