

The Supreme Court Heard Its First Social Media Harassment Case

On Monday the Supreme Court heard arguments in a case that could define whether or not online threats are taken seriously by the legal system.

Image by Cei Willis

On Monday the Supreme Court heard oral arguments in a case that could define whether or not online threats are taken seriously by the legal system. The case is Elonis v. United States, and the impending SCOTUS decision could set the bar for future prosecutions of abusive partners and Gamergate trolls alike.

According to court papers, after Anthony Elonis's wife and two children left him in 2010, the Pennsylvania man began posting violent threats about her on Facebook. Elonis was no angel before that: After sexually harassing two of his female coworkers, he went on to post a Facebook photo of himself holding a knife to a female coworker's throat (taken at a Halloween event) with the caption "I wish." He was fired the next day.


After being fired, Elonis posted threats aimed at his former employer:

"Ya'll think it's too dark and foggy to secure your facility from a man as mad as me. You see, even without a paycheck I'm still the main attraction. Whoever thought the Halloween haunt could be so fucking scary?"

But it was the threats aimed at his wife that really hit home.

If I only knew then what I know now, I would have smothered your ass with a pillow, dumped your body in the back seat, dropped you off in Toad Creek, and made it look like a rape and murder.

After Elonis posted the above on his Facebook page, his ex-wife obtained a state Protection From Abuse (PFA) order in November 2010.

Tara Elonis later testified at trial that she was "very afraid for myself and my children's lives." She also testified that her ex-husband never listened to, or wrote, rap music, to clear up the puzzling lyrical way in which his crazy Facebook posts were written.

After making several more threats against his ex, Elonis went on a Facebook rampage of sorts, posting threats against everyone from police to FBI agents and saying he would carry out a school shooting at a nearby kindergarten.

The FBI had been regularly monitoring Elonis's online activity by this time, and he was arrested in December 2010 and indicted by a grand jury on five counts of making threats via interstate communications: against his former employer, his ex-wife, the police and sheriff's departments, a kindergarten class, and, finally, a female FBI agent who had questioned him at his home.


Elonis refused to back down, challenging his conviction all the way to the Supreme Court on the basis of "free speech."

Now free speech advocates are piling onto the case, worried that a SCOTUS ruling on the intent behind Elonis's threats could bleed into cases affecting everyone from journalists to protesters.

The case will determine whether the federal threat statute, Section 875(c), should be read to require proof of "subjective intent" to threaten. As the law currently stands, a person can face federal charges, up to five years in prison, and a $250,000 fine for threatening to injure someone over the internet, via telephone, or through any other kind of interstate or international communication.

The Supreme Court decision in Elonis will be vital for determining how online harassment and threats are prosecuted going forward, specifically impacting cases like Gamergate, where the majority of the threats against female gamers were made on third-party message boards and on Twitter.

In other words, the question at the center of the Elonis case is this: If a threat is made against your life, and it wasn't emailed directly but was seen by everyone on a social media site like Facebook—did the threat really take place? Of course it did; welcome to the 21st century, where every part of our lives is lived through our social networks.

"Abusers and stalkers perpetrate their crimes where we live our lives: which is with one hand on our phone and the other on a tablet. If we didn't spend so much of our time on social media, abusers wouldn't make threats there," said Cindy Southworth, founder of the Safety Net Technology Project at the National Network to End Domestic Violence (NNEDV). "This isn't about Facebook—it's the same behavior and crime as if it were in the home or the town square."


Some writers have positioned the Elonis case as being about "rap lyrics and free speech," ignoring the abusive context in which Elonis's threats were made: his wife had already obtained a protective order that, by law, must be based on previous incidents of abuse, and Elonis had been fired from his job for sexually harassing female coworkers.


"This is about the context in which the threats were made," said Southworth, whose organization wrote en extensive amicus brief explaining the techie aspects of modern-day domestic violence.

"When we talk to victims, I don't know a single survivor of domestic violence or stalking who has only experienced one single tactic. They will threaten online and slash the tires," said Southworth. "What makes those online threats so scary is that they typically have physical violence happening at the same time. Once you've killed the family pet, that threat on Facebook has so much more meaning."

If Elonis is a sadly typical case of online domestic harassment, then why did associations representing every newspaper in the country weigh in with an amicus brief citing concerns about free speech?

"Anytime the Supreme Court is asked to decide something, it's not just going to decide the one case before it, it's going to set a new rule for all speech," Gregg Leslie, Legal Defense Director at the Reporters Committee for Freedom of the Press, told VICE. "The court reaches beyond this particular statute and could say something about what type of speech is not protected by the First Amendment.


"If you're going to criminalize speech, you're going to have to make sure the person intended what he said," Leslie added.

If the law moves toward making survivors of domestic violence (and abuse and rape) somehow "prove" that their attackers intended to harm them, it could become even more difficult than it already is not only to get victims to come forward, but also to prosecute perpetrators when they do.

Attorney Jennifer Long of Aequitas, a group that works with prosecutors in cases of gender-based violence and trafficking, told VICE that perpetrators of all kinds of crimes, from terrorism to domestic violence, use threats to silence witnesses and victims: "the bulk of intimidation is verbal."

"Oftentimes perpetrators are smart enough to do it through a third party or in a way that won't tie into a case but will still prevent a victim from doing something," Long said.

The transcript of Monday's Supreme Court oral arguments is a labyrinthine puzzle of semantic struggle over the phrase "true threat," with almost comic results. How many pages of discussion about the intent behind Eminem lyrics do we really need before we know that Eminem had nothing to do with Anthony Elonis being an abusive asshole to his wife?

Hopefully the court won't get so hung up on definitions of intent and threat speech, or so distracted by the completely unrelated issue of rap lyrics, that it forgets what is really at stake: the safety of women who are increasingly attacked online by intimate partners as well as by masses of trolls.

"A lot of online abuse is motivated by reasons other than the impact on the victim—online mobs do it for the lulz or one-upmanship and may not even know the subject of the abuse, as we saw in Gamergate and when celebrity nudes were hacked and distributed," Carrie Goldberg, attorney with the Cyber Civil Rights Initiative, told VICE. "What's essential is that the court prioritize the language and impact of the words—and not the speaker's intent."

Follow Mary Emily O'Hara on Twitter.