Which Tech Giant Will Build a Revenge Porn Database?

Facebook, Twitter, Microsoft and others use pooled data to stamp out child abuse imagery. Some revenge porn victims want something similar, but who is going to make it?

Tech giants, from Twitter to Facebook, have a weapon for eradicating child abuse imagery from their platforms. ‘PhotoDNA’, developed by Microsoft, automatically detects offending images by looking for similar or identical pictures. The key, though, is the sharing of that data: when child protection organizations flag a new image, the tech companies can spot it too.

But no tool that works across Facebook, Twitter, and other social media sites exists for a still pervasive problem: revenge porn, or non-consensually shared pictures. Revenge porn is spreading further afield and onto more mainstream platforms, such as gaming-focused chat service Discord and work communications firm Slack. So, why isn’t there a similar solution for revenge porn?

“Victims who have the same photos being shared repeatedly, or have been threatened or blackmailed could use a system like this to prevent being victimized again,” Katelyn Bowden, creator of the anti-revenge porn activism group Badass Army, whose own intimate images have been traded online, told Motherboard.

Motherboard contacted seven companies—including Twitter, Facebook, Dropbox, and Google—many of which already use a system to detect child abuse material, to gauge interest in a revenge porn-focused equivalent.

“Discord is always open to working with peer organizations to develop ways to, as an industry, limit and ultimately get rid of this kind of illegal content,” a spokesperson from Discord wrote in an email. In the last few months, people sharing revenge porn images have moved from forums like Anon-IB onto Discord, where they set up chat rooms dedicated to particular colleges, states, or other locations.

Rachel, a member of Badass who asked that only her first name be used, and whose images are already flagged on social networks because they were taken when she was underage, said in an email: “If the app Discord agreed to it and invited this sort of PhotoDNA scan [for revenge porn] to protect users, that would be a big deal and beneficial.”

But the rest of the companies either did not respond or declined to comment. Dating services OkCupid and Tinder did not respond either.

Facebook in particular might be in a position to bootstrap a revenge porn database. The company already handles complaints of revenge porn, and last year it launched a pilot program in Australia in which people worried about the distribution of their intimate images could preemptively upload a copy to Facebook, which would then ‘hash’ the image.

Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

A hash is a fingerprint of an image: identical images produce the same hash, so if someone uploads a copy of a flagged image to Facebook, the company can block it before the picture is shared. PhotoDNA, the tool for child abuse material, can still detect an image even after it has been manipulated somewhat, and Facebook’s revenge porn tools may be similar: Facebook’s Chief Security Officer Alex Stamos previously tweeted at this reporter that there are algorithms that can create a fingerprint of an image or video resilient to things like resizing.
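
To illustrate the general idea (PhotoDNA's own algorithm is not public, and this is not it), a basic "average hash" shrinks an image to a tiny grayscale thumbnail and records which pixels are brighter than average; near-identical copies of a picture then produce fingerprints that differ by only a few bits. The following minimal Python sketch uses the Pillow imaging library; the file names and the matching threshold are hypothetical.

# Minimal perceptual "average hash" sketch -- illustrative only, not PhotoDNA.
# Assumes Pillow is installed (pip install Pillow); file names are hypothetical.
from PIL import Image

def average_hash(path, size=8):
    """Return a 64-bit fingerprint: one bit per pixel of an 8x8 grayscale thumbnail."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(hash_a, hash_b):
    """Count differing bits; a small distance suggests a near-duplicate image."""
    return bin(hash_a ^ hash_b).count("1")

# Hypothetical usage: flag an upload if its fingerprint is close to a known one.
known = average_hash("reported_image.jpg")
upload = average_hash("new_upload.jpg")
if hamming_distance(known, upload) <= 5:  # threshold is an assumed, tunable value
    print("Possible match with a previously reported image")

Because the fingerprint is computed from a downscaled thumbnail rather than the raw file bytes, resizing or lightly re-compressing the picture tends not to change it much, which is the property Stamos was describing.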

“You can't make the fingerprint algorithm public (these things are too brittle), so you [the revenge porn victim] have to provide someone with a copy of the image (e.g. Facebook),” Nicholas Weaver, a senior researcher from the International Computer Science Institute at the University of California, Berkeley, told Motherboard in a Twitter message.

But to be clear, a database of image hashes is not a perfect solution for revenge porn victims.

“The true problem lies in the fact that when dealing with victims of revenge pornography, for this hash system to work you would need absolute participation from the victim. As we know victims of sex crimes are not statistically known for coming forward or making themselves vulnerable,” Rachel added. Some people may be hesitant about providing nude images to a company in the first place, even if the firm doesn't keep copies of the images themselves.

On top of that, sites such as Anon-IB, the epicenter of revenge porn, are unlikely to deploy a PhotoDNA-style tool on their own pages.

Of course, no security solution is 100 percent effective. PhotoDNA doesn’t reach dark web child abuse sites, for example, but that hasn’t stopped Microsoft and other tech companies from scanning their own services for offending images. As revenge porn continues to spread across platforms and services, there is clearly some appetite for more technical solutions.

“I would love for my images to be hashed so that I do not end up in the same situation,” Kelsey Bressler, Badass’ web developer, told Motherboard in an email.