GitHub recently removed code from its website that used neural networks to algorithmically strip clothing from images of women. Multiple code repositories were spun off from an app called DeepNude, a highly invasive piece of software specifically designed to create realistic nude images of women without their consent.
The removals show that after DeepNude's creator pulled the plug on his own invention late last month following a media and public backlash, some platforms are now stopping the spread of similar tools.
"We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy," a GitHub spokesperson told Motherboard in a statement. "We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines."
The "Sexually Obscene" section of GitHub's Community Guidelines states: "Don't post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited. We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors."
At least one of the codebases was a knock-off of the DeepNude app rather than the original. The README file of one of the repositories read, "this is a work-in-progress open-source reimplementation of DeepNude based on reverse-engineering the original."
Another said, "This repo contains only the core algorithm, not the user interface."
Do you know anything else about DeepNude or deepfakes? Are you a victim of them? We'd love to hear from you. You can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, OTR chat on firstname.lastname@example.org, or email email@example.com.
In June, Motherboard first reported the existence of DeepNude, whose anonymous creator was selling Windows and Linux versions of the app for $50. It worked by taking a clothed image of a woman, then swapping the clothes for naked breasts and a vulva using generative adversarial networks (GANs). Motherboard verified the program worked at the time.
A day after Motherboard's report, the creator stopped offering downloads of the program. "We don't want to make money this way," he wrote in a statement posted to Twitter. "Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones to sell it."
Subscribe to our new cybersecurity podcast, CYBER.