
Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline

The creator of an app that algorithmically 'undressed' images of women has taken it offline, citing server overload and potential harms.
Screenshot from the DeepNude website.

The creator of DeepNude, an app that used a machine learning algorithm to "undress" images of clothed women, announced Thursday that he's killing the software after viral backlash over the way it objectifies women.

On Wednesday, Motherboard reported that an anonymous programmer who goes by the alias "Alberto" created DeepNude, an app that takes an image of a clothed woman and, with one click and a few seconds, turns it into a nude by algorithmically superimposing realistic-looking breasts and a vulva onto her body.


The algorithm uses generative adversarial networks (GANs) and is trained on thousands of images of naked women. DeepNude only works on images of women, Alberto said, because nude images of women are easy to find by the thousands in online porn.

Following Motherboard's story, the server for the application, which was available for Linux and Windows, crashed.

By Thursday afternoon, the DeepNude Twitter account announced that the app was dead: no other versions would be released, and no one else would be granted access to the app.

"We created this project for users' entertainment months ago," he wrote in a statement attached to a tweet. "We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic."

When I spoke to Alberto by email on Wednesday, he said that he had grappled with questions about the morality and ethics of the app. "Is this right? Can it hurt someone?" he said he asked himself. "I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial)," he said. If the technology is out there, he reasoned, someone would eventually build it anyway.

Since then, according to the statement, he has decided that he doesn't want to be the one responsible for this technology.

"We don't want to make money this way," the statement said. "Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones to sell it.” He claimed that he's just a "technology enthusiast,” motivated by curiosity and a desire to learn. This is the same refrain the maker of deepfakes gave Motherboard in December 2017: that he was just a programmer with an interest in machine learning. But as the subsequent rise of fake revenge porn created using deepfakes illustrated, tinkering using women's bodies is a damaging, sometimes life-destroying venture for the victims of "enthusiasts."

"The world is not yet ready for DeepNude," the statement concluded. But as these victimizing algorithms and apps show, there is no simple solution for technology like DeepNudes, and the societal attitudes that erase women's bodily autonomy and consent.