Creepy DeepFake Nudie App Goes Viral Disrobing Women With AI, Then Shuts Down

The developer of the rather disturbing and depraved 'DeepNude' application, which is designed to "undress" women using machine learning and AI technology, has shut down the operation following a backlash on social media. In a message posted to Twitter, the developer acknowledged that "the probability that people will misuse [the application] is too high." Ya think?

How or why the developer ever thought otherwise is a mystery. In a lame attempt to justify the app's existence, the developer further stated that the software was created "for user's entertainment" and is "not that great, it only works with particular photos."
DeepNude was built around pix2pix, an open source project developed two years ago by researchers at the University of California, Berkeley. It leverages machine learning through a GAN, or generative adversarial network, in which two neural networks, a generator and a discriminator, compete with one another to produce a realistic-looking image.
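
For readers curious about what "two networks competing" actually means in code, here is a minimal, hypothetical sketch of adversarial training in PyTorch. It is illustrative only and is not the pix2pix or DeepNude code; the toy task (matching a simple 2-D distribution), network sizes, and names are all assumptions made for the example.

```python
# Minimal GAN sketch (illustrative toy example, not pix2pix).
# The generator G learns to mimic a simple "real" 2-D distribution,
# while the discriminator D learns to tell real samples from fakes.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to a fake sample.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: scores how "real" a sample looks (1 = real, 0 = fake).
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

def real_batch(n=64):
    # Stand-in for real training data: points from a shifted Gaussian.
    return torch.randn(n, 2) * 0.5 + torch.tensor([2.0, 2.0])

for step in range(2000):
    # --- Train the discriminator on real vs. generated samples ---
    real = real_batch()
    fake = G(torch.randn(64, 8)).detach()  # detach: don't update G here
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # --- Train the generator to fool the discriminator ---
    fake = G(torch.randn(64, 8))
    g_loss = loss_fn(D(fake), torch.ones(64, 1))  # reward fooling D
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# After training, generated samples should cluster near (2.0, 2.0).
print("mean of generated samples:", G(torch.randn(1000, 8)).mean(dim=0))
```

pix2pix extends this basic adversarial setup to a conditional, image-to-image setting: the generator receives an input image rather than pure noise and learns a mapping from one image domain to another, which is how an application built on it can transform one photo into a different-looking one.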

"The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it," the developer told Motherboard. "All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future."

The DeepNude app is creepy and demeaning. Who thought this was a good idea? The app's home page now offers little information.

DeepNude only works with pictures of women. Though the developer describes the process as "slow," it is much quicker than manually manipulating a photo in a similar manner in Photoshop, and far faster than generating a deepfake video.

The ramifications of something like this are obviously creepy, sick and frightening. This is the sort of thing that can be used for revenge porn, in which a fabricated nude photo of a woman could be posted without her consent. Though the naked body in the image is not the actual person's, that hardly matters.

"Yes, it isn’t your actual vagina, but... others think that they are seeing you naked," Danielle Citron, a law professor at the University of Maryland Carey School of Law, told Motherboard. "As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore."

Deepfakes are emerging as a real problem in an era of fake news as well. Even outside the realm of naked photos and videos, we have seen some disturbing uses of the technology. It is not just imagery, either; a recent demonstration played audio clips that sounded like the voice of Joe Rogan but were actually computer generated.

The DeepNude app represents one of the worst applications of machine learning. Unfortunately, even though it has been yanked offline, the copies that are out in the wild still work. Moreover, the developer is probably correct that someone else would have created (and still will create) a similar app.