Several hours after making it available, the software developer behind an app that uses Artificial Intelligence (AI) to make photos of clothed women appear nude has taken it down. Called DeepNude, the app takes almost any photo of a woman and creates a near-realistic image of her in the nude.

The app only worked on photos of women. According to its anonymous creator, who also sold a premium paid version, that was because most nude photos available online are of women. This explanation rings hollow, however, and hints at the creator's misogyny, as suggested by the app's Twitter bio, which reads "The superpower you always wanted."

"Here is the brief history, and the end of DeepNude. We created this project for user's entertainment a few months ago. We thought we were selling a Few sales every month in a controlled manner. Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request." reads a statement by the app's creator.

DeepNude, although it specifically targeted women, uses an AI technique that has been in use for several years to create fake images and videos, mostly of well-known people. Named deepfakes, after the pseudonymous online account that popularized the technique, it uses deep learning: the system is "fed" large numbers of photos, from which it learns to generate fake images and videos that look very realistic.

To be slightly more specific, DeepNude uses what is known as a GAN (Generative Adversarial Network). A GAN is a deep learning technique in which, to put it quite simply, two neural networks compete: a generator creates candidate images, while a discriminator judges whether each image is real or fake. The discriminator's feedback tells the generator where its fakes still fall short, and the generator adjusts accordingly. This process is repeated until the generator produces images the discriminator can no longer reliably tell apart from real ones.
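The adversarial loop described above can be sketched in miniature. The toy example below is an assumption for illustration only, not DeepNude's code: instead of images, the generator learns to mimic a simple one-dimensional Gaussian distribution, with a linear generator and a logistic-regression discriminator trained by hand-derived gradients. The setup, parameter names, and hyperparameters are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real data": samples from a 1-D Gaussian the generator must mimic.
REAL_MEAN, REAL_STD = 4.0, 1.25

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Generator: g(z) = w_g * z + b_g with z ~ N(0, 1), so fakes follow N(b_g, w_g^2).
w_g, b_g = 0.1, 0.0
# Discriminator: D(x) = sigmoid(w_d * x + b_d), the probability that x is real.
w_d, b_d = 0.1, 0.0

lr, batch = 0.02, 32
for step in range(2000):
    # --- Discriminator step: push D(real) toward 1 and D(fake) toward 0 ---
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(size=batch)
    fake = w_g * z + b_g
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    # Gradient ascent on log D(real) + log(1 - D(fake))
    w_d += lr * (np.mean((1 - d_real) * real) + np.mean(-d_fake * fake))
    b_d += lr * (np.mean(1 - d_real) + np.mean(-d_fake))

    # --- Generator step: make the discriminator call the fakes real ---
    z = rng.normal(size=batch)
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    # Gradient ascent on the non-saturating objective log D(fake)
    w_g += lr * np.mean((1 - d_fake) * w_d * z)
    b_g += lr * np.mean((1 - d_fake) * w_d)

samples = w_g * rng.normal(size=1000) + b_g
print(f"generator output: mean={samples.mean():.2f}, std={samples.std():.2f}")
```

After training, the generator's output distribution should drift toward the real one, which is exactly the dynamic that, at vastly larger scale and on image data, lets a GAN produce photorealistic fakes.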

DeepNude first launched as a website that showed a demo of how it works. On 23 June 2019, downloadable Windows and Linux applications were made available. A day later, the server hosting the applications crashed amid complaints that the app was an invasion of privacy and unfairly targeted women.

Where to from here?

As mentioned, deepfakes have been around for several years. They have also, for several years, been used to generate fake pornography using well-known people's faces.

The technology exists and is unlikely to be completely eradicated given that it is software-based. As such, banning it might only work in theory. On the other hand, there are ethical and moral implications to making such technology available to anyone: as we are witnessing, it can be used for malicious purposes and to prejudice certain sections of society.

"Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use. Not even the licenses to activate the Premium version. People who have not yet upgraded will receive a refund. The world is not yet ready for DeepNude."

Despite being taken down by its creator, it is important to note that the versions of the software already downloaded by hundreds of thousands of people are still out there and usable.
