Only 24 hours passed from the moment the media and social networks learned about the application.
The authors of DeepNude, an application that “removes” clothing from photos using neural networks, announced on Twitter that they were shutting the service down. The developers explained that they had not expected the application to become so popular.
According to the creators, the world is not yet ready for DeepNude, because the risk of the service being abused is too high.
Here is the brief history and the end of DeepNude. We created this project for users’ entertainment a few months ago. We thought we would have a few sales each month, which we would process manually. Honestly, the app is not that good; it only works with certain photos. We never expected it to go viral, or that we would be unable to handle the traffic. We greatly underestimated the demand.
Despite the safety measures we adopted (watermarks), if 500 thousand people use it, the likelihood of abuse is too high. We don’t want to make money this way. Surely some copies of DeepNude will be shared on the Internet, but we don’t want to be the ones selling it. Downloading the software from other sources or sharing it by any other means is against the terms of our website. From now on, DeepNude will not release any other versions and does not grant anyone permission to use existing ones, not even the premium version.
We will refund those who purchased the premium version but have not yet had a chance to upgrade. The world is not yet ready for DeepNude.
— Statement from the DeepNude creators
The application launched on June 23, but the media only learned about it on June 26. DeepNude allows users to create fake, yet fairly realistic, nude pictures of women.
The author of the service, who introduced himself as Alberto, said that he had trained the neural network on 10 thousand photographs of naked women. He also planned to add a feature for “undressing” men in the future.