Reddit user cracked the premium version of DeepNude, which “strips” women in photos

The creators removed DeepNude from their site, but copies remained online. The application has now been cracked so that it generates photos without watermarks.

A Reddit user created a crack for DeepNude, an application that “removes” clothing from photos using neural networks. The patch provides free access to the premium version, which generates photos without any watermarks.

The creator of the crack said he had spent almost four hours building the patch. He discovered that DeepNude stores the user's email address in the metadata of generated images, probably to keep track of who created a given “stripped” photo. This tracking applies only to users who paid for a premium account.
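The metadata claim is easy to check without any image library: text chunks in PNG and JPEG metadata (PNG `tEXt` chunks, EXIF comments) are stored as plain uncompressed bytes, so scanning the raw file for email-like strings will usually surface them. A minimal sketch, assuming nothing about DeepNude's actual output format (the file path and regex are illustrative):

```python
import re

# Email-like ASCII pattern; text metadata chunks are stored
# uncompressed, so a raw byte scan over the file can find them.
EMAIL_RE = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_embedded_emails(data: bytes) -> list:
    """Return any email-like strings found in the raw bytes of a file."""
    return [m.decode("ascii") for m in EMAIL_RE.findall(data)]

# Usage (path is hypothetical):
# with open("output.png", "rb") as f:
#     print(find_embedded_emails(f.read()))
```

This does not prove where in the file the address lives, but a hit in an image you generated yourself would confirm that some identifier is being embedded.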

The user also noted that the author of DeepNude did not invent anything new: he took existing machine-learning libraries and a dataset of photos of women, and combined them into a single program. Building a similar application would therefore be easy, the crack's creator said. By the time of writing, he had deleted his Reddit account.

A TJ editor tested the crack: photos are generated without the large watermarks of the free version, and even without the “Fake” stamp of the premium version. Previously, DeepNude had no option to generate photos without any watermark at all. The cracked application also lets users download the generated image.

Photo of singer Olga Seryabkina generated in the cracked version of DeepNude

Photo of Vera Brezhnev generated in the cracked version of DeepNude

Photo of Taylor Swift generated in the cracked version of DeepNude

On June 23, an anonymous developer launched DeepNude, which “removes” clothing from photos of women using artificial intelligence. The media discovered it three days later, after which it gained popularity on social networks.

On June 26, the application's authors announced on their Twitter account that they were shutting down the service. The developers explained that they had not expected the application to become so popular. They decided that the risk of abuse was too high, and that “the world is not yet ready for DeepNude.”
