DeepNude's makers tried to deep-six their pervy AI app. Web creeps have other ideas: Cracked copies shared online as code decompiled

This genie is definitely not going back in the bottle


From the department of closing the barn door after the horse has warped away at light speed comes this latest news. Although the creators of DeepNude have torn down their slimy software that removes clothes from women in photos, the code continues to spread all over the internet.

Some devs are even trying to reverse-engineer the application to produce their own improved builds, or release it as open source.

The original version of DeepNude – a machine-learning app that replaced women’s clothing in photos with an approximation of what they would look like underneath, thus creating fake naked images of them – was seemingly abandoned last week after its developers, based in Estonia, realized, with the help of some online outrage, that their misogynistic creation could be misused. They took down their website, halted any further downloads and activations, and asked people to forget the whole affair, all within a few days of launching the thing.

“Downloading the software from other sources or sharing it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use,” the coders said last week. Some in the AI world sighed with relief that the photo menace had gone almost as soon as it had arrived.

However, funnily enough, the team's halfhearted plea hasn't stopped this software nasty from spreading, nor stopped people from using it. The application, available for Windows, Linux, and Android, was downloaded by hordes of internet pervs, who are now sharing the packages on file-sharing networks.

A Reg reader tipped us off that hundreds of users are slinging cracked copies of the premium version of DeepNude, which cost $50 apiece, around file-sharing sites and networks: a number of copies, unsurprisingly, contain malware. And that's not all. Details on how to use the software, along with links to downloads, are being shared on YouTube, while versions are being sold on shady Bitcoin forums and touted across Discord, an instant-chat platform popular with geeks and gamers.

It appears programmers have fixed various bugs – the software was known to be crashtastic – and removed the automatic placing of watermarks on generated images. These marks labeled the doctored photos as "fake."

The Register poked around a few of these Discord servers. “We are happy to announce that we have the complete and clean version of DeepNude V2 and cracked the software and are making adjustments to improve the program,” announced one we found to its thousands of members.

The above message also appeared on another Discord server that was shut down on Monday, as reported by Vice. Another DeepNude-touting server we saw was home to more than 20,000 netizens eager to get hold of the software.

"The sharing of non-consensual pornography is explicitly prohibited in our terms of service and community guidelines," a Discord spokesperson told The Register.

"We will investigate and take immediate action against any reported terms of service violation by a server or user. Non-consensual pornography warrants an instant shut down on the servers and ban of the users whenever we identify it."

Cracking DeepNude

You don’t have to look far to find decompiled versions of the neural-network software, either, revealing in part the internal workings of the application. Coders have been reverse engineering the original DeepNude software, or building their own flavors from scratch after studying its operation, and posting their results on GitHub, for instance.

It appears DeepNude was written in Python that was subsequently compiled into executable binaries. It uses PyTorch, Qt, Numpy, and various other libraries to achieve its goal, and can accelerate its image generation using Nvidia GPUs via CUDA. The software's trained model is split over three files, totaling 2.1GB, that, in normal use, are downloaded once from a since-deactivated AWS S3 bucket after the app is installed. Copies of the software floating around file-sharing networks include this trained model so it can be used despite the cloud bucket deactivation.
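For the curious, that first-run fetch is a bog-standard pattern. Below is a minimal Python sketch of how such a downloader might look – the bucket URL and checkpoint file names are our own placeholders, not DeepNude's, and the real bucket is dead anyway:

    import os
    import urllib.request

    import torch

    BASE_URL = "https://example-bucket.s3.amazonaws.com/"        # placeholder, not the real bucket
    CHECKPOINTS = ["model_a.lib", "model_b.lib", "model_c.lib"]  # hypothetical file names
    CACHE_DIR = os.path.expanduser("~/.modelcache")

    def fetch_checkpoints():
        # Download each model file once; later runs reuse the local copies
        os.makedirs(CACHE_DIR, exist_ok=True)
        paths = []
        for name in CHECKPOINTS:
            path = os.path.join(CACHE_DIR, name)
            if not os.path.exists(path):
                urllib.request.urlretrieve(BASE_URL + name, path)
            paths.append(path)
        return paths

    # torch.load would then deserialize each checkpoint for the app's networks
    models = [torch.load(p, map_location="cpu") for p in fetch_checkpoints()]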

Interestingly, the code subtly alters the trained-model data three times in an obfuscated manner, replacing short strings of bytes, before using it, probably to prevent other applications from loading its trained model.
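In other words, the loader patches the raw bytes back into shape before deserializing the model. A rough sketch of that kind of fix-up – the three byte pairs below are placeholders, as the actual substitutions aren't public – might look like this:

    import io

    import torch

    PATCHES = [                      # hypothetical (search, replace) byte pairs
        (b"\x00\x01", b"\x7f\x7f"),
        (b"\x02\x03", b"\x7e\x7e"),
        (b"\x04\x05", b"\x7d\x7d"),
    ]

    def load_deobfuscated(path):
        # Read the raw checkpoint, undo each byte substitution, then deserialize
        with open(path, "rb") as f:
            data = f.read()
        for search, replace in PATCHES:
            data = data.replace(search, replace)
        return torch.load(io.BytesIO(data), map_location="cpu")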

Its algorithms are based on pix2pix, a generative adversarial network (GAN) research project run by boffins at the University of California, Berkeley back in 2017. This image-to-image technology generates new pictures from input photos or sketches, drawing upon its training data to paint the scene. For example, pix2pix can be trained to turn outlines of cats into fully rendered AI-generated cats, or black-and-white images into color snaps.

In the case of DeepNude, however, it’s bikinis to boobs.
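Strip away the sleaze and the machinery is ordinary image-to-image translation: a generator network maps an input image tensor to an output of the same size. Here's a drastically simplified PyTorch sketch – the real pix2pix generator is a U-Net with skip connections, and we make no claims about DeepNude's exact layers:

    import torch
    import torch.nn as nn

    class TinyGenerator(nn.Module):
        def __init__(self):
            super().__init__()
            # Toy encoder-decoder; pix2pix proper uses a much deeper U-Net
            self.down = nn.Sequential(
                nn.Conv2d(3, 64, kernel_size=4, stride=2, padding=1),   # 256x256 -> 128x128
                nn.LeakyReLU(0.2),
            )
            self.up = nn.Sequential(
                nn.ConvTranspose2d(64, 3, kernel_size=4, stride=2, padding=1),  # back to 256x256
                nn.Tanh(),  # pixel values in [-1, 1], the usual GAN convention
            )

        def forward(self, x):
            return self.up(self.down(x))

    gen = TinyGenerator().eval()
    with torch.no_grad():
        fake = gen(torch.randn(1, 3, 256, 256))  # photo-shaped tensor in, generated image out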

The neural network would, we imagine, be trained on thousands of pairs of images: for each pair, one would be a woman in a swimsuit, and the other ideally the same woman naked in a similar pose. The model learns the outline and shape of women's bodies, and what they should look like underneath. When it's fed an image of a scantily clad woman, it replaces the clothing with the naughty bits to make her appear naked. This works particularly well if the input image looks like something akin to a modeling shoot, since a lot of the training data will involve pictures taken from porno picture albums.
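Plumbing-wise, that sort of paired training data is usually served up as aligned (input, target) images, as in this generic PyTorch sketch – the directory layout is illustrative, not DeepNude's:

    import os

    from PIL import Image
    from torch.utils.data import Dataset
    from torchvision import transforms

    class PairedImageDataset(Dataset):
        """Yields aligned (input, target) photo pairs for pix2pix-style training."""

        def __init__(self, root):
            self.root = root
            # Assumes matching file names under root/input and root/target
            self.names = sorted(os.listdir(os.path.join(root, "input")))
            self.to_tensor = transforms.Compose([
                transforms.Resize((256, 256)),
                transforms.ToTensor(),
            ])

        def __len__(self):
            return len(self.names)

        def __getitem__(self, i):
            name = self.names[i]
            x = Image.open(os.path.join(self.root, "input", name)).convert("RGB")
            y = Image.open(os.path.join(self.root, "target", name)).convert("RGB")
            return self.to_tensor(x), self.to_tensor(y)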

Ironically, a group of researchers from the Pontifical Catholic University of Rio Grande do Sul, Brazil, built a similar model last year. Instead of removing clothes, though, it pastes them onto photos of naked women. The DeepNude app acts in reverse, swapping bras for breasts. ®

