No DeepNudes please, we're GitHub: Code repo deep-sixed as Discord bans netizens who sought out vile AI app

Including one of El Reg's Discord accounts. Oops!

GitHub has deleted a repository containing partial blueprints of DeepNude, the notorious AI-powered app that stripped clothes from women in photos to generate fake naked pics.

These fragments of decompiled Python source code revealed the inner workings of the software, and were generated from a copy of the DeepNude application that was briefly distributed and sold a couple of weeks ago. The material was placed in a now-removed GitHub repo, seemingly to encourage others to build new variants of DeepNude using the decompiled algorithms, along with the app's neural network models, as a guide.

And it wasn't the only repo removed from GitHub for containing DeepNude-based source code.

These takedowns come after Discord, a popular multiuser chat system used by gamers and geeks, banned several server instances selling or sharing the DeepNude software. It appears Discord has also banned users who sought out the app on its chat servers, or helped distribute it.

A Discord account used by The Register to – for journalistic reasons, honest – investigate the spread of cracked versions of the paid-for software was terminated on Monday night this week.

“Discord is focused on maintaining a safe and secure environment for our community, and your account has been flagged by the Discord community for violations of our Terms of Service and Community Guidelines,” the email from Discord, informing us our account was toast, read.

"Our team has reviewed the claim and taken action by disabling your account. Your account was directly posting non-consensual pornography or was involved in servers which were dedicated to such content."

El Reg has asked Discord for more details on this crackdown. The chat system biz previously told us that it was investigating any reported terms of service violations by a server or user.

“Non-consensual pornography warrants an instant shut down on the servers and ban of the users whenever we identify it,” it warned. DeepNude's generated images of women wearing no clothes, digitally imagined from their clothed selfies, are a form of non-consensual pornography.

Outrage

The furor over DeepNude began last month, when it was discovered that a group of seedy developers had built a Windows and Linux desktop app, and an Android variant, that allowed anyone who got hold of the tool to gawk at what women might look like naked. Free and paid-for versions were available to download.

No deep-learning knowledge was required to use the app: just feed it a photo of a scantily clad woman, and it’ll automagically spit out the same image with the clothes replaced with what may lie underneath, or rather a neural network's best guess. Trained on thousands of porno pics of women in various states of undress, it sometimes works well, depending on the lighting and framing of the input photo.

When this misogynistic monstrosity went viral, and the internet detonated with rage, its creators swiftly deleted the application from their website, begged people to stop using and copying it, and tried to pretend it all never happened.

But it was too late: the premium build of the software was cracked, and resold or distributed over file-sharing networks and via Discord servers. Computer science nerds also had a go at reverse engineering the code and, as we said, posted the algorithms and decompiled source on sites like GitHub for other perverts to use.

Eventually, someone reported the repos as inappropriate to GitHub, which swiftly removed them on the grounds they were “sexually obscene.” A GitHub spokesperson told The Register in an email last night: “We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our terms of service and community guidelines.”

Microsoft-owned GitHub’s acceptable use policy states: “Under no circumstances will users upload, post, host, or transmit any content to any repositories that is or contains sexually obscene content.”

The open-source code dormitory claims its staff are not manually scouring GitHub-hosted projects for DeepNude implementations, and instead are relying on developers to report any nasty stuff spotted.

“We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy,” a GitHub spokesperson said.

At one point, The Register found multiple DeepNude-related repositories on GitHub, so it may be difficult to eradicate them all, especially if they keep popping up, whack-a-mole style.

“We don’t disclose number of reports per project,” the spokesperson added. ®
