No DeepNudes please, we're GitHub: Code repo deep-sixed as Discord bans netizens who sought out vile AI app

Including one of El Reg's Discord accounts. Oops!

GitHub has deleted a repository containing partial blueprints of DeepNude, the notorious AI-powered app that stripped clothes from women in photos to generate fake naked pics.

These fragments of decompiled Python source code revealed the inner workings of the software, and were generated from a copy of the DeepNude application that was briefly distributed and sold a couple of weeks ago. The material was placed in a now-removed GitHub repo, seemingly to encourage others to build new variants of DeepNude using the decompiled algorithms, along with the app's neural network models, as a guide.

And it wasn't the only repo removed from GitHub for containing DeepNude-based source code.

These takedowns come after Discord, a popular multiuser chat system used by gamers and geeks, banned several server instances selling or sharing the DeepNude software. It appears Discord has also banned users who sought out the app on its chat servers, or helped distribute it.

A Discord account used by The Register to – for journalistic reasons, honest – investigate the spread of cracked versions of the paid-for software was terminated on Monday night.

“Discord is focused on maintaining a safe and secure environment for our community, and your account has been flagged by the Discord community for violations of our Terms of Service and Community Guidelines,” the email from Discord, informing us our account was toast, read.

"Our team has reviewed the claim and taken action by disabling your account. Your account was directly posting non-consensual pornography or was involved in servers which were dedicated to such content."

El Reg has asked Discord for more details on this crackdown. The chat system biz previously told us that it was investigating any reported terms of service violations by a server or user.

“Non-consensual pornography warrants an instant shut down on the servers and ban of the users whenever we identify it," it warned. DeepNude's generated images of women wearing no clothes, digitally imagined from their clothed selfies, are a form of non-consensual pornography.


The furor over DeepNude began last month when it was discovered that a group of seedy developers had built a Windows and Linux desktop app, and an Android variant, that allowed anyone who got hold of the tool to gawk at what women might look like naked. Free and paid-for versions were available to download.

No deep-learning knowledge was required to use the app: just feed it a photo of a scantily clad woman, and it would automagically spit out the same image with the clothes replaced with what may lie underneath – or rather, a neural network's best guess. Having been trained on thousands of pornographic pictures of women in various states of undress, it sometimes worked well, depending on the lighting and framing.

When this misogynistic monstrosity went viral, and the internet detonated with rage, its creators swiftly deleted the application from their website, begged people to stop using and copying it, and tried to pretend it all never happened.

But it was too late: the premium build of the software was cracked, and resold or distributed over file-sharing networks and via Discord servers. Computer science nerds also had a go at reverse engineering the code and, as we said, posted the algorithms and decompiled source on sites like GitHub for other perverts to use.

Eventually, someone reported the repos as inappropriate to GitHub, which swiftly removed them on the grounds of being “sexually obscene.” A GitHub spokesperson told The Register in an email last night: “We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our terms of service and community guidelines.”

Microsoft-owned GitHub’s acceptable use policy states: “Under no circumstances will users upload, post, host, or transmit any content to any repositories that is or contains sexually obscene content.”

The open-source code dormitory says its staff are not manually scouring GitHub-hosted projects for DeepNude implementations, and are instead relying on developers to report any nasty stuff they spot.

“We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy,” a GitHub spokesperson said.

At one point, The Register found multiple DeepNude-related repositories on GitHub, so it may be difficult to eradicate them all, especially if they keep popping up whack-a-mole style.

“We don’t disclose number of reports per project,” the spokesperson added. ®
