Meta sues 'nudify' app-maker that it claims ran 87k+ Facebook, Instagram ads

Despite 'multiple enforcement actions,' Joy Timeline HK allegedly wouldn't stop

Meta has sued an app maker for advertising on Facebook and Instagram a so-called "nudify" app that uses AI to generate nude and sexually explicit images and videos of individuals without their consent.

The social media giant on Thursday filed the lawsuit in Hong Kong against Joy Timeline HK Limited. The company is allegedly behind the popular "nudify" app Crush AI, and placed tens of thousands of ads on Facebook and Instagram that promised to "erase any clothes" and display "nudy versions" of users' fully clothed photos, in violation of Meta's policies.

Meta says it repeatedly removed these and other violating ads, shut down Facebook pages and Instagram accounts promoting these apps, and blocked links to Joy Timeline HK's various websites so users couldn't access them from Meta-owned platforms.

But despite taking "multiple enforcement actions" against the app maker since 2023, Joy Timeline HK continued running these not-safe-for-work (NSFW) ads on Facebook and Instagram, the lawsuit says.

"Given the steps taken by the Defendant to create multiple accounts to advertise the Nudify Apps in direct contravention of Meta's Terms and Policies … and even after Plaintiff Meta has taken steps to remove offending ads promoting the Nudify Apps, it is clear that unless restrained by a competent court, the Defendant will continue to publish such Violating Ads on Facebook and Instagram," according to the court documents.

These ads primarily targeted users in the US, Canada, UK, Australia, and Germany, according to the court documents. And as of February, more than 135 Facebook pages displayed over 87,000 ads for nudify apps, placed through at least 170 business accounts on Facebook and Instagram, the lawsuit says.

One of these "depicted a woman in a black crop top and bottom," according to the lawsuit:

The image is also split, with the left side showing her in the clothing with an "NSFW" label, and the right side showing her with her top and bottom digitally removed, and the words "BRA OFF" and "PANTS OFF". The Violating Ad is accompanied by the captions "Upload a photo to strip for a minute"; "Upload a photo to generate a dance video" and "Everything you want is here".

"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," according to a Meta blog about the lawsuit. "We'll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this."

In addition to suing the app maker, the tech giant said it's taking other actions to prevent these types of explicit deepfake and AI-based services from advertising online. 

"When we remove ads, accounts or content promoting these services, we'll share information — starting with URLs to violating apps and websites — with other tech companies through the Tech Coalition's Lantern program, so they can investigate and take action too," according to the blog. 

Meta said it has provided more than 3,800 unique URLs to participating tech companies since March.

Plus, it's upped its own policing of advertisements for nudify apps, and "developed new technology specifically designed to identify these types of ads – even when the ads themselves don't include nudity – and use matching technology to help us find and remove copycat ads more quickly." ®
