Meta is cracking down on ads that market “nudify” apps, which let people generate fake naked imagery of real-life people.
Facebook and Instagram are two of the biggest sources of advertising for these apps, which can be used to make AI revenge porn. Meta is taking legal action against one of the biggest offenders, Crush AI, by suing developer Joy Timeline HK Limited.
“We’ve filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms,” the company says.
In January, one report found that an estimated 90% of traffic to Crush AI came from Meta’s Instagram. CBS News also found “hundreds” of ads on Meta platforms for similar apps.
These services have violated Meta's policies for some time, but Crush AI repeatedly evaded detection by creating new advertiser profiles in the company's ad tools and using domain names that redirected elsewhere.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta says. “We’ll continue to take the necessary steps, which could include legal action, against those who abuse our platforms like this.”
Meta says it has developed new detection tools to catch repeat offenders, which should flag these ads even when they don't include nudity. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases, and emojis that our systems are trained to detect within these ads."
Meta also says it will step up efforts to share information with rivals and partners through the Tech Coalition's Lantern program, in the hope that pooling this data will help authorities take action. Meta says the process is similar to how it reports "violating child safety activity."
Earlier this month, Meta was criticized by its independently run Oversight Board, which found the company had under-enforced its rules against advertisers using AI-manipulated videos of celebrities to promote scams. The most recent example is an AI-generated video of Brazilian soccer player Ronaldo Nazário endorsing an online game.
Meanwhile, it’s now illegal to distribute nonconsensual intimate images in the US after President Trump signed the Take It Down Act last month. Websites must now “make reasonable efforts” to take down content and copies within 48 hours of a notification from a victim.
Last year, Apple removed several generative AI apps from its App Store after a 404 Media investigation found they could be used to create nonconsensual nude images. A few months later, San Francisco sued 16 websites that use AI to help users "undress" or "nudify" photographs of women and girls. xAI's Grok has also fulfilled requests to "undress" people, though it generated images of them in bikinis or lingerie rather than fully nude.