Distributing nonconsensual intimate images is now illegal across the US under a new law covering both real and artificially created imagery. The bill was signed into law on May 19 by US President Donald Trump after passing through Congress with nearly unanimous support last month.
Under the law, websites must remove reported imagery within 48 hours of a notification from a victim. The Federal Trade Commission (FTC) will enforce the rules, and services must “make reasonable efforts” to take down copies as well as the original images.
Enforcement won’t begin immediately. Online platforms have up to a year to put a removal process in place before the FTC starts enforcing the rules, though some social media platforms may introduce one before the deadline.
People found to be distributing such images face up to three years in prison, as well as fines. Many states already had laws banning this type of abuse, but this marks the first US-wide law of its kind.
AI tools for making deepfake pornography have proliferated in recent years. Elon Musk’s Grok AI tool was recently found to generate fake pornographic images when users asked it to “remove her clothes” from various posts, and a spate of dedicated deepfake pornography apps has also appeared.
Not everyone agrees this new law is the right way forward on “revenge porn”. Critics include the Cyber Civil Rights Initiative (CCRI), an organization which describes its mission as combatting “online abuses that threaten civil rights and civil liberties.”
Mary Anne Franks of the CCRI said, “The Take It Down Act also includes a poison pill: an extremely broad, easily abused takedown provision that will likely end up hurting victims more than it helps. It lacks adequate safeguards against false reports, is over- and under-inclusive, and gives false hope to victims.”
When the bill was passing through Congress, the CCRI criticized its enforcement through the FTC. A spokesperson for the CCRI said, “Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII.”
The original Take It Down Act passed through Congress with a 409-2 vote in April.