A new federal law criminalizes nonconsensual intimate imagery and gives covered websites, mobile applications, and other online platforms just 48 hours to comply with requests to take down such material. On May 19, 2025, President Trump signed into law S. 146, the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, better known as the Take It Down Act. The new law, which passed both the House and the Senate by wide margins, was championed by First Lady Melania Trump, who attended the signing ceremony.
The law criminalizes the publication and threatened publication of:
- explicit deepfakes, which the Take It Down Act refers to as “digital forgeries,” defined as nonconsensual intimate imagery “that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the [subject] individual,” as well as
- authentic “intimate visual depictions” of an “identifiable individual” who did not consent to the publication of such content.
Importantly, the scope of “digital forgeries” encompasses both AI-generated and non-AI-generated digital imagery. Exceptions include publication in connection with law enforcement investigations and disclosures made reasonably and in good faith for legitimate medical or scientific purposes. Violators of the law’s prohibitions against publication may be imprisoned for up to two years, and threatened publication is punishable by up to 18 months’ imprisonment. Elevated penalties apply to content depicting minors.
The impact of the Take It Down Act will be felt by a large universe of websites and other online services, which will be obligated to comply with the statute’s strict requirements for facilitating requests to remove regulated content. The law applies to any “covered platform,” defined as a public-facing website, online service, or mobile application that either:
(a) “primarily provides a forum for user-generated content,” including multimedia, or
(b) in the regular course of its business, publishes, hosts, or otherwise makes available nonconsensual intimate imagery.
A plain reading of prong (a) of this definition would appear to bring virtually every online message board, forum, and social media platform within the scope of the new law.
The statute requires that, within one year of the law’s enactment, every covered platform establish a process, clearly and conspicuously disclosed to users, through which individuals or their authorized representatives can request the removal of nonconsensual intimate imagery. Within just 48 hours of receiving a valid request, the platform must remove the imagery and use “reasonable efforts to identify and remove any known identical copies of such depiction.” As to the latter requirement, it is unclear whether such reasonable efforts might include removing copies that are known to the covered platform but outside its immediate control, e.g., imagery hosted by a third-party service that integrates with the platform that received the request.

The Take It Down Act does not create a private right of action. Instead, it authorizes the FTC to enforce the law’s notice-and-takedown provisions and to treat violations of these requirements as unfair or deceptive practices. The law does provide a safe harbor of sorts: platforms cannot be held liable for claims based on their good faith disabling of material claimed to be a nonconsensual intimate depiction, regardless of whether the imagery in question is ultimately found to be unlawful.
If you’d like to assess your business’s exposure and obligations under the Take It Down Act, please contact Benjamin Mishkin in Cozen O’Connor’s Privacy, Technology & Data Security team.