On May 19, 2025, President Trump signed into law the bipartisan “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” (the “Take It Down Act” or the “Act”).
- In general, the Act prohibits any person from “knowingly publishing,” without consent, intimate visual depictions of minors or non-consenting adults, or any deepfakes, whether intimate depictions or not, that are intended to cause harm.
- Significantly, the Act also requires covered platforms (including public websites and mobile applications) to establish a notice and takedown process through which users can report such depictions, and to remove them in response to such a report.
- The Take It Down Act is the first federal law that limits the use of AI in ways that can be harmful to individuals.
Summary of the Act
Prohibitions
Authentic intimate images. The Take It Down Act prohibits any person from using an interactive computer service to knowingly publish authentic (i.e., non-AI) intimate visual depictions of an identifiable adult if the depiction:
- was obtained or created in a situation where the subject had a reasonable expectation of privacy;
- was not voluntarily exposed by the subject in a public or commercial setting;
- was intended to cause harm or causes harm; and
- is not a matter of public concern.
Note that the reasonable expectation of privacy element means that, even if the subject individual consented to the creation of the image in question, that consent does not necessarily extend to the image being published.
Deepfakes. In the case of deepfakes (referred to in the Act as “digital forgeries”), the Act prohibits any person from using an interactive computer service to knowingly publish a deepfake of an adult without the consent of the subject where (i) what is depicted was not voluntarily exposed by the subject in a public or commercial setting; (ii) publication was intended to cause harm or causes harm; and (iii) what is depicted is not a matter of public concern.
Images of minors. The Act imposes stricter prohibitions on intimate visual depictions and deepfakes involving minors (defined as those under 18). A person may not use an interactive computer service to knowingly publish intimate visual depictions or any deepfake of an identifiable minor if there is intent to (i) abuse, humiliate, harass or degrade the minor or (ii) arouse or gratify the sexual desire of any person. Note that the elements required for claims involving those over 18 (e.g., the reasonable expectation of privacy standard) do not apply when minors are involved.
Threats. The Act further prohibits threats to publish authentic intimate visual depictions or deepfakes. Any person who intentionally threatens to share such images for the purpose of intimidation, coercion, extortion, or to create mental distress is in violation of the Act.
Obligations of Platforms
The Take It Down Act requires that, by May 19, 2026, “covered platforms” comply with certain notice and takedown obligations with respect to intimate visual depictions and deepfakes.
“Covered platforms” is defined to include public websites, online services and applications, and mobile applications that either (i) primarily provide a forum for user-generated content or (ii) are primarily designed to publish nonconsensual intimate visual depictions. Covered platforms do not include broadband Internet access providers, email services, or online services or websites with primarily preselected content where the content is not user-generated but curated by the provider.
Covered platforms must establish a clear and accessible process that allows individuals to notify the platform of an intimate visual depiction or deepfake and request its removal. The process set forth in the Act requires the following (an illustrative sketch follows the list):
- An electronic signature of the individual depicted as well as a brief statement demonstrating the individual has a good faith belief that the depiction was not consensual and providing information sufficient for the platform to locate the depiction and contact the individual.
- The platform has 48 hours after notice to investigate the request and remove the material.
- Platforms must make reasonable efforts to remove duplicates or reposts.
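For platforms building these mechanics into their systems, the required elements of a notice translate naturally into a simple intake record and a deadline check. The sketch below is a minimal, hypothetical illustration only; the field names, the receipt timestamp, and the helper functions are our assumptions rather than requirements spelled out in the Act, and the 48-hour window is measured here from receipt of a complete notice.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of a takedown-notice record reflecting the elements the
# Act requires a valid request to contain. Field names are illustrative only.
@dataclass
class TakedownNotice:
    electronic_signature: str   # electronic signature of the depicted individual
    good_faith_statement: str   # brief statement of good faith belief of non-consent
    content_locator: str        # information sufficient to locate the depiction (e.g., a URL)
    contact_info: str           # information sufficient to contact the requester
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_complete(self) -> bool:
        """A notice missing any required element should be treated as incomplete."""
        return all([self.electronic_signature, self.good_faith_statement,
                    self.content_locator, self.contact_info])

    def removal_deadline(self) -> datetime:
        """The Act gives the platform 48 hours after notice to remove the material."""
        return self.received_at + timedelta(hours=48)


def hours_remaining(notice: TakedownNotice, now: datetime | None = None) -> float:
    """Convenience check (e.g., for an ops dashboard): time left before the 48-hour deadline."""
    now = now or datetime.now(timezone.utc)
    return (notice.removal_deadline() - now).total_seconds() / 3600
```

In practice, the same intake record would also feed the duplicate- and repost-detection workflow noted in the final bullet above.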
Failure to comply with these obligations is considered a violation of the Federal Trade Commission Act (FTCA).
The Act limits liability for covered platforms that remove intimate visual depictions that should not have been taken down so long as the covered platform was acting in good faith. To avail themselves of the safe harbor, covered platforms should implement internal processes that document good faith compliance efforts, including the documentation of all takedown actions.
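One way to document good faith compliance is to keep an append-only record of every takedown decision and its rationale. The snippet below is a minimal, hypothetical sketch of such a record; the log format, file location, and field names are our assumptions and are not prescribed by the Act.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical append-only audit log of takedown actions; the JSON Lines
# format and file path are illustrative assumptions, not statutory requirements.
AUDIT_LOG = Path("takedown_audit.jsonl")

def record_takedown_action(notice_id: str, content_locator: str,
                           action: str, reviewer: str, rationale: str) -> None:
    """Append one takedown decision (removal, rejection, or restoration) with its rationale."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "notice_id": notice_id,
        "content_locator": content_locator,
        "action": action,        # e.g., "removed", "rejected", "restored"
        "reviewer": reviewer,
        "rationale": rationale,  # documents the good-faith basis for the decision
    }
    with AUDIT_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```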
Exceptions
The Act includes various exceptions for certain legitimate disclosures of intimate visual depictions, including disclosures made in connection with a law enforcement or intelligence agency investigation, disclosures made in good faith for a legal proceeding, medical treatment or education, and disclosures made to report unlawful conduct.
Penalties
Any person who (i) shares deepfakes or (ii) shares or threatens to share authentic intimate visual depictions of an adult is subject to fines (which under the FTCA could include civil fines, injunctive relief, and consumer redress) and up to two years of imprisonment. Any person who intentionally threatens to share deepfakes is subject to fines and up to 18 months of imprisonment.
Any person who (i) shares deepfakes or (ii) shares or threatens to share authentic intimate visual depictions of a minor is subject to fines and up to three years of imprisonment. Any person who intentionally threatens to share deepfakes of a minor is subject to fines and up to 30 months of imprisonment.
How the Act Affects State Laws
Nearly all states already have laws in place protecting individuals from nonconsensual intimate imagery, and 30 states have laws directly addressing deepfake nonconsensual intimate imagery. However, criminal classifications and prosecution practices vary among states. The Take It Down Act does not preempt state law, thus providing victims with more than one avenue of protection. Meanwhile, Congress is currently debating a 10-year moratorium on state AI regulation that would prohibit states and localities from enforcing laws that broadly regulate AI. Whether the moratorium will become law, or whether it would sweep in laws that overlap with the Take It Down Act, remains to be seen.
Key Points for Covered Platforms
Although covered platforms have a year to put in place the notice and takedown procedures that the Act requires, these entities are advised to start building out these processes sooner rather than later. This should involve not only implementing the mechanics of complying with the Act, including the record-keeping needed to take advantage of the safe harbor, but also gaming out different takedown scenarios that might arise and how the platform will respond. Companies that already have procedures in place to comply with the notice and takedown process under the Digital Millennium Copyright Act (DMCA) may be able to adapt some of those processes for the Take It Down Act.
We note that the Act does not specify a statute of limitations. Covered platforms will likely need to treat all requests to remove nonconsensual intimate images as subject to the takedown requirements set forth in the Act, regardless of when the content was first uploaded.
Summer associate Brooke Ridgway contributed to this article.
This memorandum is provided by Skadden, Arps, Slate, Meagher & Flom LLP and its affiliates for educational and informational purposes only and is not intended and should not be construed as legal advice. This memorandum is considered advertising under applicable state laws.