ARTICLE
4 July 2025

The TAKE IT DOWN Act Targets AI-Generated And Authentic Nonconsensual Intimate Images

Nelson Mullins Riley & Scarborough LLP



The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, the TAKE IT DOWN Act, a federal bipartisan effort to require covered platforms to remove nonconsensual intimate images, was signed into law by President Trump on May 19, 2025. The TAKE IT DOWN Act requires covered platforms to establish takedown procedures within one year; noncompliance is subject to Federal Trade Commission (FTC) enforcement, and the Act separately imposes criminal penalties for publishing, or threatening to publish, nonconsensual intimate images.

The Act is effective as of May 19, 2025. Covered platforms have until May 19, 2026 to implement a notice-and-removal process for nonconsensual intimate visual depictions, as required by the Act.

Scope and Applicability

The TAKE IT DOWN Act (the "Act") applies to individuals and "covered platforms". A "covered platform" is defined broadly as a "website, online service, online application or mobile application" that "serves the public" and "primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files." Covered platforms also include any online service that, in the regular course of business, publishes, curates, hosts, or otherwise makes available "nonconsensual intimate visual depictions" (alternatively referred to as nonconsensual intimate images or "NCII"). NCII is intended to cover both authentic and digitally forged (e.g., AI-generated) content.

Certain entities are excluded from the Act's provisions, including broadband internet and email providers, and other online services, websites, or applications that primarily host content preselected by the operator rather than user-generated content. The exclusion does not apply to platforms that make NCII available in their ordinary course of business.

Additional exclusions are provided for necessary activities that might otherwise be considered violations of the Act, such as disclosures pursuant to law enforcement or intelligence agency investigations. Good faith disclosures of NCII are also permitted in specific contexts, such as disclosures to law enforcement, in legal proceedings or document production, in connection with medical education, diagnosis, or treatment, for the reporting of unlawful conduct, or when seeking support after receiving unsolicited intimate visual content.

Notably, the Act provides a safe harbor for covered platforms' good faith efforts to remove content even if the content is later determined not to be NCII in violation of the Act.

Key Provisions: Takedown Procedures

Covered platforms are required to implement a notice-and-removal process for reporting and taking down NCII. This process must allow individuals to notify the platform and request the removal of intimate visual depictions of themselves that were published without their consent.

A valid notice-and-removal request must include the following, in writing: a physical or electronic signature of the affected individual (or an authorized representative), identification of the specific intimate visual depiction, information sufficient to locate the depiction on the platform, a brief statement asserting a good-faith belief that the depiction was distributed without consent, and sufficient contact information for the individual. Covered platforms are required to provide clear and conspicuous notice of the notice-and-removal process directly on their platform.
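For platforms standing up intake tooling, the statutory elements of a valid request map naturally onto a simple data model. The following is a minimal, illustrative sketch in Python using hypothetical names (RemovalRequest, is_facially_complete) that are not drawn from the Act or any particular platform's implementation; it checks only that each required element is present, not that a notice is legally sufficient.

```python
from dataclasses import dataclass


@dataclass
class RemovalRequest:
    """Hypothetical model of the elements the Act requires in a valid notice."""

    signature: str             # physical or electronic signature of the individual or an authorized representative
    depiction_id: str          # identification of the specific intimate visual depiction
    location_info: str         # information sufficient to locate the depiction on the platform (e.g., a URL)
    good_faith_statement: str  # brief statement of a good-faith belief that the depiction was distributed without consent
    contact_info: str          # sufficient contact information for the requester


def is_facially_complete(request: RemovalRequest) -> bool:
    """Return True only if every statutorily required element is present and non-empty."""
    required = (
        request.signature,
        request.depiction_id,
        request.location_info,
        request.good_faith_statement,
        request.contact_info,
    )
    return all(field.strip() for field in required)
```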

Upon receiving a valid removal request from an identifiable individual or an authorized representative, a covered platform must remove the reported NCII within 48 hours and make reasonable efforts to remove all known identical copies of the content.
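As a rough illustration of the 48-hour obligation, an intake workflow might record when a valid request is received and track the corresponding removal deadline. The sketch below uses the same hypothetical Python conventions as above; it does not address the separate requirement to make reasonable efforts to remove known identical copies, which in practice typically involves some form of content matching.

```python
from datetime import datetime, timedelta, timezone

# Statutory removal window after receipt of a valid request (48 hours under the Act).
REMOVAL_WINDOW = timedelta(hours=48)


def removal_deadline(received_at: datetime) -> datetime:
    """Return the time by which the reported depiction must be removed.

    Assumes received_at is a timezone-aware datetime recorded at intake.
    """
    return received_at + REMOVAL_WINDOW


def is_overdue(received_at: datetime, now: datetime | None = None) -> bool:
    """Return True if the 48-hour window for a valid request has elapsed."""
    now = now or datetime.now(timezone.utc)
    return now > removal_deadline(received_at)
```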

Enforcement

The Act criminalizes the publication of, and the threat to publish, both authentic and digitally forged NCII on interactive computer services. Individuals who distribute authentic or digitally forged NCII involving adults are subject to fines, imprisonment for up to two years, or both. If the depicted individual is a minor, the maximum term of imprisonment increases to three years.

Threatening to distribute authentic NCII with the intent to intimidate, coerce, extort, or otherwise cause distress carries the same penalties. Individuals who threaten to distribute digitally forged NCII face fines and imprisonment of up to 18 months for offenses involving adults and up to 30 months for offenses involving minors.

The FTC is responsible for enforcing the Act's notice-and-removal requirements as they apply to covered platforms. A covered platform that fails to "reasonably comply with" these obligations will be deemed in violation of the FTC Act's prohibitions on unfair or deceptive acts or practices (UDAP) under Section 18(a)(1)(B) of the FTC Act. Notably, the TAKE IT DOWN Act extends the FTC's enforcement authority to include nonprofit organizations, which are ordinarily outside the scope of the FTC Act.

Next Steps/Conclusion

Legal challenges to the Act's provisions are expected to implicate the First Amendment as well as Section 230 of the Communications Act of 1934 (47 U.S.C. § 230). The Act regulates speech in the form of visual depictions on the basis of the content of those depictions. Although free-speech challenges to nonconsensual pornography laws have typically failed, the Act's broad notice-and-removal requirements present a different analysis for courts and could yield a different outcome. Challengers are also likely to argue that Section 230 immunizes them from certain penalties, such as the prescribed UDAP penalties, and it remains unclear whether, or to what extent, Section 230 immunity survives the Act.

Though there is broad support among lawmakers, civil society organizations, NCII victims, and the public for federal legislation addressing NCII, disagreement remains over whether the TAKE IT DOWN Act effectively targets the problem. The Act contains potentially problematic provisions and loopholes, most of which stem from how NCII is defined. The term is arguably too narrow (its terminology borrows definitions from existing federal legislation addressing authentic images and may inadvertently fail to reach some AI-generated content) and, at the same time, too broad (the takedown provisions may sweep in content that is not "NCII" as otherwise defined by criminal law, opening the door to frivolous requests and the removal of lawful content without sufficient protections in place). Additionally, NCII is addressed in various state laws related to privacy, deepfake reporting, and content moderation, creating overlapping legal regimes with different requirements and definitions.

In general, covered platforms should take proactive steps to implement and test a conspicuous notice-and-removal process that allows users to report nonconsensual intimate visual depictions and request their removal prior to the May 19, 2026, deadline. The Act's passage, however, is likely only the beginning of a process of legal challenges that will shape how the Act is interpreted and how its provisions apply to covered platforms.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
