1 December 2025

The FTC And Utah Attorney General Focus On Online Safety: Takeaways For All Platforms

Perkins Coie LLP


The Federal Trade Commission and Utah Attorney General recently announced a settlement with Aylo Group Ltd. and its co-defendants that places extensive obligations on the defendants to prevent the upload of and remove child sexual abuse material (CSAM) and nonconsensual material (NCM) on their adult content websites.

The settlement resolved allegations that the defendants' platform failed to effectively police CSAM and NCM, which was an unfair and deceptive practice. The stipulated order requires the company to implement a comprehensive CSAM and NCM prevention program to be assessed by an independent third party, along with information security and privacy programs that also require independent third-party assessments.

All providers at risk of hosting CSAM and NCM should take note of this action by the FTC and the Utah Attorney General to police online safety through their unfair and deceptive acts and practices (UDAP) laws. Providers should review their own policies and procedures, since other state attorneys general may use their UDAP authority to address such content.

While the order is directed to the defendants in the case, it potentially has implications for all platforms that may host CSAM or NCM or that make representations about the presence of such content. This post focuses on the order's mandated CSAM and NCM prevention program. (For a broader update on recent FTC enforcement actions concerning children's privacy and online safety, see our post here.)

Allegations Regarding Deceptive Content Moderation

The FTC and Utah Attorney General alleged in their complaint that the platform made repeated misrepresentations to consumers, business partners, and the public regarding its efforts to prevent and remove illegal content, specifically its handling of CSAM and NCM. These include allegations that the defendants:

  • represented that all flagged content was reviewed and removed, when in fact the defendants reviewed content only once it had received 15 separate user flags;
  • represented that users uploading CSAM would be banned from all platforms, but the complaint alleged that many such users were not banned from all platforms and continued to upload illegal content;
  • represented that re-uploading of previously identified CSAM would be prevented through hashing or fingerprinting technology, but the complaint alleged that the technology was ineffective and re-uploads occurred;
  • represented that all content was reviewed by human moderators before publication, but the complaint alleged that moderation was cursory, with moderators reviewing only seconds of each video and policies limiting the removal of content regardless of its legality; and
  • repeatedly promised consumers that the defendants' policies, practices, and procedures worked together to ensure there would be no CSAM or NCM on their websites, but the complaint alleged the defendants' sites have contained and continue to contain such content.

Stipulated Order

The defendants entered into an order to resolve the allegations against them that obligates them to implement a comprehensive program to prevent the posting and proliferation of CSAM and NCM. This program includes the following obligations:

  • Written program documentation. The program must be documented in writing and provided to the defendants' board of directors or other governing body at least every three months.
  • Oversight. The defendants must designate a qualified employee, reporting to the CEO, to coordinate and be responsible for the program.
  • Training. Mandatory CSAM/NCM training for relevant employees upon hire and annually.
  • Risk assessments and mitigation. There must be at least annual assessments of internal and external risks that could result in the publication of CSAM or NCM and the design, implementation, maintenance, and documentation of safeguards for the identified internal and external risks.
  • Pre-publication detection. The defendants must utilize available tools and technologies to review content to determine whether it is actual or suspected CSAM or NCM before publication, such as via:
    • Fingerprint comparison. Comparing uploads to "fingerprints" of known or suspected CSAM or NCM (an illustrative sketch follows this list); and
    • Human moderation. To the extent the defendants use human moderators, the moderators must be fluent in the language of the content, must watch and listen to each file in its entirety (or review a complete transcript prepared by a fluent translator) before the content goes live, and must be prohibited from receiving compensation based solely on the amount or quantity of content reviewed.
  • NCMEC reporting within 72 hours. The defendants must implement policies, practices, procedures, and technical measures designed to ensure they report actual or suspected CSAM to the National Center for Missing and Exploited Children (NCMEC) within 72 hours after determining that the content is actual or suspected CSAM.
  • Content removal and takedown. The order contains a number of requirements concerning the removal and takedown of sexually explicit content, as well as user messages and comments that promote, solicit, or encourage CSAM or NCM, such as the following:
    • Content
      • The defendants must implement accessible methods for consumers, including individuals without an account, to report possible CSAM or NCM for review and removal;
      • immediately suspend content identified via URL and suspend within 72 hours all content identified through a search by title or other information reported by a verified requestor;
      • inform the requestor of the status of the removal request and notify the requestor if the content is republished;
      • link to content removal request forms on home pages, with a clear and conspicuous link to the platform's review process; and
      • fingerprint or mark content determined to be CSAM or NCM to prevent future republication.
    • Comments/Messages
      • Allow users to flag or report comments or messages that promote, encourage, or solicit the creation, publication, or dissemination of CSAM or NCM, or that encourage, promote, solicit, or engage in child abuse or nonconsensual sexual activities.
      • Review reported content within three days and satisfy related obligations concerning the removal of the content and banning the user.
  • Streamlined process for law enforcement requests. The defendants must implement a process by which any law enforcement agency can request removal of suspected CSAM or NCM, with requirements to suspend content, search for and suspend all instances of the same content, and readily inform law enforcement of the status of each request.
  • CSAM and NCM prevention transparency report. The platform must publish a report twice a year on the home page of each covered service that provides a detailed account of the platform's implementation and enforcement of policies, practices, procedures, and technical measures to prevent the publication and dissemination of CSAM and NCM. The report must include descriptions of the business units involved in prevention efforts, the tools and processes used for age and consent verification, moderation, and reporting, as well as any updates or changes to these measures. Additionally, the report must present detailed metrics—both narrative and numerical—on uploads, removals, reports, enforcement actions, and the results of prevention efforts, using charts and graphs where appropriate.
  • Independent third-party assessments. For 10 years, the CSAM and NCM Prevention Program must be assessed by a qualified, independent, objective third-party professional who is subject to approval by the FTC. This is akin to the mandated independent assessments of comprehensive privacy and data security programs that the FTC has required for many years via its consent orders.
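
Neither the order nor the complaint prescribes a specific detection technology, so the following is an illustration rather than a description of the defendants' systems. As a minimal sketch of the fingerprint-comparison step in the pre-publication detection requirement above, assuming a hypothetical blocklist of digests of previously identified content, the Python example below checks an upload against that blocklist before the content can go live. It uses exact SHA-256 hashing for simplicity; real deployments generally rely on perceptual hashing tools (such as hash-matching services coordinated with NCMEC) so that re-encoded or lightly edited copies still match, which exact hashing cannot do.

import hashlib
from pathlib import Path

# Hypothetical illustration only; the order does not prescribe a specific
# technology. A pre-publication check compares an upload's digest against a
# blocklist of digests of previously identified CSAM/NCM. Exact SHA-256
# hashing catches only byte-identical re-uploads; production systems use
# perceptual hashing so that altered copies still match.


def fingerprint_bytes(data: bytes) -> str:
    """Return the SHA-256 digest of raw content bytes."""
    return hashlib.sha256(data).hexdigest()


def fingerprint_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file without loading it all into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def review_before_publication(upload_digest: str, blocklist: set[str]) -> str:
    """Block known fingerprints; everything else still requires human review,
    since the order also mandates moderation before content goes live."""
    return "blocked" if upload_digest in blocklist else "needs_human_review"


if __name__ == "__main__":
    # Toy demonstration with in-memory stand-ins for previously removed files.
    blocklist = {fingerprint_bytes(b"previously identified file contents")}
    print(review_before_publication(fingerprint_bytes(b"previously identified file contents"), blocklist))
    print(review_before_publication(fingerprint_bytes(b"a new upload"), blocklist))

In practice, a match at this step would also trigger the order's fingerprinting, takedown, and NCMEC reporting obligations described above.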

Key Takeaways

The case demonstrates that the FTC and at least one state attorney general see their UDAP prohibitions as tools for policing how platforms address CSAM and NCM. Further, the FTC recently gained authority to enforce the NCM takedown provisions of the TAKE IT DOWN Act (see our post here). With a host of new potential regulators scrutinizing their practices, platforms should consider taking the following steps:

  • Assess representations. Platforms should review their terms and other external communications to confirm they accurately describe how the platform seeks to prevent and address CSAM and NCM, particularly as more global safety laws require disclosures of a service's practices and procedures.
  • At a minimum, anchor to the statutory baseline. While FTC orders bind only the parties, they often reflect practices that the FTC believes the broader industry should adopt to comply with Section 5 of the FTC Act. Here, however, 18 U.S.C. § 2258A, which governs platform obligations in connection with reporting CSAM, specifically provides that nothing in the statute imposes a duty on covered providers to monitor their users or affirmatively seek out CSAM on their services; providers must report CSAM only if and when it is detected.
  • Evaluate online safety practices. With regulators worldwide focused on online safety, evaluate how your practices measure up across regulatory regimes and develop a set of best practices for users across jurisdictions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
