The Federal Trade Commission (FTC) recently announced its first privacy enforcement actions of the second Trump administration with a flurry of activity in September.
- Three cases allege violations of the Children's Online Privacy Protection Act (COPPA): one against Disney Worldwide Services Inc. and Disney Entertainment Operations LLC (collectively, Disney); one against robot toy maker Apitor Technology Co., Ltd.; and one against the operators of the Sendit app, Iconic Hearts Holdings, Inc., and its CEO, Hunter Rice.
- A fourth case alleges violations of Section 5 of the FTC Act related to allegedly unreasonable content moderation by Aylo Group Ltd., the operator of Pornhub and other adult websites.
The FTC also announced an industry study of the impact of artificial intelligence (AI) chatbots on teens and children, using the FTC's authority under Section 6(b) of the FTC Act.
Below we summarize these actions and what they signal about the FTC's priorities and approach to privacy enforcement.
Disney
The FTC announced a complaint against and proposed stipulated order with Disney on September 2, 2025, regarding alleged COPPA violations arising from Disney's designation of videos on its YouTube channels. YouTube channel operators must designate uploaded videos, at either the channel or video level, as either "Not Made for Kids" or "Made for Kids." YouTube disables targeted advertising, as well as certain features such as auto-play and comments, for Made for Kids content.
According to the complaint, Disney chose to make designations at the channel level, rather than for each video. The complaint alleges that Disney published child-directed videos to some channels it designated as Not Made for Kids. Further, because Disney had enabled targeted advertising to run on Not Made for Kids videos, children's personal information allegedly was collected without the verifiable parental consent required by COPPA.
Under the order, Disney has agreed to pay a $10 million civil penalty, provide notice to parents and obtain verifiable parental consent consistent with COPPA, and implement an "Audience Designation Program" for 10 years to ensure that each video on its YouTube channels is appropriately labeled as Made for Kids or Not Made for Kids. The Audience Designation Program must be in writing, overseen by a qualified employee, and have its effectiveness assessed in writing at least annually. The program would no longer be required, however, if YouTube (1) removes the ability to designate videos as Made for Kids or (2) implements age-assurance technology to detect the appropriate age categorization for videos and either restricts the collection of personal information from children consistent with COPPA or enables Disney to restrict the collection of such information consistent with COPPA.
Apitor
On September 3, 2025, the FTC announced a complaint against and proposed stipulated order with China-based robot toy maker Apitor Technology Co. over alleged COPPA violations related to its collection of children's precise geolocation data. Apitor sells programmable robot toys designed for children ages 6 to 14, and users must download and use Apitor's mobile app to control the toy's movement. According to the complaint, Apitor required users of Android devices to enable location permissions on their device in order to use the app, resulting in children's personal information being collected without verifiable parental consent. The complaint further alleges that Apitor integrated a software development kit into its app through which a third party—a Chinese mobile developer and analytics provider—collected geolocation data, and that the agreements between Apitor and this third party gave the third party broad latitude to use the collected data, including for advertising and further dissemination.
The complaint alleges that Apitor violated COPPA by failing to provide direct notice to parents or notice in its privacy policy, or to obtain verifiable parental consent before collecting this geolocation information.
To resolve the allegations, Apitor agreed to a $500,000 civil penalty, suspended based on the company's inability to pay, and to injunctive provisions requiring it to provide notice to parents and obtain verifiable consent, retain children's personal information only as long as reasonably necessary to satisfy the purpose for which it was collected, and delete children's personal information at the request of a parent. Further, the order requires Apitor to delete all personal information associated with Android users or their parents unless it has provided direct notice and obtained verifiable parental consent. (For a more detailed discussion of Apitor, see our dedicated post on the case here.)
Iconic Hearts (Sendit App)
On September 29, 2025, the FTC announced a complaint against Iconic Hearts Holdings, Inc., and its CEO, alleging violations of COPPA, the Restore Online Shoppers' Confidence Act (ROSCA), and Section 5 of the FTC Act. The defendants operate Sendit, an anonymous messaging app designed for use on social messaging platforms. The app allows users to solicit and receive anonymous messages from their social media contacts, e.g., by posting "ask me anything" links to which contacts can anonymously respond. According to the complaint, the app collected users' birthdates (giving the defendants actual knowledge that many users were under 13), yet the defendants collected personal information from such users without providing notice to parents or obtaining verifiable parental consent.
The alleged ROSCA violations relate to Sendit's Diamond Membership program, which automatically renewed each week unless users canceled their subscription. According to the complaint, Sendit violated ROSCA by failing to clearly and conspicuously disclose that Diamond Membership was a recurring paid subscription. Instead, Sendit allegedly used small type size, noncontrasting colors, and nonprominent placement to disclose the purchase price and to state that the charge would recur automatically. The FTC alleged that these practices tricked many consumers into purchasing subscriptions.
In addition, the FTC alleges that the defendants engaged in deceptive practices by sending users fake, often provocative anonymous messages to drive engagement and Diamond Membership subscriptions, as well as by falsely representing that Diamond Members would learn who sent them these anonymous messages and obtain hints about the senders' identities. The FTC alleges that, in actuality, the defendants sent fabricated or useless information to Diamond Membership subscribers regarding the identity of senders, requiring them to pay additional money on top of their subscription to actually identify message senders. Further, according to the FTC, the use of fake, often provocative or sexual anonymous messages sent to children and teens to induce them to purchase subscriptions to reveal the sender's identity was an unfair practice. The FTC seeks civil penalties and a permanent injunction.
Aylo Group Ltd. (Pornhub)
On September 3, 2025, the FTC and the Utah Division of Consumer Protection jointly announced a complaint against and proposed stipulated order with Aylo Group Ltd. and its co-defendants to resolve allegations that they failed to block and remove videos and photos featuring child sexual abuse material (CSAM) and nonconsensual material (NCM) from the adult content websites they operated, including Pornhub.com.
The complaint alleges that the defendants promoted publication of CSAM and NCM, including by (1) soliciting and publishing content from creators that often featured CSAM or NCM; (2) allowing users to submit their own content without age, identity, or consent verification requirements; (3) encouraging users to apply tags to their content suggesting that the content contained CSAM or NCM; and (4) creating playlists of videos with tags signaling that they contained CSAM or NCM.
The complaint further alleges that the defendants knew about and failed to take appropriate action to address CSAM and NCM on their websites, including when audits revealed their prevalence. For example, the defendants allegedly failed to audit their websites for CSAM and NCM until credit card companies threatened to terminate their relationships. According to the complaint, the audit resulted in suspensions of 35 channels, but the defendants restored all but five when the threat from the credit card companies passed. As another example, defendants' subscription website team allegedly repeatedly overrode attempts by their compliance team to remove videos featuring CSAM and NCM.
The complaint alleges that the defendants' distribution of CSAM and NCM was an unfair practice violating Section 5 of the FTC Act, with injury both to the victims in the uploaded content (who are revictimized by its distribution) and to viewers (who do not want to view or possess CSAM or NCM). The complaint also alleges several deception claims, such as the defendants falsely promising to ban users who uploaded CSAM, prevent re-uploading of CSAM, review content for CSAM or NCM before it went live, and quickly review live content if flagged for CSAM or NCM. The complaint further alleges that, after the defendants implemented a process to verify the identity of content creators who uploaded content, they deceptively failed to disclose that they collected and retained personal information from their identity verification vendor and misrepresented that user information would be protected when, in fact, employees emailed unencrypted copies of sensitive personal information and retained plain text copies in their email accounts for years.
As part of the proposed settlement, the defendants agreed to pay a $15 million civil penalty to Utah, with $10 million suspended based on the defendants' compliance with the other provisions of the order. The order also requires the defendants to establish and implement a comprehensive program to prevent the posting and dissemination of CSAM and NCM. The program requirements include:
- The documentation of the program and appointment of a qualified employee reporting to the CEO to oversee the program;
- The annual assessment of risks that could lead to the publication or dissemination of CSAM or NCM and the implementation of appropriate controls, including:
- Specific pre-publication age and identity verification requirements designed to ensure that the individuals depicted are at least 18 years old;
- Notice and consent checkboxes during upload of sexually explicit content to explain that the defendants will review content before publication and may report it to the National Center for Missing and Exploited Children or law enforcement, and that the uploader is waiving any privacy rights in such content; and
- Biennial third-party assessments for a 10-year period regarding the defendants' compliance with the program requirements.
The order also requires the defendants to establish and maintain a privacy and information security program, for which they must also undergo biennial third-party assessments over a 10-year period.
AI Chatbot 6(b) Orders
Section 6(b) of the FTC Act enables the FTC to conduct industry studies outside of its law enforcement function. On September 11, 2025, the FTC announced it was issuing Section 6(b) orders seeking information from seven companies about the impact of AI chatbot companions on children and teens. The orders seek a wide variety of information about subjects such as how businesses that offer AI companions monetize user engagement, process user inputs, test and monitor for negative impacts pre- and post-deployment, implement data collection and handling, and monitor and enforce rules, terms, and policies.
Key Takeaways
These matters highlight that the privacy and online safety of children and teens are among the highest consumer protection priorities of the FTC under the leadership of Chairman Andrew Ferguson, as well as the following points:
- Support for age assurance technologies. Chairman Ferguson signaled his strong support for age assurance technology by characterizing its potential emergence—which would lift the Audience Designation Program requirements in the Disney order—as "the future of protecting kids online." Similarly, the proposed Pornhub settlement would require Pornhub to verify, directly or through documentation, that content creators are over age 18 before their content can be published. While Chairman Ferguson has expressed support for such technology before, it may play a more prominent role in FTC enforcement following the decision by the Supreme Court of the United States in Free Speech Coalition v. Paxton.
- Embracing the FTC's role in policing CSAM and NCM beyond the TAKE IT DOWN Act. As we have previously covered (see here), the TAKE IT DOWN Act enacted earlier this year vested the FTC with enforcement powers (and civil penalty authority) relating to NCM. The act prohibits the intentional publication, or threat of publication, of "nonconsensual intimate imagery," whether real or computer-generated, and requires covered platform providers to (1) remove images within 48 hours of a request and (2) make reasonable efforts to find and remove all copies of reported images. With the Pornhub case, clearly the fruit of an investigation initiated long before the enactment of the TAKE IT DOWN Act, the FTC is embracing the role of policing CSAM and NCM on the internet beyond the enforcement of the removal provisions of the TAKE IT DOWN Act by using its Section 5 authority to require platforms hosting such content to take affirmative steps to prevent the posting and dissemination of such content in the first place.
- Ongoing use of unfairness authority. Under prior FTC Chair Lina Khan, the FTC alleged novel unfairness theories and expanded its use of its unfairness authority in the privacy arena. Recent public statements by FTC speakers emphasize that they do not intend to use Section 5 as a vehicle to effectively legislate new categories of privacy violations. Nonetheless, the unfairness claims in the Pornhub and Sendit cases indicate that such theories have not disappeared from the privacy and broader consumer protection arenas.
- China data-transfer concerns. The Apitor case may signal that the FTC is focusing enforcement efforts on Chinese companies and/or businesses that share data with them. The FTC highlighted both that Apitor is based in China and that, as Bureau of Consumer Protection Director Christopher Mufarrige said, "Apitor allowed a Chinese third party to collect sensitive data from children using its product, in violation of COPPA." This concern may surface as well in connection with the FTC's exercise of its new enforcement authority under the Protecting Americans' Data from Foreign Adversaries Act.
- A shift toward shorter time frames for order compliance obligations. The FTC appears to be heeding the call to shorten the length of at least some provisions of its consent orders, including its comprehensive privacy and information security program requirements. The privacy and information security and CSAM/NCM programs in the Pornhub order, and the associated biennial assessment requirements, last 10 years—half of the typical 20-year period for comprehensive programs in earlier FTC orders.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.