On September 11, 2025, the Federal Trade Commission announced that it had issued orders under Section 6(b) of the FTC Act to a group of companies operating consumer-facing AI chatbots that simulate companionship or emotional connection. The aim: to gather detailed information about how such chatbots are designed, deployed, and monitored — especially with regard to children and teenagers. (ftc.gov)
Here's what we know so far, and what businesses in this space (or those considering moving in) should take into account.
What the FTC Is Asking
The inquiry seeks materials and responses in several key areas:
- How the companies monetize user interactions.
- How inputs from users are processed, and how outputs (chatbot responses) are generated.
- The criteria used to design, develop, and approve the "characters" or personalities of the chatbots.
- What pre-deployment and post-deployment testing and monitoring are conducted for possible harms (emotional, psychological, safety, etc.), especially to minors.
- Measures in place to restrict or limit use by children and teens, or otherwise mitigate potential negative effects.
- How users — and parents — are informed about features, capabilities, intended audiences, risk disclosures, and data collection and handling practices.
- How terms of service, age restrictions, and community guidelines are enforced.
- Whether and how personal information from users' conversations is used or shared.
Regulatory Authority: Section 6(b)
The orders were issued under the FTC's Section 6(b) authority, which allows the Commission to conduct broad studies that are not tied to a specific law enforcement action. Section 6(b) empowers the agency to require companies to file reports and answer questions in writing, primarily for investigative research and policy purposes.
Points of Consideration for Businesses in the AI Companion Space
While the FTC inquiry doesn't (yet) impose new rules, it signals areas of regulatory focus. Businesses developing or operating AI chatbots that simulate companionship — particularly ones likely to be used by younger users — might want to consider the following proactively:
- Testing & Monitoring: Put in place rigorous pre-launch testing, and ongoing monitoring of interactions, for possible emotional or psychological risk, especially for minors.
- Age Restrictions / Access Controls: If minors are involved, think through how to enforce age restrictions, limit certain content, or gate features appropriately.
- Transparency & Disclosures: Ensure users (and parents/caregivers) are clearly informed about what the bot does, what data it collects, what its capabilities and limits are, and what risks might exist.
- Data Handling Policies: Be prepared to document precisely how inputs are collected, stored, and shared, and how user privacy is protected (or not).
- Terms of Service & Enforcement: Carefully craft terms, community guidelines, and usage rules, and have processes to enforce them in practice.
- Advertising & Monetization: Be ready to describe how monetization works; if features or design are driven by user engagement (or other commercially motivated metrics), regulators will want to see how those pressures are managed.
What to Watch Next
- The responses to the 6(b) orders could lead to new regulatory proposals or guidance from the FTC.
- Litigation or enforcement actions might follow if disclosures, data practices, or safety / age-related protections appear insufficient.
- Public scrutiny will likely increase — both from advocacy groups and from the media — particularly regarding minors, emotional impact, and privacy.
Conclusion
The FTC's inquiry underscores its growing focus on how AI companion chatbots are designed, marketed, and used — particularly by young people. Companies in this space should expect heightened scrutiny and be prepared to show how they manage risks, protect users, and communicate clearly about their products.
This alert provides general coverage of its subject area. We provide it with the understanding that Frankfurt Kurnit Klein & Selz is not engaged herein in rendering legal advice, and shall not be liable for any damages resulting from any error, inaccuracy, or omission. Our attorneys practice law only in jurisdictions in which they are properly authorized to do so. We do not seek to represent clients in other jurisdictions.