ARTICLE
24 September 2025

Too Much Chatter? AGs Continue Criticism Of AI Chatbots

Kelley Drye & Warren LLP

Following a multistate AG letter to AI companies on protecting children, California Attorney General Rob Bonta and Delaware Attorney General Kathleen Jennings met with OpenAI and sent a separate letter to its board. The letter states that, because the company is a Delaware nonprofit headquartered in California, its recapitalization plan "is subject to review" to protect beneficiaries and the nonprofit mission. The letter reiterates concerns previously outlined in the multistate correspondence regarding inappropriate chatbot interactions with children and cites additional news reports linking similar interactions to recent suicides. The AGs invoke OpenAI's charitable mission, noting that "before we get to benefiting [humanity], we need to ensure that adequate safety measures are in place to not harm." They urge the company to "amplify safety" as the dialogue between their offices and the company continues.

Relatedly, AG Bonta soon after announced his support for California's Leading Ethical AI Development (LEAD) for Kids Act, AB 1064. The Act would prohibit companion chatbots from being made available to children unless the chatbot "is not foreseeably capable of," among other things:

  • encouraging certain harm to the child or other illegal activity,
  • providing therapy,
  • engaging in sexually explicit interactions, and
  • validating the child over factual accuracy or the child's safety.

The bill defines companion chatbots to include generative AI "that simulates a sustained humanlike relationship" by retaining user sessions and interactions to personalize and facilitate ongoing engagement, asking unsolicited, emotional questions, and sustaining personal, ongoing dialogue. It excludes customer service bots, research or technical support systems, and internal business systems.

The legislature's findings recite recent instances of teens dying by suicide after interacting with chatbots, stating that these harms "are not incidental but the direct result of design choices by companies that intentionally simulate social attachment and emotional intimacy," and that such chatbots "are designed to exploit children's psychological vulnerabilities." The legislature declares, "[a]llowing children to use companion chatbots that lack adequate safety protections constitutes a reckless social experiment on the most vulnerable users."

If the bill passes, businesses would be held to an actual knowledge standard until 2027; after that, they must have made a reasonable determination that a user is not a child before permitting use of a companion chatbot. Under the Act, the AG may seek penalties of up to $25,000 per violation. In addition, a parent may bring a civil action for damages and other relief.

California already has a law requiring disclosure of the use of chatbots generally, including in the customer service context. And other states have more specific AI chatbot laws requiring, among other things, increased transparency for customers when they are interacting with AI. Be on the lookout for our continued reporting on government enforcement in the AI space.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
