19 September 2025

California's Chatbot Bill May Impose Substantial Compliance Burdens On Many Companies Deploying AI Assistants

Crowell & Moring LLP

California Governor Gavin Newsom has until October 12, 2025, to sign into law a first-in-the-nation bill that will, if enacted, likely impose significant regulatory obligations and litigation risk on companies deploying AI chatbots in California.

What You Need to Know

Key takeaway #1

If SB 243 is signed into law, liability risk arising from the use of chatbots increases.

Key takeaway #2

Careful review and proactive compliance planning are essential.


Last week, the California Assembly and California State Senate adopted Senate Bill (SB) 243, which aims to regulate "companion chatbots" by targeting AI systems that engage users in ongoing, human-like social interactions. While its authors intend for the law to address risks associated with emotionally engaging chatbots targeting children, the bill's definition of "companion chatbot" may cover more ground—potentially capturing website chatbots and virtual assistants that serve a variety of seemingly innocuous purposes.

Key Issue: The Definition of a "Companion Chatbot"

SB 243 defines "companion chatbot" as an AI system with "a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions."

The law expressly excludes bots "used only for customer service, a business' operational purposes, productivity and analysis related to source information, internal research, or technical assistance." (emphasis added)

But this language leaves room for interpretation. If a chatbot's use extends beyond these exclusions—such as by engaging users in ongoing dialogue, offering personalized recommendations, or supporting social or emotional needs—it could fall within the law's scope. Put another way, many bots may be used primarily for an excepted purpose, such as customer service, but also operate in a manner "capable of meeting a user's social needs," thus falling within the ambit of the bill.

Potential Examples of Covered Chatbots

  • Website Chatbots with Persistent Profiles: Many companies deploy chatbots that remember previous user interactions, offer personalized greetings, and provide tailored recommendations. If these bots maintain ongoing relationships or appear "friendly," they could be considered companion chatbots.
  • Customer Engagement Bots Offering Emotional Support: Some brands use chatbots to check in on users, offer wellness tips, or provide encouragement. These bots may go beyond customer service and touch on users' social or emotional needs.
  • Virtual Shopping Assistants: E-commerce sites increasingly use AI assistants that help users navigate complex choices, remember preferences, and engage in multi-session dialogues. If the assistant's interaction feels anthropomorphic or relationship-building, the AI assistant may be covered.
  • Financial Wellness Bots: Banks and fintech firms sometimes offer bots that help users manage stress, set goals, and provide ongoing motivational feedback. If these chatbots sustain relationships and meet social needs, they may be covered.
  • Education Platforms with "Study Buddy" Chatbots: Some online learning platforms use chatbots that support students emotionally, encourage persistence, and maintain ongoing dialogue. These bots could qualify as companion chatbots.

Why This Matters: Potentially Significant Litigation Risk and Liability Exposure

SB 243 requires operators of companion chatbots to comply with disclosure, notice, and regulatory reporting obligations. And in some cases, companion chatbot operators must implement protocols to limit certain types of dangerous conversations with the chatbot, such as those involving suicidal ideation or self-harm.

Critically, SB 243 also allows private lawsuits against operators for violations, with damages set at the greater of actual damages or $1,000 per violation, plus attorney's fees and costs. This means individual consumers can sue operators and developers, and statutory damages can aggregate quickly across users and interactions.

Moreover, such lawsuits could invite the risk of not just consumer class actions, but also state attorney general and other executive office enforcement actions. This is especially so given that state attorneys general have expressed concerns about the use of chatbots, and the Federal Trade Commission (FTC) has launched an inquiry into AI chatbots acting as companions.

This law could create substantial risk for companies whose chatbots might fall within the definition of "companion chatbots," especially if the chatbot's functions are not strictly limited to customer service, operational, or technical support.

Plaintiffs may argue that any chatbot exhibiting anthropomorphic features or sustaining a relationship across multiple interactions meets the definition, exposing businesses to potentially costly litigation and compliance burdens.

Recommended Actions

If SB 243 becomes law, companies operating chatbots in California should:

  • Review all current and planned chatbot deployments for features that could be interpreted as "companion" functions.
  • Contact Crowell to assess exposure and consider limiting chatbot functionality or clearly documenting its primary purpose as customer service or technical support.
  • Monitor regulatory developments and prepare for compliance with notification, reporting, and audit requirements if covered.

Bottom Line: The broad definition of "companion chatbot" in SB 243 means that many websites and chatbots could be swept into the law's scope, exposing companies to private actions and significant damages. Careful review and proactive compliance planning are essential.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
