ARTICLE
9 April 2026

Washington State Passes AI Companion Chatbot Law

Sheppard, Mullin, Richter & Hampton LLP


Washington state has joined a handful of other states with laws that require disclosure to consumers if they are interacting with a chatbot. Other, broader, chatbot laws exist in California, Maine, New Jersey and Utah. Unlike the new Washington law, though, they are not aimed at AI companions. Washington's law will go into effect on January 1, 2027, and has provisions aimed at both adults and children.

The law applies to those who provide AI companions, rather than chatbots more generally. An AI companion chatbot is defined to include an AI system that creates a "sustained human-like relationship with a user." It does not include systems that are used only for customer service, productivity, education, in-game chats, and the like. Under the law, companies will need to disclose to users that they are interacting with AI, not a human. That notice has to appear both at the beginning of the interaction and every three hours during "continued interaction." Those who provide these platforms must also have protocols for detecting potential suicidal ideation or expressions of potential self-harm.

Additional requirements exist if the user is a minor. These include avoiding "manipulative engagement techniques," examples of which are prompting the child to come back to the AI companion to get more support or giving the child excessive praise. Violations of the law will be considered unfair practices under the state's consumer protection law, which provides for a private right of action.

Putting It Into Practice: While common uses of chatbots (like customer service or in-game chat) are excluded from the law, it shows a growing regulatory concern with customer-facing AI tools that mimic human conversation yet operate with little or no human intervention or oversight. Companies may want to take this as a reason to inventory their current AI chatbots and identify which may be serving as companions, or might do so in the future.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

