ARTICLE
14 May 2025

(A)Identify Yourself: State Bills Would Require Notification When Interacting With AI

Mayer Brown


As the communication capabilities of artificial intelligence ("AI")-powered chatbots and automated voice assistants ("callbots") improve, it is becoming increasingly difficult to tell human from machine. At the same time, businesses are rapidly adopting chatbots and callbots for consumer communications. Consumers, therefore, are more likely than ever to find themselves chatting with bots, sometimes unknowingly.

In an effort to reduce the number of consumers sharing information, including potentially sensitive financial or other information, with AI-powered systems when they believe they are interacting with human agents, several states have introduced legislation this year that would require entities to disclose when consumers interact with these chatbots and/or callbots in commercial transactions. Specifically, Alabama, Hawaii, Illinois, Maine, and Massachusetts all introduced bills in 2025 that would make failing to provide the required notification an Unfair or Deceptive Acts or Practices (UDAP) violation. These bills would subject companies using chatbots and callbots in their consumer communication strategies to the risk of investigation or enforcement by Attorneys General or similar government officers and, in some cases, to the risk of private actions.

Below is an overview of each state's bill, presented not as a comprehensive survey of the current regulatory landscape for AI-powered communications, but as a signal of current and upcoming trends in the increasing regulation of this aspect of consumer transactions. While the exact parameters of each bill differ, the driving principle is consistent: if a reasonable consumer would be misled to believe they are communicating with a human, they must be notified that they are communicating with AI.

Alabama

Under Alabama House Bill 516, it would be a deceptive act or practice to engage in a commercial transaction or trade practice with a consumer through a chatbot, AI agent, or other technology that engages in a textual or aural conversation that may mislead or deceive a reasonable person to believe they are communicating with a human, and either (i) the consumer is not notified in a clear and conspicuous manner that the consumer is communicating with a non-human, or (ii) the consumer may otherwise reasonably believe they are communicating with a human.

Hawaii

Under Hawaii House Bill 639, it would be an unfair or deceptive act or practice in a commercial transaction or trade practice to use an AI chatbot or other technology capable of mimicking human behavior, and that engages in textual or spoken conversation with a consumer that may mislead or deceive a reasonable person to believe they are engaging with a human, without first disclosing to the consumer in a clear and conspicuous fashion that the consumer is interacting with a chatbot. However, small businesses—which are presently undefined—that unknowingly utilize AI chatbots in their operations will not be deemed to be engaged in an unfair or deceptive act or practice, unless the small business has been provided clear and adequate notice of the requirements under the bill and fails to comply after being afforded a reasonable opportunity to do so.

Illinois

Under Illinois House Bill 3021, it would be an unlawful practice—whether or not a consumer is actually misled, deceived, or damaged—to engage in a commercial transaction or trade practice with a consumer in which the consumer is communicating with a chatbot, AI agent, or other technology that engages in a textual or aural conversation, and both (i) the communication may mislead or deceive a reasonable consumer to believe that the consumer is communicating with a human, and (ii) the consumer is not notified in a clear and conspicuous manner that the consumer is communicating with an AI system, and not a human.

Maine

Under Maine House Paper 1154, it would be an unfair trade practice to use an AI chatbot or any other computer technology to engage in a commercial transaction or trade practice with a consumer in a manner that may mislead or deceive a reasonable consumer into believing that the consumer is engaging with a human if either (i) the consumer is not notified in a clear and conspicuous manner that the consumer is not engaging with a human, or (ii) the consumer reasonably believes that the consumer is engaging with a human.

Massachusetts

Under Massachusetts Senate Bill 243, it would be an unfair and deceptive act or practice to engage in a commercial transaction or trade practice with a consumer of any kind in which the consumer is interacting with a chatbot, artificial intelligence agent, or other computer technology that engages in a textual or aural conversation that may mislead or deceive a reasonable person to believe they are engaging with a human—regardless of whether such consumer is in fact misled, deceived or damaged—unless the consumer is notified in a clear and conspicuous manner that they are communicating with a computer, rather than a human.

Conclusion

With states focusing on ensuring that consumers know with whom—or what—they are communicating, it is important that companies deploying AI-powered chatbots and callbots in the consumer financial space stay up to date on pending legislation and ensure that their use of bots is positioned to comply with future state laws.

Visit us at mayerbrown.com

Mayer Brown is a global services provider comprising associated legal practices that are separate entities, including Mayer Brown LLP (Illinois, USA), Mayer Brown International LLP (England & Wales), Mayer Brown (a Hong Kong partnership) and Tauil & Chequer Advogados (a Brazilian law partnership) and non-legal service providers, which provide consultancy services (collectively, the "Mayer Brown Practices"). The Mayer Brown Practices are established in various jurisdictions and may be a legal person or a partnership. PK Wong & Nair LLC ("PKWN") is the constituent Singapore law practice of our licensed joint law venture in Singapore, Mayer Brown PK Wong & Nair Pte. Ltd. Details of the individual Mayer Brown Practices and PKWN can be found in the Legal Notices section of our website. "Mayer Brown" and the Mayer Brown logo are the trademarks of Mayer Brown.

© Copyright 2025. The Mayer Brown Practices. All rights reserved.

This Mayer Brown article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein.
