The recent decision in Moffatt v. Air Canada, 2024 BCCRT 149, represents a milestone in the expanding field of digital interactions and accountability. The case grapples with whether a company can be held liable for misleading information provided by an automated chatbot on its website. The tribunal held that a company can be liable for negligent misrepresentations made by a chatbot on a publicly available commercial website. The decision is an incremental development of the law, which has previously focused on the liability of persons for their pre-programmed automated tools.

Summary of the decision

Jake Moffatt sought a refund from Air Canada, claiming he relied on incorrect advice from the airline's chatbot regarding bereavement fares. The chatbot suggested that Mr. Moffatt could retroactively apply for bereavement fares, a statement later contradicted by another page on Air Canada's website. Mr. Moffatt sought compensation for the difference between the regular and bereavement fares.

To prove the tort of negligent misrepresentation, Mr. Moffatt had to show that: (1) Air Canada owed him a duty of care; (2) its representation was untrue, inaccurate, or misleading; (3) Air Canada made the representation negligently; (4) he reasonably relied on it; and (5) his reliance resulted in damages.

Air Canada argued that it could not be held responsible for the chatbot's misleading information and claimed that Mr. Moffatt did not follow the proper procedure for bereavement fare requests. It asserted that the chatbot, despite being part of the Air Canada website, should be considered a separate entity, thus absolving the airline of any liability for the chatbot's inaccuracies.

The tribunal held that "given their commercial relationship as a service provider and consumer" "Air Canada owed Mr. Moffatt a duty of care." It also found that the Chatbot gave Mr Moffatt inaccurate information.

The tribunal held that Air Canada was liable for the misleading information provided by the chatbot. According to the tribunal, "the applicable standard of care requires a company to take reasonable care to ensure their representations are accurate and not misleading." It found that Air Canada breached this duty "as Air Canada had failed to exercise reasonable care to ensure the information's accuracy":

I find Air Canada did not take reasonable care to ensure its chatbot was accurate. While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled "Bereavement travel" was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.

The tribunal further found that Mr. Moffatt was reasonable to rely on the chatbot's information, that he would not have flown last-minute had he known he would need to pay the full fare, and that he suffered damages as a result of the inaccurate information.

The tribunal rejected Air Canada's argument that the chatbot was a separate entity, stating that, as part of Air Canada's website, the airline was responsible for all information provided, including information from the chatbot:

Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.

Comments on the decision

This decision is important as it reaffirms the well-established principle that organizations are generally responsible for the acts or omissions of the computer systems they use and for misrepresentations they make to the public, irrespective of whether those misrepresentations come from a human representative or an automated chatbot.

The Air Canada case may be the first to affirm this principle in the context of information provided by chatbots, but there is significant caselaw dealing with pre-programmed computers to the same effect. For example, a Hong Kong court, relying on a British Columbia decision which in turn relied on my book, Sookman, Computer, Internet, and eCommerce Law, summarized the law on this point this way:

This brings the arguments back to a more fundamental question (when shed of the modernity and complexity of the internet): as a matter of general tort principle, should or should not a person/entity remain responsible in law for acts done by his/her tool, and what are the limits of such liability (if any)?. . .

In a lengthy judgment, Punnett J said [in Century 21 Canada Limited Partnership v Rogers Communications Inc and anor doing business as Zoocassa Inc]:

A machine or a computer and the software that runs it has at some point been constructed and programmed by an individual. As noted by Sookman at 10.5:

. . . an electronic agent, such as a computer program or other automated means employed by a person, is a tool of that person. Ordinarily, the employer of a tool is responsible for the results obtained by the use of the tool since the tool has no independent volition of its own. When computers are involved, the requisite intention flows from the programming and use of the computer.

I agree with this statement. Liability is not avoided by automating the actions in question.1

A similar holding was reached in the United States case State Farm Mutual Automobile Insurance Co. v. Bockhorst,2 in which an insurance company was held to be bound by a contract to renew an insurance policy formed by its pre-programmed computer. The court had little trouble attributing the actions of the computer to the business:

Holding a company responsible for the actions of its computer does not exhibit a distaste for modern business practices. . . a computer operates only in accordance with the information and direction supplied by its human programmers. If the computer does not think like a man, it is man's fault.

What is unfortunate about the Air Canada case is the lack of information about why the tribunal found Air Canada to be negligent. The decision noted that "Air Canada did not provide any information about the nature of its chatbot" and stated only that "generally speaking, a chatbot is an automated system that provides information to a person using a website in response to that person's prompts and input." Air Canada might have tried to defend the case by explaining how the chatbot was trained and tested, to argue that while the message provided was inaccurate, it was not negligent. I suspect that future cases will focus carefully on this. Future cases will also need to canvass who, from among the numerous possible AI actors and users, is responsible for the outputs of AI systems, including generative AI systems.

Air Canada also did not appear to rely on any website terms and conditions that may have attempted to disclaim liability. It is well known that chatbots and AI systems, including generative AI systems, sometimes give inaccurate information. These "hallucinations", and liability for them, are commonly disclaimed in terms of service.

Despite the questions about whether Air Canada was indeed negligent, the decision underscores the importance of ensuring the accuracy of information across all customer interfaces and highlights the potential legal liabilities arising from negligent misrepresentation in the use of chatbots and AI systems more generally.

The decision also represents an incremental development of the law, which has previously focused on the liability of persons for their pre-programmed automated tools.3

Footnotes

1. Quoted in Sookman, Computer, Internet, and eCommerce Law at 11.3(a).

2. 453 F. 2d 533 (10th Cir. 1972).

3. Prepared with the assistance of Microsoft CoPilot. The blog is mine, mistakes are CoPilot's!

This article was first posted on www.barrysookman.com

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.