ARTICLE
20 May 2025

Factum Containing Fake Case Law Likely Generated By AI Submitted In Ontario Litigation (Ko v. Li)

Gardiner Roberts LLP

ChatGPT and other generative AI (artificial intelligence) platforms have the potential to provide litigants with an efficient tool to research and understand the legal concepts and governing authorities applicable to their disputes. AI may provide valuable assistance in this regard. However, there have been several notorious cases, mainly in the United States, where parties have failed to verify whether the cases generated by the AI tool are real. Courts depend on counsel appearing before them to provide the relevant, and real, cases relied upon when advocating for their clients.

To date, there have been few reported incidents of misuse of AI in Canadian court proceedings. Perhaps inevitably, however, the issue of failing to verify the veracity of cases provided to a court has now arisen in an Ontario matter: Ko v. Li, 2025 ONSC 2766 (CanLII).

The decision arose during a motion to set aside a divorce order in an estates and family law proceeding. In a factum filed for the motion, counsel for one of the parties cited and hyperlinked cases dealing with the issues of duress, mistake, and procedural irregularity. However, the hyperlinked citation for one of the cases directed the reader to an entirely different case on the CanLII.org website which had nothing to do with the issue being addressed. Further, a hyperlink under another case cited in the factum led to an error message: "404 Error - Page not found / CanLII cannot find the page you requested".

During submissions, counsel referred the motion judge to the cases cited to support her client's arguments. However, when the hyperlinks did not go to the cited cases, the judge searched the CanLII website and was unable to find the cases.

Counsel was also unable to provide the motion judge with correct citations to the cases cited in the factum or to provide hard copies of the cases during her submissions in court.

The motion judge asked counsel if the factum was prepared by AI such as ChatGPT. Counsel advised that her office did not usually do so but that she would confirm with her clerk.

After the hearing, the motion judge reviewed the factum again, along with other cases cited on another issue on the motion relating to removal of an estate trustee. In the factum, counsel referred to one of the cases as an example where the court had removed a trustee for failing to account and for conduct that eroded the beneficiaries' confidence. However, the actual result in the case was the opposite: the court did not remove the estate trustees.

The hyperlink for a second cited case directed the reader to another case on the CanLII.org website that had nothing at all to do with the removal of a trustee. The motion judge was unable to find the case cited in the factum on the CanLII.org website.

In the motion judge's view, the circumstances appeared similar to situations in which people have had factums drafted by generative AI applications that created fake legal citations or "hallucinations" to support legal arguments. It appeared that the factum may have been created by AI and counsel may not have checked to make sure that the cases were real or supported the propositions of law for which she submitted them to the court in writing and then again orally.

The motion judge considered a British Columbia decision, Zhang v. Chen, 2024 BCSC 285 (CanLII), where the court stated that it was an abuse of process to cite fake cases and was tantamount to making a false statement to the court: "Unchecked, it can lead to a miscarriage of justice". In the B.C. case, however, counsel had discovered that the cases were fake before the hearing, apologized to all, and withdrew her factum.

In the case at hand, counsel actively relied on two of the suspicious cases as part of her submissions in open court. In the few days since the oral hearing, the motion judge had not received any follow up communication from counsel explaining the circumstances, correcting her factum, or otherwise acknowledging an issue.

The motion judge noted it is a litigation lawyer's "most fundamental duty" not to mislead the court. Lawyers have a duty to faithfully represent the law to the court, not to fabricate case precedents, and not to mis-cite cases for propositions that they do not support. Further, lawyers have a duty to use technology, conduct legal research, and prepare court documents competently, and to supervise staff and review material prepared for court.

With regard to the use of AI, the motion judge succinctly stated: "It is the lawyer's duty to ensure human review of materials prepared by non-human technology such as generative artificial intelligence".

At the time the decision was released, the full facts were not yet known, and it had yet to be determined whether there was an explanation for the unknown cases cited in the lawyer's factum. The motion judge was concerned, however, that with the sudden advent of AI, this has quickly become a very important issue:

"The court must quickly and firmly make clear that, regardless of technology, lawyers cannot rely on non-existent authorities or cases that say the opposite of what is submitted."

In the result, counsel was ordered to show cause why she should not be cited for contempt and will have a fair opportunity to submit evidence to explain what happened. Whether counsel can provide a satisfactory explanation for the motion judge's concerns will be determined at that hearing. Until then, counsel is protected by the presumption of innocence and other procedural rights as discussed by the Court of Appeal for Ontario in R. v. Cohn, 1984 CanLII 43 (ON CA).

The decision illustrates the potential consequences and concerns that may arise if AI is relied upon to generate authorities supporting legal positions and the supposed authorities are not verified. Under the current Ontario Rules of Civil Procedure, counsel must certify in writing that they are satisfied as to the authenticity of the cases cited in their factum. At a minimum, this requires checking that the hyperlinks in the factum lead to real cases and that the cases cited address and support the submissions being made.

