14 May 2025

Do You See What I See? Fake AI Cases Can Result In Real Contempt

McMillan LLP


The increased use of artificial intelligence ("AI") in the legal profession has been a significant innovation. AI tools can deliver efficiency and create value for clients. At the same time, AI has brought new pitfalls which, if ignored, may result in lawyers finding themselves in contempt of court. This week, the Ontario Superior Court of Justice released a decision by Justice Myers1 discussing the risk of AI tools generating fabricated cases2 and lawyers' professional obligations to make accurate representations to the court. The decision is a strong reminder to counsel that their professional obligations include verifying the legal authorities they rely on, and that the innovations offered by AI cannot replace counsel's own legal analysis. Relying on AI-hallucinated cases in submissions to the court can constitute a breach of professional obligations and may amount to contempt in the face of the court.

The Ko v. Li Decision

With the advent of AI and the potential for AI hallucinations, Justice Myers took the opportunity to "quickly and firmly make clear that, regardless of technology, lawyers cannot rely on non-existent authorities or cases that say the opposite of what is submitted."3

The motion before Justice Myers dealt with two estates applications and one family law application.4 However, Justice Myers' endorsement first addressed the "serious issue" that the applicant's submissions appeared to contain AI hallucinations.5

The AI hallucinations in the factum included:

  • a citation hyperlink that directed to the wrong case;6
  • citations to cases that had nothing to do with the issue being argued;7
  • a broken hyperlink that led to a "404 Error" page on CanLII;8 and
  • a citation to a case that reached a conclusion opposite to the one asserted in the submissions.9

The cases cited in the factum and used during oral submissions could not be found on CanLII, Westlaw, Quicklaw or Google.10 Counsel was also unable to provide copies of the cases from the printed materials relied on while making her submissions.11

Based on these deficiencies, Justice Myers noted that AI appeared to have generated the factum and counsel "might not have checked to make sure the cases were real or supported the propositions of law which she submitted to the court in writing and then again orally."12

Justice Myers added that "[i]t should go without saying that it is the lawyer's duty to read cases before submitting them to a court as precedential authorities. At its barest minimum, it is the lawyer's duty not to submit case authorities that do not exist or that stand for the opposite of the lawyer's submission...It is the litigation lawyer's most fundamental duty not to mislead the court."13

While Justice Myers found that there may have been a grave breach of duties and contempt of court, no final finding has been made in this case. The lawyer is presumed innocent and will have an opportunity to submit evidence on the issue.14

Best Practices

Lawyers using AI in their practice should consult sources available to the profession, such as the LSO's Technology Resource Centre, which maintains up-to-date resources on the use of technology in practice.15

Regardless of AI use, lawyers must verify all citations they rely on. The Ontario Rules of Civil Procedure require lawyers to certify the authenticity of the authorities relied upon in a factum:16

A factum shall include a statement signed by the party's lawyer, or on the lawyer's behalf by someone the lawyer has specifically authorized, certifying that the person signing the statement is satisfied as to the authenticity of every authority cited in the factum.

The Federal Court has also published a notice to parties and the profession on the use of AI in court proceedings, a key element of which is disclosure of AI use to the court.17

When asked whether AI can hallucinate cases, ChatGPT confirmed that AI can generate "information—such as a court case citation, summary, or legal principle—that appears real but is entirely fabricated or inaccurate."18 ChatGPT further explained that common AI hallucinations in law include generating fake citations, distorting facts or rulings, or providing incorrect jurisdictional links. ChatGPT advises users to:

  • always verify legal information from trusted databases or court websites;
  • ask the AI tool to cite a source or confirm whether a case is real; and
  • use AI tools designed specifically for legal research that are integrated with verified legal content.

Notwithstanding ChatGPT's advice to have AI confirm authenticity, the ultimate responsibility for verifying authorities rests with counsel.
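
By way of illustration only, one of the red flags in Ko v Li (a hyperlink that resolved to a "404 Error" page on CanLII) is the kind of defect a simple script can surface. The Python sketch below is our own illustration, not part of the decision or any court guidance, and the URL shown is a hypothetical placeholder. A dead link is an immediate red flag; a live link is still no proof that the case is real, correctly cited, or supports the proposition advanced, and it does not displace counsel's duty to read the authority.

    # Illustrative sketch only: flag cited hyperlinks that fail to resolve.
    import urllib.error
    import urllib.request

    def check_citation_links(urls: list[str]) -> None:
        """Print an OK/FAIL line for each cited hyperlink."""
        for url in urls:
            # Some servers reject bare or HEAD requests, so send a plain GET
            # with a browser-style User-Agent header.
            req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
            try:
                with urllib.request.urlopen(req, timeout=10) as resp:
                    print(f"OK   {resp.status}  {url}")
            except urllib.error.HTTPError as e:
                # e.g. 404: the cited page does not exist
                print(f"FAIL {e.code}  {url}")
            except urllib.error.URLError as e:
                # DNS failure, timeout, refused connection, etc.
                print(f"FAIL {e.reason}  {url}")

    # Hypothetical placeholder URL for demonstration purposes only:
    check_citation_links(["https://www.canlii.org/en/on/onsc/doc/2025/example.html"])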

Takeaways

Courts and tribunals have previously grappled with the "fake cases" issue and emphasized its seriousness.19 This recent Ontario decision goes a step further, cautioning lawyers that reliance on a fake case may result in a finding of contempt against the lawyer personally.20

The Ontario decision also serves as a reminder of lawyers' duties to the court, to their clients, and to the administration of justice:

  • to faithfully represent the law to the court;
  • not to fabricate case precedent and not to mis-cite cases for propositions that they do not support;
  • to use technology, conduct legal research and prepare court documents competently;
  • to supervise staff and review material prepared for counsel's signature; and
  • to ensure human review of materials prepared by non-human technology such as generative AI.21

Footnotes

1 Ko v Li, 2025 ONSC 2766 ["Ko"].

2 The Law Society of Ontario's white paper on Licensee use of generative artificial intelligence refers to hallucinations and notes that generative AI tools can provide responses that include information that is fabricated or otherwise inaccurate but which appears authentic (page 6).

3 Ko at paras 26-27.

4 Ibid at para 34.

5 Ibid at para 1.

6 Ibid at para 5.

7 Ibid at paras 5 and 13.

8 Ibid at para 6.

9 Ibid at para 11.

10 Ibid at paras 7 and 25.

11 Ibid at para 8.

12 Ibid at para 14.

13 Ibid at paras 21-22.

14 Ibid at para 31.

15 Law Society of Ontario, "Technology Resource Centre: Using Technology".

16 R.R.O. 1990, Reg. 194, r 4.06.1(2.1).

17 Federal Court, "Notice to the Parties and the Profession" (May 7, 2024).

18 ChatGPT's response to inquiry "Can artificial intelligence hallucinate case law?" (May 8, 2025).

19 For example, Zhang v Chen, 2024 BCSC 285: "Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice." (para 29); also see decisions of the Trademarks Opposition Board: Industria de Diseño Textil, SA v Sara Ghassai, 2024 TMOB 150 at para 6; Monster Energy Company v Pacific Smoke International Inc., 2024 TMOB 211 at para 16.

20 Ko at para 31.

21 Ibid at paras 15-20.

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© McMillan LLP 2025
