29 October 2025

Fake Law, Real Trouble: The Risks Of AI Errors In Court Submissions

Torkin Manes LLP

Contributor

Torkin Manes LLP is a full-service, mid-sized law firm based in downtown Toronto. Our clientele ranges from public and private corporations to financial institutions, professional practices and individuals. We have built our firm from the ground up—by understanding our clients’ business needs, being results-oriented, practical, smart, cost-effective and responsive.

The recent Ontario Superior Court decision, Halton (Regional Municipality) v. Rewa et al., 2025 ONSC 4503 ("Halton v. Rewa"), demonstrates growing judicial concern regarding the use of artificial intelligence ("AI") in the courtroom. In the criminal case of Halton v. Rewa, a self-represented defendant, Mr. Rewa, relied on non-existent cases generated by AI in his submissions to the Court. The Court's concern arose specifically from Mr. Rewa's use of AI to draft court documents, a practice that can waste judicial resources at a cost to those who submit such materials.

AI-Generated Submissions: "Let's see if I get caught"

Mr. Rewa admitted to using AI to generate his materials, citing his inability to hire counsel and the resulting challenges of being a self-represented litigant. The judge identified several fabricated authorities cited in the defendant's submissions, irrelevant to the legal issues yet offered to support Mr. Rewa's position, as "typical examples of AI-generated arguments and AI hallucination".

Additionally, the AI submissions formed part of a troubling pattern of behaviour by the defendant. As we have seen in other cases, the Court has been forgiving of one-time instances of putting forward AI-hallucinated materials (please see our article here). In Halton v. Rewa, however, the defendant had previously filed materials containing non-existent cases and had been warned by the Court against doing so in future submissions.

Notably, the judge stated, "one also does not need to be a lawyer to read through a case to verify if it stands for the suggested proposition." Rather than taking proactive steps to alert the Court to his error, come forward with 'clean hands' and correct it, Mr. Rewa stayed silent. The judge remarked that the defendant's conduct reflected a "let's see if I get caught" attitude rather than an honest mistake.

Judicial Response: Maintaining the Integrity of the Justice System

In Halton v. Rewa, the Court emphasized that misleading submissions undermine the administration of justice, regardless of whether they are advanced by counsel or a self-represented party. This decision follows earlier rulings such as Ko v. Li, where the Court expressed concern about how court submissions containing fabricated citations and propositions can contribute to a miscarriage of justice and waste judicial resources. Although the Court allowed Mr. Rewa to correct his motion, he was also ordered to pay costs. The Court made it clear that future use of AI-generated, non-existent case law would not be tolerated.

Takeaway: The Use of AI in Canadian Courts

Canadian courts are increasingly encountering AI-generated submissions. To date, courts have seen AI-generated factums cite both non-existent caselaw and caselaw that is clearly irrelevant to the propositions it is offered to support.

The message here is clear: AI is no substitute for legal research and drafting conducted by humans. Legal submissions built on hallucinated AI-generated authorities will damage a litigant's credibility, waste judicial resources and potentially result in cost consequences for those who rely on them. Whether represented or self-represented, the bare minimum for every person submitting authorities to the Court is to confirm those authorities exist, read the cases before submitting them and ensure they stand for the propositions they allegedly support. Parties should avoid relying exclusively on AI in preparing legal materials; judges are alive to its potential for misuse.

The authors would like to thank Torkin Manes' Articling Student Ilar Haydarian for her invaluable contributions in preparing this insight.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
