In a follow-up to the decision we discussed in "Factum containing fake case law likely generated by AI submitted in Ontario litigation (Ko v. Li)", the Ontario Superior Court of Justice declined to order any sanctions against a lawyer who had submitted AI-generated fake case law to the court, on the basis that she had already gained notoriety from the widespread publication of reports concerning the show cause order in both the legal and the general press: Ko v. Li, 2025 ONSC 2965 (CanLII).
In the motion proceedings leading to the decision, the lawyer had delivered a factum containing submissions on the law that applied to the issues in dispute. The factum cited several non-existent, or fake, precedent cases that were then relied upon in oral argument in open court. The lawyer was unable to advise whether the factum had been prepared using generative AI.
The motion judge gave notice to the lawyer that by citing fake case precedents she may have violated her duties as legal counsel and required her to show cause why she should not be held in contempt of court.
Prior to the next court date, the lawyer delivered a letter to the court in which she explained that her factum had indeed been prepared by her staff in part with the use of generative AI, namely ChatGPT. The errant cases were indeed AI "hallucinations" as the motion judge had surmised. The lawyer apologized and asked that no finding of contempt of court be made against her, advising of several protocols that she would implement in her practice to ensure that the error was not repeated.
In the subsequent decision, the motion judge noted that the factum also failed to comply with Rule 4.06.1(2.1) of the Ontario Rules of Civil Procedure, which codifies the duty of counsel to cite law honestly and without misrepresentation:
A factum shall include a statement signed by the party's lawyer, or on the lawyer's behalf by someone the lawyer has specifically authorized, certifying that the person signing the statement is satisfied as to the authenticity of every authority cited in the factum.
While this may not have been the root cause of the issues, the motion judge noted that a special rule had been created to address the new phenomenon of AI hallucinations. The failure to include the certification may have allowed the lawyer to sidestep the very rule that was intended to prevent the misuse of AI.
Further, the motion judge raised the possibility that the lawyer could be ordered to pay costs personally under Rule 57.07 of the Rules of Civil Procedure, which applies where a lawyer for a party has caused costs to be incurred without reasonable cause or to be wasted by undue delay, negligence or other default.
In response, the lawyer submitted that she had no disciplinary history with the Law Society of Ontario or the court during her 30 years in practice. She acknowledged that she was not comfortable with technology such as generative AI and had been shocked to discover that the cases on which she relied could not be found. She accepted responsibility for her staff's work and reiterated her regret at failing to review the cases cited in her factum before relying on some of them in court. She acknowledged that she was an officer of the court with duties to uphold the integrity and honour of the justice system.
The motion judge took note of the notoriety the lawyer had gained following the release of the endorsement requiring her to show cause: she had been bombarded with calls from reporters and colleagues, and even her son in Florida had seen press reports about the decision online.
The motion judge noted that in some similar cases in the United States, courts have ordered lawyers to pay monetary sanctions, typically in the range of US$5,000. However, US courts play a different role than Canadian courts in regulating the lawyers who appear before them and can more readily impose sanctions against counsel.
In the case at hand, the lawyer had recognized her responsibilities and proposed steps to guard against committing the same lapse again. The motion judge saw no public interest in proceeding with a show cause hearing. Rather, the motion judge confirmed with the lawyer that the endorsement would make her expressions of regret and accountability public, that she would fulfill her commitment to take continuing professional development courses, and that she would not bill her client for the research, factum writing, and attendance at the motion.
In the motion judge's view, the publicity surrounding the case had otherwise served both as a public denunciation of inappropriate conduct and as a general deterrent to the bar and others who might rely on AI for legal submissions.
While the decision arose from unchecked reliance on generative AI, it was not the use of AI itself that was the main concern. Rather, the issues go to the fundamental duties of counsel when making submissions to the court. Counsel are obliged to satisfy themselves that their submissions do not misstate the law or cite incorrect (or fake) authorities. As the motion judge noted, it does not really matter whether the factum was drafted by AI, a clerk, or a law student: counsel must satisfy themselves that the authorities relied upon exist and support the arguments made.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.