Introduction
The intersection of artificial intelligence and court advocacy has produced its first significant judgment from the Qatar International Court and Dispute Resolution Centre (QICDRC). In the recent judgment of Jonathan David Sheppard v Jillion LLC [2025] QIC (E) 3, the Court issued a stark warning to practitioners regarding the use of generative AI in legal research.
While the Court acknowledged that the use of AI is to be welcomed for its potential to increase efficiency, it has drawn a hard line regarding the responsibilities of counsel. The judgment establishes that the citation of non-existent, AI-generated authorities can constitute not merely a procedural error, but a contempt of court.
This article will consider the implications for practitioners, in-house counsel, and firms operating within the QFC jurisdiction.
The Facts: A "Dubai Lawyer" and the Phantom Precedents
The dispute began as a standard employment claim. However, it quickly escalated into a procedural crisis when a Dubai-based lawyer, assisting the Defendant (a friend), filed an application for an extension of time.
To support the application and a subsequent jurisdictional challenge, the lawyer cited two specific cases that purportedly hailed from the QFC courts:
- Al Khor International School v. Gulf Contracting Co. (QFC 2022) - cited for the principle that extensions are granted for exceptional circumstances.
- Doha Bank v. KPMG (QFC App 2019) - cited for the principle that fairness overrides rigid deadlines and that jurisdiction is interpreted narrowly.
The opposing Claimant, acting in person, attempted to locate these authorities but could not. When the Registry ordered the lawyer to provide copies of the judgments, the lawyer initially stalled, claiming he was "unable to access judgement copies" but would file the defence anyway to avoid delay.
Only after a direct order from the Registry did the lawyer admit that the cases were found via "Google research" (which the Court identified as AI-generated hallucinations) and that the citations were erroneous.
The Evidence: The Anatomy of an AI Hallucination
The lawyer provided screenshots of the research to the Court. These images, reproduced in the judgment, bear the classic hallmarks of generative AI "hallucinations": confident, plausible-sounding summaries of cases that simply do not exist.
For instance, the AI tool invented a detailed narrative for Doha Bank v. KPMG, discussing an "opt-in jurisdiction" and a conflict between contractual clauses and statutory authority. For the Al Khor case, the AI conveniently explained that the full text was "not publicly available through standard legal research databases" due to its "confidential nature" - a disclaimer that should have immediately raised red flags for any qualified practitioner.
The Finding: From Inadvertence to Intentional Contempt
The critical turning point in this judgment was the Court's refusal to accept the lawyer's defence of inadvertence.
The lawyer argued that the errors were due to "copy paste errors" and reliance on secondary sources while working late hours. However, Lord Thomas of Cwmgiedd, President of the Court, found that the conduct amounted to 'intentional contempt' under Article 35.2 of the QICDRC Rules.
The Court's reasoning was two-fold:
Recklessness as Intent
By late 2025, it is difficult to see how any lawyer could be unaware of the necessity of checking AI-generated cases against official jurisprudence. Proceeding without such checks amounts to reckless conduct.
Misleading the Court
The lawyer compounded the error by telling the Registry he was "unable to access" the judgments, rather than admitting that they did not exist. The Court classified this as giving false or misleading information intended to obstruct the Court.
Global Context: The Gatekeeping Role of Advocates
The QICDRC explicitly positioned this judgment alongside major international decisions, confirming that Qatar is aligned with global judicial standards on AI integrity.
The judgment cites:
- USA: Mata v Avianca Inc, the seminal case where lawyers cited fake cases generated by ChatGPT.
- England & Wales: R (Ayinde) v London Borough of Haringey, where Dame Victoria Sharp emphasized the "gatekeeping role" of attorneys in ensuring the accuracy of their filings.
- Canada and Australia: Zhang v Wei Chen and Murray v State of Victoria, demonstrating the worldwide prevalence of the issue.
The QICDRC reiterated the English position: lawyers have a professional duty to check the accuracy of AI research by reference to authoritative sources (e.g., official Law Reports) before placing it before a court.
Sanctions and the "Hard Line"
Despite finding contempt proved, the Court exercised mercy regarding the penalty.
- No Penal Sanction: the lawyer was not fined.
- Anonymity: the lawyer was referred to only as the "Dubai Lawyer" to avoid "public disgrace" that would disproportionately impact his career, given this was a first offence in this jurisdiction.
However, the warning for the future is unambiguous:
"...any citation of any case or other authority to this Court which has not been verified by an examination by the advocate... will be considered a breach of the conduct required... The sanctions will include the full identification of the lawyer or law firm and the consequent public disgrace."
The Court also announced that a new Practice Direction regarding the use of AI is being released for consultation.
Practical Strategy for Firms and In-House Counsel
This judgment should prompt legal teams operating in Qatar and the wider region to review their use of AI. It is not enough simply to "be careful" - formal verification processes must be established.
We recommend the following procedural steps:
The "Human Firewall" Policy
Implement a strict policy that no AI-generated text is pasted into a court submission without human review. The advocate must be the firewall between the software and the Court.
Verification Audit Trails
When junior lawyers or paralegals conduct research, they should be required to provide the raw PDF or official link to every case cited. A summary alone is no longer sufficient evidence of a case's existence.
Primary Source Primacy
As noted by the Court, authoritative sources are the official court judgment databases, not third-party summaries or AI outputs. In the QFC, this means verifying directly against the QICDRC's searchable website.
Training on "Hallucinations"
Ensure all legal staff can recognize the stylistic "tells" of AI hallucinations, such as vague references to "confidentiality" explaining a lack of sources, or highly specific quotes that align too perfectly with the argument being made.
Conclusion
The Sheppard v Jillion judgment is a welcome development that protects the integrity of the QFC legal system. It clarifies that while AI is a powerful tool for efficiency, it cannot replace the professional obligation of verification. The Court has fired a warning shot - the next practitioner to cite a "ghost case" will face public naming and professional sanction.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.