ARTICLE
17 August 2025

"Free" Generative AI: Managing The Hidden Costs By Improving AI Literacy

Carroll & O'Dea Lawyers

Established over 120 years ago, Carroll & O’Dea Lawyers offers expert advice and strong advocacy for clients. With a commitment to high-level service and legal expertise in all areas, they blend tradition with modern skills.

What can you or your organisation learn from the reported case of a Sydney lawyer who submitted court documents that contained fictitious cases and quotes generated by hallucinating AI? i

Board directors, senior management, and key decision makers in almost every organisation have a legal obligation to act diligently, competently, and in the interests of the relevant stakeholders. To meet that obligation, if they are going to use AI, they will need AI literacy or a lot of luck!

Made Up Laws: Fictitious Cases and Fictitious Quotes

What did the lawyer do?

The lawyer represented an individual seeking judicial review of a tribunal decision to refuse the individual's visa application. The lawyer filed documents with the Court (an amended application and an outline of the lawyer's submissions) which cited fictitious cases and fictitious quotes from the Tribunal. When the Minister's lawyers brought the false cases and quotes to the Court's attention, the lawyer amended the documents to remove the false content and apologised for the mistake. The lawyer admitted that he used ChatGPT to draft the documents and submitted them without further review as the results 'read well'. It is evident that the generative AI hallucinated and produced false cases and quotes.

What are the concerns?

Lawyers have a paramount duty to the Court and to the administration of justice ii. Further, by law, lawyers must not deceive or knowingly or recklessly mislead the court iii. Applications and outlines of submissions are key documents that assist the Court in the administration of justice. Fictitious cases and fictitious quotes undermine the foundation of Australia's common law (or judge-made law) system, as they can create false precedents, which may have a lasting impact.

The Court referred the lawyer to the Office of the Legal Services Commissioner (OLSC), the regulator of the legal profession, to consider whether his conduct fell below professional standards.

Hidden Costs of Using Generative AI

For the lawyer

The lawyer suffered the following costs from his misuse of generative AI:

  • Ordered to pay $2,980 for the Minister's costs, attributed to the costs thrown away due to his conduct iv;
  • Expended significant time and cost on amending the documents;
  • Expended significant time and cost defending his conduct, including briefing lawyers and counsel;
  • Damaged his reputation and 27-year career.

The lawyer may face further consequences depending on the outcome of the OLSC's review.

For the stakeholders

The stakeholders (the lawyer's client, the opposing counsel, the Court, and the public) suffered the following costs from the lawyer's use of generative AI:

  • Were distracted from the substantive issues, which could jeopardise the client's case;
  • Expended significant time and costs on verifying the documents;
  • Suffered delay to the substantive proceeding, as the final hearing had to be rescheduled to deal with the concerns;
  • Saw public trust in the legal profession diminished.

The learnings are applicable not only to lawyers but also to board directors, senior management, and key decision makers and advisers.

How strong is your, or your organisation's, AI literacy?

This case demonstrates the critical need to improve AI literacy in any organisation about the inputs, the generative AI solution itself, and the outputs. Individuals or organisations with a high level of AI literacy would have implemented systems to manage these risks, such as:

  • Using an enterprise version of generative AI from a reputable service provider, with enforceable rights recorded in contracts;
  • Enforcing minimum service levels and AI confidence levels;
  • Systematically reviewing the generated outputs for accuracy;
  • Embedding appropriate review processes to capture AI hallucinations (a simple illustration follows this list).
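
By way of illustration only, the short Python sketch below shows what a systematic, automated first pass over generated output might look like. It assumes a hypothetical allow-list of human-verified citations (VERIFIED_CITATIONS) and a rough pattern for Australian medium-neutral citations; both are illustrative, not drawn from any real tool, and such a check supplements rather than replaces human review.

    import re

    # Hypothetical allow-list of medium-neutral citations that a human reviewer
    # has already verified against an authoritative source such as AustLII.
    # In practice this would be a database or a live lookup, not a hard-coded set.
    VERIFIED_CITATIONS = {
        "[2025] FedCFamC2G 95",  # Valu v Minister for Immigration and Multicultural Affairs (No 2)
        "[2025] FedCFamC2G 94",  # Valu v Minister for Immigration and Multicultural Affairs
    }

    # Rough pattern for Australian medium-neutral citations, e.g. "[2025] FedCFamC2G 95".
    CITATION_PATTERN = re.compile(r"\[\d{4}\]\s+[A-Za-z0-9]+\s+\d+")

    def flag_unverified_citations(ai_generated_text: str) -> list[str]:
        """Return every citation found in the text that is not on the verified list."""
        found = CITATION_PATTERN.findall(ai_generated_text)
        normalised = [" ".join(c.split()) for c in found]  # collapse line breaks and extra spaces
        return [c for c in normalised if c not in VERIFIED_CITATIONS]

    if __name__ == "__main__":
        # "Smith v Jones" is a made-up citation standing in for an AI hallucination;
        # the Valu citation is real and appears on the verified list above.
        draft = (
            "As held in Smith v Jones [2023] FedCFamC2G 123, the Tribunal erred; "
            "see also Valu v Minister for Immigration and Multicultural Affairs "
            "(No 2) [2025] FedCFamC2G 95."
        )
        for citation in flag_unverified_citations(draft):
            print(f"UNVERIFIED - check before filing: {citation}")

Even a rudimentary check like this turns "the results read well" into a concrete list of items that a person must verify against an authoritative source before a document leaves the organisation.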

How confident are you in your knowledge and competence to critically assess how to use generative AI to create competitive advantage and manage the risks?

Footnotes

i Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95 (31 January 2025) (austlii.edu.au)
ii Legal Profession Uniform Law Australian Solicitors' Conduct Rules 2015, r 3.
iii Legal Profession Uniform Law Australian Solicitors' Conduct Rules 2015, r 19.1.
iv Valu v Minister for Immigration and Multicultural Affairs [2025] FedCFamC2G 94 (31 January 2025) (austlii.edu.au)

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
