In the recent Fair Work Commission decision Mr Branden Deysel v Electra Lift Co. [2025] FWC 2289, Deputy President Slevin applied a critical lens to the use of ChatGPT by the Applicant who was seeking an extension of time to make an application to deal with contraventions involving dismissal.
One factor the FWC examines in considering an extension of time is the merit of the substantive application that would be allowed to proceed. This is where, as Deputy President Slevin observed, artificial intelligence had an unhelpful role to play in this matter (at paragraph 6):
"As to the merits of the claim, Mr Deysel confirmed during the conference that he had used an artificial intelligence large language model, ChatGPT, in preparing his application. So much was clear from the deficiencies in the application which failed to address the matters required to make good a claim that Part 3-1 of the Fair Work Act had been contravened. The application also included an extract from advice given by ChatGPT which was that various employment and other statutory obligations had been contravened by the Respondent. The advice suggested that Mr Deysel commence various legal actions against the Respondent, including making application under s.365 of the Act. I can see no basis for this advice."
The Deputy President continued his critique of ChatGPT and its use in this context (at paragraph 7):
"Chat GPT also advised Mr Deysel to consult a legal professional or union representative to determine the appropriate course of action. He did not do so. Mr Deysel simply followed the suggestion made by ChatGPT and commenced the proceedings. The circumstances highlight the obvious danger of relying on artificial intelligence for legal advice. The result has been Mr Deysel commencing proceedings that are best described as hopeless and unnecessarily wasting the resources of the Commission and the Respondent in doing so."
To put the criticism in context and perspective, the use of AI by the Applicant did not make a material difference to the outcome of the extension of time application. General protections claims involving dismissal need to be brought within 21 days of dismissal unless there are "exceptional circumstances". This application was brought 919 days after the end of the Applicant's employment, which resulted from his resignation rather than a termination by the employer.

The Applicant's submissions that he lacked awareness of his workplace rights and was concerned about retribution from his former employer were both rejected. As to the former, the FWC has regularly held that ignorance of rights is not a reason to excuse delay. As to the latter, it was held there was no evidence supporting the asserted concern. Further, the Respondent cited the prejudice it would suffer if called upon to respond to events from more than two years earlier, in circumstances where it was never put on notice that the termination would be challenged.

This extension of time application was never going to be successful, irrespective of the use of ChatGPT, although it is conceivable that the "advice" given to the Applicant by AI, for which the Deputy President found there was "no basis", may have led to, or emboldened, the decision to bring the application before the FWC.
Observations
Some observations:
- This decision illustrates the risks and dangers of using ChatGPT (or other AI applications or models) for the preparation of FWC applications and responses.
- These risks and dangers are particularly acute in a jurisdiction such as the FWC where principles of fairness, which require value judgments, need to be carefully considered. The work of the FWC does not lend itself to a formulaic approach, slavish to previous cases that may ostensibly have similar facts. It is a trite proposition that each case turns on its own circumstances.
- While it made little difference in this case, it is only a matter of time before an otherwise meritorious application or response is undermined by the misuse of AI apps or models. Courts and tribunals frequently dealing with self-represented litigants need to be acutely aware of the possibility that AI is being used. If it is, and it leads to a sound case being poorly argued or presented, that is the fault of the party seeking to rely on the technology. The FWC should give no concession or latitude to any party for having made that decision.
- AI may have a role to play in FWC proceedings but, at this stage in its development, it needs to be used judiciously and its output treated with a healthy degree of scepticism.
For further information please contact:
Michael Byrnes, Partner
Phone: + 61 2 9233 5544
Email: mjb@swaab.com.au
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.