11 May 2026

AI in the Courtroom

The recent decision in O’Driscoll v Rehab Co Mobile Limited [1] marks a significant development in New Zealand law: the Employment Relations Authority awarded increased costs against a self-represented party (that is, a person representing themself), in part for referencing cases that did not exist or could not be found.  This serves as a timely reminder of the risks associated with the use of Generative AI in the courtroom.

Generative AI chatbots, like ChatGPT and Copilot, are increasingly being used by self-represented parties to do the work of a lawyer, such as researching and drafting submissions.  While AI can be helpful for understanding legal principles in a general sense, it is not a replacement for a lawyer. 

In the past year, we have increasingly encountered self-represented parties on the other side of a dispute who have cited fake cases, used AI to draft their pleadings, and sent emails to the Court that begin with ChatGPT’s telltale phrase: “Certainly! Here’s a professional courteous draft reply you can use”. 

While these tools may have appeared to simplify the process for the self-represented party, their misuse created more work for the other parties involved. For example, the other parties had to spend considerable time attempting to locate cases that ultimately did not exist, and additional time deciphering poorly constructed arguments and unnecessarily long pleadings. 

While there is no law outright prohibiting the use of AI in the Courtroom, the Courts of New Zealand issued Guidelines in 2023 [2] regarding its use. These Guidelines caution against the use of Generative AI due to its limitations, including its inability to understand unique fact patterns and its potential to breach confidentiality.  Importantly, while not explicitly mentioned in the Guidelines, Generative AI is prone to generating fabricated legal authorities and cannot be relied upon to provide accurate or relevant information.

At the same time, the Courts also issued separate Guidelines on the use of AI for lawyers. [3] While AI could assist the legal profession, there are numerous risks and ethical issues that need to be managed, [4] particularly with respect to Generative AI chatbots. The Guidelines for Lawyers emphasise lawyers’ professional obligations, particularly to uphold the rule of law, maintain confidentiality, and not to mislead the Court.

Case law on how the Courts will treat AI in the Courtroom is now starting to emerge.

The first judicial comment came in 2024, when the Court of Appeal withdrew a memorandum, clearly drafted by Generative AI and referring to fake cases, that had been filed by Mr Wikeley, a self-represented party. [5] The Court of Appeal made no further comment on Mr Wikeley’s use of AI in its decision but referenced the Guidelines. 

However, O’Driscoll v Rehab Co Mobile Limited is the first decision in which there have been financial consequences for a self-represented party who used Generative AI. [6] Mr O’Driscoll continued to rely on fake cases despite having been referred to the Guidelines. [7] As a result, Mr O’Driscoll was ordered to pay Rehab Co Mobile an additional $2,000 in costs, in part because of the extra costs it incurred from his misuse of AI.

While it may be tempting to rely on Generative AI due to the perception that it makes accessing legal information and advice cheaper, this is not the reality.  The decision in O’Driscoll v Rehab Co Mobile Limited may set a precedent for other New Zealand Courts to impose financial penalties on those who misuse AI. 

In addition, the Supreme Court recently issued a decision indicating that reliance on false citations and unverified outputs of AI applications may, in serious cases, amount to obstruction of justice or contempt of court. [8] 

These decisions send a clear message that the judiciary will not tolerate the submission of fabricated legal authorities.  By implication, they highlight the limitations of Generative AI and the importance of obtaining credible and accurate legal advice.

Disclaimer

The above information is of a general nature only.  The information in this article in no way constitutes legal advice, and all readers should contact a law firm for advice relating to their specific circumstances.

References

[1] [2026] NZERA 106.

[2] Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals – Non-Lawyers. 7 December 2023. Courts of New Zealand. See 20231207-GenAI-Guidelines-Non-Lawyers.pdf.

[3] Guidelines for Use of Generative Artificial Intelligence in Courts and Tribunals – Lawyers. 7 December 2023. Courts of New Zealand. See 20231207-GenAI-Guidelines-Lawyers.pdf.

[4] Lawyers and Generative AI. March 2024. New Zealand Law Society. Lawyers-and-AI-Guidance-Mar-2024.pdf.

[5] Wikeley v Kea Investments Ltd [2024] NZCA 609 at fn 187.

[6] Above n 1.

[7] At [10].

[8] Jones v Family Court at Whangārei [2026] NZSC 1 at [7].
