The use of AI tools in Cayman Islands legal proceedings – warnings for litigants and attorneys

Walkers

Key takeaways

  • Anyone using AI tools must take personal responsibility for the accuracy of the material produced in Court.
  • Attorneys may face personal consequences (including wasted costs orders) if the work product they put forward to the Court is not accurate.
  • As officers of the Court, attorneys have a duty to point out when their opponent is at risk of misleading the Court in reliance on AI technologies.

Why should litigators be cautious with AI tools in legal cases?

On 28 January 2025, Justice Asif KC ("Asif J") handed down his judgment in Bradley and Another v Frye-Chaikin [2025] CIGC (Civ) 5 in which the Grand Court of the Cayman Islands (the "Court") commented for the first time in a written judgment on the use of AI tools to produce submissions, including the obligations of litigants and their attorneys when using AI tools in legal proceedings.

Background

In support of her application for a stay of an order pending the hearing of her appeal, the Defendant put before the Court, and relied upon, submissions that contained a number of apparent errors or misunderstandings. For instance, the submissions repeatedly referred to procedural rules and cited reported cases that did not exist. As a result, the Court spent many hours considering those submissions, seeking to verify whether the cases were real and corresponding with the Defendant to get to the bottom of how her submissions had been produced.

The Defendant explained that, when preparing her written submissions, she had asked an unknown person at the law library of the University of Michigan to help her, and that this person had typed something into a computer and generated the text. Whilst the Defendant stated that she did not know whether any AI tools had been used, Asif J had no hesitation in concluding that whoever assisted the Defendant in preparing the submissions had used an AI tool to do so and, importantly, had failed to check the accuracy of what was produced.

Warnings from the Court when using AI

During the course of his judgment, Asif J echoed the sentiments of Harber v HMRC [2023] UKFTT 1007 (TC), in which the English First-tier Tribunal stressed that reliance on fictitious AI-generated authorities could cause a great deal of harm to the reputation and the efficient running of the judicial system. That judgment, in turn, referred to the well-known New York case of Mata v Avianca, Inc. (2023), in which Judge Kevin Castel sanctioned two attorneys for submitting a brief drafted by ChatGPT that cited non-existent case law authorities.

Whilst the Court confirmed that there is nothing inherently wrong with using technology (including AI tools) to make the conduct of legal disputes more efficient and their resolution speedier, Asif J stated that: "it is vital that anyone who uses such an AI tool verifies that the material generated is correct and ensures that it does not contain hallucinations". In other words, any statutes, procedural rules and case law authorities that are referred to must exist and say what they are asserted to say, and principles of law must be accurately stated.

Asif J commented that anyone using AI tools must take personal responsibility for the accuracy of the material produced and be prepared to face personal consequences (which could include wasted costs orders) if the work product they put forward to the Court is not accurate. This is because failing to take such obvious precautions gives rise to many harms, including:

• wasting the time of the opponents and the Court;
• wasting public funds and causing the opponent to incur unnecessary costs;
• delaying the determination of other cases;
• failing to put forward other correct lines of argument;
• tarnishing the reputation of judges to whom non-existent judgments are attributed; and
• impacting the reputation of the Courts and legal profession more generally.

As the use of AI tools in the conduct of litigation increases, Asif J stated that it is vital that all counsel involved in the conduct of cases before the Court are alive to the risk that material generated by AI may include errors and hallucinations. With respect to the obligations of attorneys, Asif J cautioned that:

"Attorneys who rely on such material must check it carefully before presenting it to the court. But equally, opponents should be astute to challenge material that appears to be erroneous, as was the case here. As officers of the Court, in my view, an attorney’s duty to assist the Court includes the duty to point out when their opponent is at risk of misleading the Court, including by reference to non-existent law or cases."

Observations and takeaways

Whilst the potential issues caused by using AI tools are not new, this judgment makes it clear that the Court will not hesitate to sanction attorneys who fail to take adequate steps to satisfy themselves as to the accuracy of information generated by AI technology.

Although the use of AI technologies presents opportunities to provide efficient and innovative approaches to some aspects of litigation, it also has limitations and flaws. Attorneys must not only exercise extreme caution when using such AI tools themselves, but should also remain vigilant when reviewing material drafted by their opponents. This will become increasingly relevant as the use of AI technologies becomes more prevalent in the context of litigation proceedings.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Walkers
