Barrister warns of risks as litigants-in-person turn to ChatGPT


By Angus Simpson


AI legal advice gone wrong

A barrister has publicly raised concerns about the increasing use of ChatGPT in legal drafting after a litigant-in-person (LiP) seemingly relied on the AI tool to draft a witness statement and skeleton argument.

Posting on LinkedIn this week, barrister William Rees-Mogg shared his experience of a recent small claims hearing at which the LiP, someone without legal representation, was “unable to support her own ‘evidence’ in court”. The LiP admitted to using ChatGPT to draft an inaccurate witness statement, according to the barrister, whilst “several other discrepancies in her other statements could be explained the same way”.

The skeleton argument ChatGPT drafted did cite real cases, but it contained mistakes “immediately obvious to any lawyer” yet “hard to spot for someone without training”.


For Rees-Mogg, a member of Whitestone Chambers (which has a rather eye-catching website), the real worry was that ChatGPT gives non-lawyers confidence in legal advice that is simply wrong.

“[T]he LiP had produced a skeleton argument, which appeared also to have been written with AI assistance,” he wrote. “The skeleton cited relevant authority, with significant (but plausible) inaccuracies. This goes to show the risk of using generative AI for legal purposes as matters stand. The errors in that skeleton would be immediately obvious to any lawyer (or anyone with the knowledge base to find a copy of the relevant case and read it for themselves) but would be hard to spot for someone without training. This is arguably much more dangerous than generative AI producing an obviously wrong answer which can be spotted by a lay person immediately.”

ChatGPT isn’t just getting LiPs into awkward situations. Last month, an Australian lawyer was referred to a legal complaints commission after he admitted to using ChatGPT.

Earlier this year, the Bar Council issued guidance stating that while the use of AI is not ‘inherently improper,’ barristers should exercise caution and carefully assess the associated risks.

7 Comments

Lauren

Sorry – but isn’t the image under this headline AI generated? Incredibly contradictory / ironic to publish an article on the dangers of AI use and then utilise said danger. I might be wrong though!

Ok mate

Are you concerned people might think it’s a real image? 😂

Ermm

Perhaps more worrying is that she seems to be equating the use of an AI-generated image in a Legal Cheek article with using AI to draft legal submissions and documentation in a court of law.

EDS

Adequately regulate your own family before seeking to regulate the general population, Willy.

What's the problem?

Seems a bit ridiculous to think “legal Generative AI needs thorough regulation to protect Litigants in Person from issuing claims on the basis of AI ‘advice’ which is superficially attractive, but fundamentally wrong in law”.

What are we going to protect against? The fact that not everything read on the internet is true? That is trite and any fool should know that.

The real problems are (1) a lack of legal aid pushing people towards use of moronic gen-AI “reasoning” and (2) a poor filtering system allowing spurious and legally incomprehensible gen-AI made claims to be brought.

Moises

Lawyers, judges, whoever is using whatever to get their case going will come across mistakes. Books may also contain mistakes; this is one such case. My suggestion to anyone acting as a self-litigant is to make sure a meticulous, robust and sophisticated check is done before submitting any material, to avoid any rookie mistakes… we are on the right path to justice and everyone deserves to gain knowledge and apply it… every member of society, like the great master Eustace Mulls said. I’m a member of society with first-hand knowledge… I speak for myself…

Saras

Barristers relying on professional should be abolished as it has no place in the justice system of the 21st century. All civil case judges should be made accountable for their decisions, as there is no appeal system and it’s all nepotism to assist each other in the Legal Cartel.


Related Stories

Judges encouraged to embrace AI — carefully

What could possibly go wrong?

Dec 12 2023 2:33pm

Barristers warned against risks of ChatGPT

But Bar Council says AI use not 'inherently improper'

Feb 1 2024 8:54am