Barrister warns of risks as litigants-in-person turn to ChatGPT

By Angus Simpson

AI legal advice gone wrong

A barrister has publicly raised concerns about the increasing use of ChatGPT in legal drafting after a litigant-in-person (LiP), someone without legal representation, seemingly relied on the AI tool to draft witness statements and skeleton arguments.

Posting on LinkedIn this week, barrister William Rees-Mogg shared his experience of a recent small claims hearing at which the LiP was “unable to support her own ‘evidence’ in court.” According to the barrister, the LiP admitted to using ChatGPT to draft an inaccurate witness statement, whilst “several other discrepancies in her other statements could be explained the same way”.

The ChatGPT-drafted skeleton argument cited real cases but contained inaccuracies that were “immediately obvious to any lawyer” yet “hard to spot for someone without training”.

For Rees-Mogg, a member of Whitestone Chambers (which has a rather eye-catching website), the real worry was that ChatGPT gives non-lawyers unwarranted confidence in wrong legal advice.

“[T]he LiP had produced a skeleton argument, which appeared also to have been written with AI assistance,” he wrote. “The skeleton cited relevant authority, with significant (but plausible) inaccuracies. This goes to show the risk of using generative AI for legal purposes as matters stand. The errors in that skeleton would be immediately obvious to any lawyer (or anyone with the knowledge base to find a copy of the relevant case and read it for themselves) but would be hard to spot for someone without training. This is arguably much more dangerous than generative AI producing an obviously wrong answer which can be spotted by a lay person immediately.”

ChatGPT isn’t just getting LiPs into awkward situations. Last month, an Australian lawyer was referred to a legal complaints commission after he admitted to using ChatGPT.

Earlier this year, the Bar Council issued guidance stating that while the use of AI is not ‘inherently improper,’ barristers should exercise caution and carefully assess the associated risks.

30 Comments

Lauren

Sorry – but isn’t the image under this headline AI generated? Incredibly contradictory / ironic to publish an article on the dangers of AI use and then utilise said danger. I might be wrong though!

Ok mate

Are you concerned people might think it’s a real image? 😂

Ermm

Perhaps more worrying is she seems to be equating the use of an AI generated image on a Legal Cheek article to using AI for the purposes of drafting legal submissions and documentation in a court of law.

What's the problem?

Seems a bit ridiculous to think “legal Generative AI needs thorough regulation to protect Litigants in Person from issuing claims on the basis of AI ‘advice’ which is superficially attractive, but fundamentally wrong in law.”

What are we going to protect against? The fact that not everything read on the internet is true? That is trite and any fool should know that.

The real problems are (1) a lack of legal aid pushing people towards use of moronic gen-AI “reasoning” and (2) a poor filtering system allowing spurious and legally incomprehensible gen-AI made claims to be brought.

Moises

Lawyers, judges, whoever is using whatever to get their case going will come across mistakes, Books also may contain mistakes, this is a case. My suggestion is to whoever is a self-litigante to make sure a miniscule robust sophisticated check is done before submitting any material to avoid any rookie mistakes…we are in the right way of justice and everyone deserves to gain knowledge and applied… every member of society like the great master Eustace Mulls said, I’m a member of society of first hand knowledge…I speak for myself…

Saras

Barristers relying on professional should be abolished as it has no place in the justice system of the 21st Century. All civil case judges should be made accountable for their decisions as there is no appeal system and its all napotism to assist each other in the Legal Cartel.

AI from the "Legal Cartel"

Your comment would have benefitted from an AI spell check Saras.

Ha ha I made a funny

What is napotism? Is it the unmerited promulgation of day time snoozes?

Ha ha I consumed the funny

Sensational

simple simon

was that junk written by AI?

Luke

“The errors in that skeleton would be immediately obvious to any lawyer”.

This inherently means that the legal system is inaccessible to anyone who hasn’t studied law or can’t afford a lawyer.

How can such a system be considered fair and just?
This isn’t “justice”.

Alex

Attention everyone who’s considering working on their own discrimination case. Don’t! Please don’t! You’ll cry like never before, because you’ll eventually confirm with all your body and soul the betrayals and aggression and you’ll feel shredded to bits. Please believe me

Something Clever

Horrendous. Equally if I don’t study medicine I can’t operate on someone! This isn’t access to medicine!!!

David

I think we have enough to concentrate on with qualified lawyers making significant ‘errors’ of legal judgement without the use of AI. Particularly, those lawyers who appear blinded to the legality of their own actions whilst seeking to conceal the unlawful activities of their corporate clients.

Liane

I agree. I am a LIP and going through the tribunal currently. I tried to go it alone and my fourth hearing is in May. I assumed that tribunals were set up whereby employees did not need a lawyer. This is inaccurate, the judges have been very helpful up until now and very approachable. However I have now had to enlist a lawyer due to the complexity of law and my own case.

Now days

As litigant in person myself who paid astronomical amounts of money to solicitor and barrister advice chat gpt helped me more than all of them together. However, chat gpt just giving you the direction and the base , and the real work is to conduct proper facts finding into it and take the most helpful information out of it. The brutal truth is that without chatgpt I would never be able to make a successful claim simply because it’s too expensive.

Ots

I’m an ex Police officer. I have just issued a judicial review against a local authority. I used AI. I proof read the application and supporting documents prior to sending them to the court. 2 hours later the court accepted the file and I paid the fee. This can be done with the assistance of AI, but as mentioned, AI is just a tool.

PH junior associate

Alright, bit of a confession: I’ve been using AI to draft email responses to opposing counsel and put together term summaries. Works like a charm, to be honest. Somehow, it makes me feel more on top of things than partners at K&E.

Sean South

And this is how it begins…
Before you know it we’ll all be looking for Sarah Connor whilst dodging gooey metallic dudes..

On a practical note.
If you ask the robot to read through a thousand or so letters sent by your firms top dog and to then instruct it to respond to your prompts in this style – it should be possible to avoid any criticism for your “skills”..😱

Bushbaby

This article just demonstrates the writer’s privilege and ignorance. Feelings of entitlement and being totally out of touch with the experiences of ordinary people. Get off your high horse!

Sean South

Not sure what the argument is?
That LiP’s who can’t afford representation and don’t qualify for legal aid should just do their best without using all the tools available to them?
It’s not like virtually everyone with an internet connection- from long-toothed legal professionals to first year law student – isn’t making use of AI!
Why would Lexis Nexis and others be providing the facility if it’s not an essential part of modern practice?
Maybe there’s a bit of concern that it’s levelling the playing field for those ghastly LiP’s…

Not WR-M

Knowing William personally, he is somewhat more down to earth with those that share his name. Don’t tar everyone with the same brush.

Up North

You would do well to click the link onto the article about the chambers involved. Very strange errors there.
However, AI is simply not suited to 9/10 of legal work. Remember, what you put in is what you get out. Most litigants (be they in person or lawyers) would ask a question biased towards their own case. AI is not objective, it will answer entirely from one point of view. And if your case happens to be a losing one, get, it will hallucinate a whole load of spurious reasons for you.

Up North

However it would not have added the word “get” into the last sentence like I did.

Winston Leachman

I think they are somewhat misconceived. AI is used as a legal tool to show that the law and prescribe procedures concerning criminal trials are conduct appropriately without the monopoly by trying to dismiss the issues at hand to show or demonstrate that it is wrong at hand there is no burden of proof that AI is wrong in its entirety what it demonstrates the important elements of criminal trials are being deliberately missed and shows the errors that are being made where they have a duty of care to protect the individuals they represent rather than act in bad faith where their clients are being wrongly convicted which breaches the human rights act and the habeas corpus acts where the principles of illegality are not being complied with bear this in mind

Pat Green

Very interesting perspective here, thank you for sharing, Mr Rees-Mogg!

LIP LIPPER LIPPEST

Rees-Mogg… where have I heard that name before?

Thomas

Yep, I volunteer and help litigants-in-person at civil and family courts. Our justice system is horribly underfunded and is not suitable for anyone but the richest who can afford lawyers. No politician or the public seems to care; it’s not an issue until you’re facing the reality of having to organise your own legal case or pay thousands in legal costs.

Deb

Im using AI in all my appeals as im dyslexia and other disabilities it as so much failures from the tribunal and the Respondent’s solicitor and it needs publishing because in the interests of justice and people
