Law firm that cited fake AI-generated cases to pay wasted costs


Charman: Firm’s actions were improper

A law firm has been ordered to pay wasted costs after it cited two fictitious cases that were generated by artificial intelligence (AI).

According to barrister Alexander Bradford, the firm blamed a member of its administrative team.

He acted for the defendant, Birmingham City University, and reported the case on the website of his chambers, St Philips.

He said His Honour Judge Charman had ordered that his ruling be published, but it has not yet been.

The unnamed law firm was acting for a former student in a claim for breach of contract, negligence and fraud. Her solicitors submitted an application to the court on 10 July 2025 which cited two cases that turned out to be fictitious.

Mr Bradford recounted that, the following day, the university’s solicitors, Cambridgeshire firm JG Poole & Co, said they had been unable to locate the cases and asked for copies of them.

The claimant’s solicitors did not respond, instead withdrawing and re-submitting the application without the fictitious cases, telling the court that the previous application had been submitted “in error”.

The claim and application were struck out with indemnity costs on 30 July 2025. The issue of the false citations and wasted costs was adjourned.

The claimant’s solicitor filed witness statements accepting that the cases had been AI-generated and were fictitious.

“The solicitor’s evidence was that a member of the administrative team had drafted the application using a built-in AI research feature of a widely used legal software,” Mr Bradford said.

“That staff member had submitted the application without the solicitor’s knowledge, had not verified the authorities cited, and had signed the statement of truth personally in the solicitor’s name without his knowledge or consent.”

Applying the guidance issued by the Divisional Court in June in Ayinde, HHJ Charman held that the behaviour of the solicitor and firm had been improper, unreasonable and negligent, that the solicitor’s explanation was inadequate, and that the threshold for a wasted costs order had been met.

Mr Bradford said: “This decision serves as a further reminder of the risks that large language models pose to the administration of justice.”

Last month, a barrister who used ChatGPT to draft grounds of appeal for the Upper Tribunal that included a fake case reference, and then failed to admit it, was referred to the Bar Standards Board.




Readers’ comments

• R. Mark Clayton says:

Well at least they ‘fessed up and took their medicine.

Was the legal executive blamed invented by AI as well??

