
Law firm that cited fake AI-generated cases to pay wasted costs

Charman: Firm’s actions were improper

A law firm has been ordered to pay wasted costs after it cited two fictitious cases that were generated by artificial intelligence (AI).

According to barrister Alexander Bradford, the firm blamed a member of its administrative team.

He acted for the defendant, Birmingham City University, and reported the case on the website of his chambers, St Philips.

He said His Honour Judge Charman had ordered that his ruling be published, but it has not yet been.

The unnamed law firm was acting for a former student in a claim for breach of contract, negligence and fraud. Her solicitors submitted an application to the court on 10 July 2025 which cited two cases that turned out to be fictitious.

Mr Bradford recounted that, the following day, the university’s solicitors, Cambridgeshire firm JG Poole & Co, said they had been unable to locate the cases and asked for copies of them.

The claimant’s solicitors did not respond, instead withdrawing and re-submitting the application without the fictitious cases, telling the court that the previous application had been submitted “in error”.

The claim and application were struck out with indemnity costs on 30 July 2025. The issue of the false citations and wasted costs was adjourned.

The claimant’s solicitor filed witness statements accepting that the cases had been AI-generated and were fictitious.

“The solicitor’s evidence was that a member of the administrative team had drafted the application using a built-in AI research feature of a widely used legal software,” Mr Bradford said.

“That staff member had submitted the application without the solicitor’s knowledge, had not verified the authorities cited, and had signed the statement of truth personally in the solicitor’s name without his knowledge or consent.”

Applying the guidance issued by the Divisional Court in June in Ayinde [1], HHJ Charman held that the behaviour of the solicitor and the firm had been improper, unreasonable and negligent, that the solicitor’s explanation was inadequate, and that the threshold for a wasted costs order had been met.

Mr Bradford said: “This decision serves as a further reminder of the risks that large language models pose to the administration of justice.”

Last month, a barrister who used ChatGPT to draft grounds of appeal for the Upper Tribunal that included a fake case reference, and then failed to admit it, was referred to the Bar Standards Board [2].