
By Matthew Letts, Founder and Director of Legal Futures Associate Codified Strategy [1]
In April 2026, Sullivan & Cromwell, one of the three most expensive law firms in the world at $2,000+ an hour, wrote to the US Bankruptcy Court for the Southern District of New York to apologise. The motion in In re Prince Global Holdings Limited contained case citations that did not exist. Fabricated by AI. Filed without anyone catching them. The firm had AI policies, training and oversight in place. None of it worked.
Damien Charlotin, a researcher in Paris, maintains a public database of AI hallucination cases in courts worldwide. It currently logs more than 1,350 cases [2].
What is an AI hallucination, and why won’t it stop?
AI does not look things up. It generates text that sounds right. Ask it for case law and it will produce something that looks exactly like case law: proper names, plausible citation formats, confident judicial language. The cases just do not exist.
This is how the technology works. No software update will change it. The High Court of England and Wales said as much in 2025: generative AI tools are “not capable of conducting reliable legal research” and can “make confident assertions that are simply untrue”.
The cases worth knowing
The case that opened the conversation was Mata v Avianca in New York in 2023. Lawyers used ChatGPT to draft a brief. The brief cited cases that did not exist. The court sanctioned them. The profession took note, and largely moved on.
By May 2025, the problem had reached major international firms. Lawyers from Ellis George LLP and K&L Gates submitted a brief built using CoCounsel, Westlaw Precision and Google Gemini. The citations were fabricated. The Special Master struck the brief entirely, denied the relief sought, ordered the firms to jointly pay $31,100 in the other side’s costs, and told them to go and explain themselves to their client. He called it “deeply troubling”.
In September 2025, the California Court of Appeal fined the lawyer who submitted fake citations $10,000. It refused to award costs to the other side, however, on the basis that opposing counsel had not spotted the fakes either. The implicit question the court left hanging: is checking your opponent’s citations now part of the job?
England and Wales: the position hardens
In June 2025, Dame Victoria Sharp, President of the King’s Bench Division, sat in the Divisional Court with Mr Justice Johnson to deal with two cases referred under the Hamid jurisdiction.
In R (Ayinde) v London Borough of Haringey, counsel cited five cases that did not exist and could not point to a single source for any of them. In the companion case, Al-Haroun v Qatar National Bank, an £89 million claim, a solicitor submitted a witness statement relying on 45 authorities. Eighteen turned out to be fiction. Others were misquoted or simply irrelevant. His explanation: the research had been done by the lay client. He had not checked it.
Both lawyers were referred to their regulators. Sharp P reminded the profession that the court's options run from a public reprimand, through costs orders and contempt proceedings, to referral, in her own words, to the police by way of prosecution for perverting the course of justice.
In a separate matter, a Family Court judge in the Midlands, working through a father's applications in children proceedings, noticed that many of the authorities the father cited simply were not real. The matter was referred upwards. The father was ordered to pay £5,900 in costs.
Who is actually checking this?
AI is useful for legal research. It finds relevant areas of law quickly, spots angles a lawyer might miss when working under time pressure, and turns three hours of background reading into twenty minutes. Firms that have written it off entirely are falling behind.
The problem is verification, and underneath that, professional negligence. The duty not to mislead the court does not care how the research was generated. A fake citation is a fake citation, whether it came from a tired associate, a paralegal who did not know better, or a model that cannot tell the difference between sounding authoritative and being right. The lawyer whose name is on the submission carries the responsibility.
S&C had policies. The policies were not followed. There was no fallback. $2,000+ an hour, and the work still went out of the door unverified.
Managing this is not complicated, but it does require the right people. Software vendors sell tools; they have not stood across the table from opposing counsel going through a witness statement line by line. That oversight has to come from lawyers, people who understand the standards because they have worked under them.
That is what Codified Strategy Limited does. We embed qualified lawyers into SME firms as technology advisers. We design the workflows, set the verification standards and build the oversight layer, so that when AI produces something plausible that happens to be wrong, and it will, someone in the process catches it before it reaches a court.
We also do the advisory work that sits before any of that: helping firms decide which tools fit their practice, how to roll them out without creating new compliance risks, and how to train people to use them in a way that holds up under regulatory scrutiny. We are vendor neutral. We have no interest in selling you a particular platform. We have practised law, and we know what a firm needs to function: not in theory, but in the daily reality of running files, managing client money and keeping the SRA satisfied.
AI in legal practice is here. The question is whether your firm has built the infrastructure around it to use it safely.
If it has not, we should talk.
Codified Strategy works with SME law firms on technology strategy, AI implementation and practice management. Our consultants are qualified lawyers. They have done the job.