Court fines US lawyers who cited fake cases produced by ChatGPT


ChatGPT: System produced six fake cases

Two lawyers who unwittingly submitted fake cases generated by ChatGPT to support their claim have been fined by a New York court because they “abandoned their responsibilities”.

As well as being fined $5,000 each, the lawyers and their firm have been ordered to inform their client, and the judges whose names were wrongfully invoked in the case, of the sanctions imposed on them.

Peter LoDuca, Steven A Schwartz and the firm of Levidow, Levidow & Oberman attracted international attention after the brief in a personal injury claim prepared by Mr Schwartz contained six cases that ChatGPT had simply made up; when Mr Schwartz later asked it, the system insisted they were real.

US District Judge P Kevin Castel in the Southern District of New York said: “In researching and drafting court submissions, good lawyers appropriately obtain assistance from junior lawyers, law students, contract lawyers, legal encyclopedias and databases such as Westlaw and LexisNexis.

“Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance. But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”

He continued that here the lawyers “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question”.

Among the “many harms” that flowed from the submission of fake opinions was that it promoted “cynicism about the legal profession and the American judicial system”, while “a future litigant may be tempted to defy a judicial ruling by disingenuously claiming doubt about its authenticity”.

The law firm primarily practises in New York state courts. It uses a legal research service called Fastcase and does not have access to Westlaw or LexisNexis.

But the case involved the Montreal Convention and was in federal court, and the firm’s Fastcase account had limited access to federal cases. Mr Schwartz said this was why he turned to ChatGPT.

Judge Castel said the outcome of the matter would have been “quite different” had the lawyers come clean after the defendant first questioned the existence of the cases, or after the court had required them to produce them.

Instead, they “doubled down and did not begin to dribble out the truth” until the court issued an order to show cause why they ought not be sanctioned.

This was evidence of bad faith on their part, as was Mr Schwartz’s statement to the court that ChatGPT had “supplemented” his research, when in fact it was the only source of his substantive arguments.

Mr Schwartz testified at the sanctions hearing that he was “operating under the false perception” that ChatGPT “could not possibly be fabricating cases on its own”.

He said: “My reaction was, ChatGPT is finding that case somewhere. Maybe it’s unpublished. Maybe it was appealed. Maybe access is difficult to get. I just never thought it could be made up.”

The law firm told the court that it has arranged for outside counsel to conduct mandatory training on technological competence and artificial intelligence.

Judge Castel credited “the sincerity of the respondents when they described their embarrassment and remorse”. The fines were “sufficient but not more than necessary to advance the goals of specific and general deterrence”.

Earlier this month, the Master of the Rolls, Sir Geoffrey Vos, cited the case as a reason why legal regulators and the courts may need to control “whether and in what circumstances and for what purposes” lawyers can use systems like ChatGPT in litigation.



