
AI: Barrister did not intend to mislead court
A barrister has reported herself to the Bar Standards Board (BSB) after submitting authorities hallucinated by artificial intelligence (AI) to the High Court.
Layla Parsons was acting first as a lay advocate for a mother and then as a litigant in person in a case about the welfare of four children. Though an unregistered (ie, non-practising) barrister, she also held herself out as a lawyer.
She cited four non-existent cases, which she admitted had been generated by AI, in her skeleton argument in support of various applications.
Though Ms Parsons withdrew the applications, Recorder Howard in Bournemouth Family Court decided that he should name her in his ruling, despite her protests and the fact that she had self-reported to the BSB – which the judge said was the “responsible” thing to do.
He explained: “She offers, or has offered, paid legal work to members of the public. This is an important consideration.
“I am satisfied having read her written submissions lodged since the hearing that Layla Parsons still does not really acknowledge or accept that her actions in not checking the citations and propositions she included in her skeleton argument were serious.”
Evidence from last November showed she was the lawyer made available to people who bought documents from an unnamed website and paid for legal support alongside them. This meant “there is a real and not fanciful possibility that Ms Parsons will in the future offer legal services to members of the public”.
Recorder Howard went on: “I consider that this factor, and the need for any person engaging the services of Ms Parsons in legal proceedings to know that she has misled the court (albeit unintentionally) and does not in my judgment properly understand what she has done wrong is a strong and overwhelming factor in favour of naming Ms Parsons.”
Ms Parsons argued that naming her in the ruling risked people finding and harassing her, or otherwise putting her at risk. The evidence indicated that these risks were “likely to be greatly exaggerated”, the judge said.
The public interest in naming her “strongly outweighs the risks to her”, he decided, “and that naming her is a necessary and proportionate interference with her right to family life”.
Ms Parsons confirmed that she had not uploaded any documents from the bundle to the unnamed AI tool that she used.
The judge said that, despite her legal qualification, he treated her as a litigant in person, but litigants in person too had a duty not to mislead the court.
Though absolving her of any intention to do so, he remained concerned that “Ms Parsons minimises the seriousness of misleading the court and goes so far as to assert that criticising use of AI risks setting a harmful precedent for disabled litigants in person and will discourage access to justice”.
But he said: “It is an example of the day to day working of the Family Court, the issues that can arise in these difficult cases, and another example where AI hallucinations have led to the court being misled by a person representing themselves relying on the AI tool without reference to their duty to check the citations.”
He stressed that he had taken care to avoid including any personal information about Ms Parsons that was not strictly necessary.
