Generative AI could be useful “secondary tool”, judges told


Generative AI could be a “potentially useful secondary tool” for judges to use in the course of their work, according to new guidance from the senior judiciary.

However, all judicial office holders “must be alive to the potential risks” of it.

The guidance was produced by a cross-jurisdictional judicial group to assist judges, their clerks, and other support staff on the use of AI.

It was issued with the support of Baroness Carr, the Lady Chief Justice; Sir Geoffrey Vos, Master of the Rolls; Sir Keith Lindblom, Senior President of Tribunals; and Lord Justice Birss, deputy head of civil justice.

They said the guidance was the first step in a “proposed suite of future work to support the judiciary in their interactions with AI”, with a frequently asked questions document to support the guidance to follow next.

The guidance cautioned judges to “ensure you have a basic understanding of their capabilities and potential limitations” before using AI tools, such as appreciating that public AI chatbots did not provide answers from authoritative databases.

“As with any other information available on the internet in general, AI tools may be useful to find material you would recognise as correct but have not got to hand, but are a poor way of conducting research to find new information you cannot verify.

“They may be best seen as a way of obtaining non-definitive confirmation of something, rather than providing immediately correct facts.”

But “provided these guidelines are appropriately followed, there is no reason why generative AI could not be a potentially useful secondary tool”, judges were told.

“If clerks, judicial assistants, or other staff are using AI tools in the course of their work for you, you should discuss it with them to ensure they are using such tools appropriately and taking steps to mitigate any risks.”

It listed potential uses of AI tools as summarising large bodies of text, writing presentations – e.g. to provide suggestions for topics to cover – and administrative tasks like composing emails and memoranda.

The guidance also warned that AI tools may make up fictitious cases, citations or quotes, or refer to legislation, articles or legal texts that do not exist, a phenomenon known as “hallucination”.

Just last week, we reported on how the First-tier Tribunal had decided that nine cases cited by a litigant in person in a tax case had been produced by generative AI. The guidance said it was “appropriate” for judges to ask unrepresented people whether they have used AI and what checks for accuracy they have undertaken.

Judges also needed to be aware of “potential challenges posed by deepfake technology”.

Provided AI was used responsibly, there was “no reason” why lawyers should have to refer to its use, although this was dependent on context.

“Until the legal profession becomes familiar with these new technologies, however, it may be necessary at times to remind individual lawyers of their obligations and confirm that they have independently verified the accuracy of any research or case citations that have been generated with the assistance of an AI chatbot.”

The guidance also stressed the importance of confidentiality and privacy – especially when using a public AI chatbot – the potential for bias, and security risks.
