Generative AI could be useful “secondary tool”, judges told


Carr: Endorsed guidance

Generative AI could be a “potentially useful secondary tool” for judges to use in the course of their work, according to new guidance from the senior judiciary.

However, all judicial office holders “must be alive to the potential risks” of it.

The guidance was produced by a cross-jurisdictional judicial group to assist judges, their clerks, and other support staff on the use of AI.

It was issued with the support of Baroness Carr, the Lady Chief Justice, Master of the Rolls Sir Geoffrey Vos, Sir Keith Lindblom, Senior President of Tribunals, and Lord Justice Colin Birss, deputy head of civil justice.

They said the guidance was the first step in a “proposed suite of future work to support the judiciary in their interactions with AI”, with a frequently asked questions document to support the guidance to follow next.

The guidance cautioned judges to “ensure you have a basic understanding of their capabilities and potential limitations” before using AI tools, such as appreciating that public AI chatbots did not provide answers from authoritative databases.

“As with any other information available on the internet in general, AI tools may be useful to find material you would recognise as correct but have not got to hand, but are a poor way of conducting research to find new information you cannot verify.

“They may be best seen as a way of obtaining non-definitive confirmation of something, rather than providing immediately correct facts.”

But “provided these guidelines are appropriately followed, there is no reason why generative AI could not be a potentially useful secondary tool”, judges were told.

“If clerks, judicial assistants, or other staff are using AI tools in the course of their work for you, you should discuss it with them to ensure they are using such tools appropriately and taking steps to mitigate any risks.”

It listed potential uses of AI tools as summarising large bodies of text, writing presentations – e.g. to provide suggestions for topics to cover – and administrative tasks like composing emails and memoranda.

The guidance also warned that AI tools may make up fictitious cases, citations or quotes, or refer to legislation, articles or legal texts that do not exist, a phenomenon known as hallucination.

Just last week, we reported on how the First-tier Tribunal decided that nine cases cited by a litigant in person in a tax case had been produced by generative AI. The guidance said it was “appropriate” for judges to ask unrepresented people whether they have used AI and what checks for accuracy they have undertaken.

Judges also needed to be aware of “potential challenges posed by deepfake technology”.

Provided AI was used responsibly, there was “no reason” why lawyers ought to refer to its use, although this was dependent on the context.

“Until the legal profession becomes familiar with these new technologies, however, it may be necessary at times to remind individual lawyers of their obligations and confirm that they have independently verified the accuracy of any research or case citations that have been generated with the assistance of an AI chatbot.”

The guidance also stressed the importance of confidentiality and privacy – especially when using a public AI chatbot – the potential for bias, and security risks.



