Let AI help manage cases, says president of Supreme Court


Reed: AI should complement human legal advice

Case management backed by artificial intelligence (AI) could make the courts more accessible, the president of the Supreme Court has said.

Lord Reed also said that, rather than ask whether AI could replace judges, “the more interesting question” was whether it should be allowed to decide cases and, if so, which ones.

He referenced the expectation of the Master of the Rolls, Sir Geoffrey Vos, that AI should, in due course, be used to take “very minor decisions”, such as the decision to extend time limits by a few days.

Sir Geoffrey had also proposed that integrated mediation processes – such as the process embedded in the online civil money claims service – “can and should be driven by AI, so that the parties are faced with regular logical proposals for the resolution of their dispute”.

Lord Reed said: “This may help increasing numbers of litigants to resolve claims themselves, without the need for any kind of oral hearing.”

A version of this type of AI-based case management was already used in the private sector by companies such as eBay, Amazon, Airbnb and Uber, he continued, but he stressed that he was not suggesting the civil courts should be replaced “with eBay-style dispute resolution”.

But the courts could feel “out of reach for too many people”, he said, and more should be done to improve accessibility.

“Digital forms of case management, complemented by AI, may be one part of the answer.

“The challenge is to design the services in a way which is user-friendly, and which facilitates the participation needed to ensure that litigants feel that their complaints have been heard, and that they have been shown respect.”

On “the possibility of AI judges”, Lord Reed said: “There is much debate as to whether AI will ever be capable of the complex factual and legal evaluation conducted, for example, by a High Court judge.

“But technology is improving all the time. So, I think the more interesting question is: as a matter of principle, should AI be permitted to decide our cases and, if so, which ones?

“I have suggested that there is a value in the process of human decision-making, in the context of the courts, which is separate from the decision itself.

“At the same time, decision-making by AI may enhance other values such as efficiency and consistency. So AI is likely to offer both opportunities and threats.

“The question may be whether we can exploit the opportunities, and minimise the threats.”

Lord Reed said there were concerns about bias, transparency, and the social, ethical and human rights implications of using ‘robot judges’.

He went on: “Lawyers, judges, academics and policy-makers should be engaging seriously with these concerns, to ensure that AI is used to enhance our justice system, not to replace it.

“We need to bear in mind that the pace of development is extremely fast. There is a risk that future developments may occur at a pace which exceeds our ability to adapt.”

In a speech, ‘Oral Hearings in the United Kingdom courts: Past, present and future’, delivered to the Legal Training and Research Institute of Japan, Lord Reed said AI had “the potential to revolutionise legal advice”.

AI tools were “increasingly able” to predict the results of cases in advance, he said, but they “should ideally complement, rather than replace” human legal advice.

“It is possible that greater access to legal analytics could cause more claims to settle before they come to court, thereby reducing the need for hearings, since the parties will know (or think they know) the outcomes in advance.”

Lord Reed said AI had also been used to assist with the preparation of cases and submissions, with “mixed results”.

Canadian courts had issued practice directions requiring those who used AI in their submissions to inform the court and indicate how it was used. Rather than follow that example, Lord Reed said, a better approach was to emphasise, as the Bar Standards Board had done, that AI was a ‘promising tool’ but not a replacement for human responsibility and oversight.

“Whatever the quality of the AI they use, lawyers remain responsible for their research, arguments, and representations under their core duties to the court and to their client.”



