Susskind ends time as LCJ’s tech adviser as he predicts AI future


Susskind: Great privilege of career to advise senior judiciary

Professor Richard Susskind has stepped down as technology adviser to the Lord – now Lady – Chief Justice after 25 years.

An academic, consultant and author who has led the way in predicting the development of technology in the law, he first advised Lord Bingham in 1998, followed by Lords Woolf, Phillips, Judge, Thomas and Burnett.

Professor Susskind said: “The last quarter of a century has seen the judiciary evolve from having almost no basic technology – even computers, email and web access – to the present day, a new era in which the top judges now energetically support ‘digital justice’.

“It has been the great privilege of my career to advise the senior judiciary on the relentless march of legal and court technology.”

The Lady Chief Justice, Lady Carr, said: “I am enormously grateful to Richard for his contribution during a time of incredible advancement in technology. His highly valued advice, provided for many years and to many senior judges, has allowed the judiciary always to keep pace with the most significant technological developments where they might impact our courts and tribunals.

“Richard’s considerable expertise has instilled a deep understanding of the pertinent issues within the judiciary, laying the strongest of foundations for so much of our work today.”

Professor Susskind is president of the Society for Computers and Law, visiting professor at the Oxford Internet Institute, emeritus Gresham professor of law and a professor at Strathclyde University.

Speaking at the Westminster Legal Policy Forum conference on artificial intelligence (AI) in legal services earlier this month, he said most of the short-term predictions about AI and the legal profession had overstated its impact, while the long-term predictions had understated it.

The legal world would not be “turned on its head” over the next couple of years, but by 2030 it could be.

Lawyers should think less about the “commercial opportunities” in designing AI systems, and more about their “obligation to be familiar with these systems and know how they can best use them for clients”.

The fact that AI systems underperformed or exhibited bias had to be a concern, but “we can imagine a world where systems outperform us”, and there was a “moral and ethical obligation on us to use them”.

Referring to the testing of new medicines before they are released, Professor Susskind said there was no “systematic method” for comparing legal AI systems.

“We need to be more rigorous. We have never had to do this in law before – it’s a whole new world. We must have a system in place to evaluate the technology.”

Professor Susskind predicted that the real long-term impact of AI would be felt when it empowered people who were not lawyers to conduct many of their own affairs.

He added: “We could look to a society where citizens are generally able to have access to justice and resolve their disputes. We have to move away from being one-on-one advisers to deploying AI systems for the community.”

Matthew Hill, chief executive of the Legal Services Board (LSB), warned that “when the lightning pace of AI meets the historic inertia of the legal profession”, there was a risk of the legal sector “being done to rather than leading the revolution”.

There were currently “not enough lawyers to go round” for consumers and small businesses. By “drastically cutting unit costs”, AI could “perhaps bring law to the people”.

He said one of the flaws in the debate was the focus on regulating current problems with AI, when “we have to be thinking about a world in which technology works well and bias is eliminated”.

Mr Hill said he did not think the LSB would “have a problem” backing Professor Susskind’s version of the future, in which citizens were empowered by AI to assert their legal rights.

However, Ellen Lefley, a lawyer at law reform and human rights charity JUSTICE, said the risk from AI she was most concerned about was that the access to justice gap would be “exacerbated rather than closed” by it.

Her concern was that people without access to law firms would be left in a “Wild West” in terms of whether they could understand the results they obtained from AI and what happened to their data.

She added: “We may not have seen the tip of the iceberg of inequality in legal technology. My call to action would be that we harness the technology for a future in which it is available to all.”



