SRA: Law firms must be able to explain decisions made by AI


AI: Monitoring is like supervising a trainee

Law firms must be able to explain “automated decisions”, including those involving artificial intelligence (AI), the Solicitors Regulation Authority (SRA) has warned.

The regulator also said AI systems needed to be carefully trained and monitored, in many ways a “similar task to that of introducing and supervising a trainee”.

In a paper on technology and legal services, published today, the regulator was positive about AI, saying its development would mostly be focused on back-office functions for now.

This would free solicitors from lower-level work to carry out more complex tasks, and could also reduce firms' costs.

While AI has not been 100% accurate in various tests, the SRA said it had never proven any less accurate than work carried out by humans and, in some cases, had been more accurate.

Looking at some of the issues AI’s use would raise, however, the SRA said firms may find it difficult, where decisions were made by “self-learning AI”, to explain the “assumptions and reasoning behind some automated decisions”.

Like the Information Commissioner's Office, the regulator expected "algorithmic accountability and auditability" from firms, including the ability to show that their algorithms complied with the GDPR.

“Without transparency, AI is more likely to develop biases without the operator realising this. There have been studies as well as real-world examples of how unwanted biases exist and develop in sophisticated algorithms.

“For example, a facial recognition system may fail to recognise all individuals it is supposed to identify if it has only been trained on a single ethnicity.”
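The kind of unevenness the SRA describes can be surfaced with a simple check of performance broken down by group. The sketch below is purely illustrative and is not from the SRA paper: the group labels, data and the `accuracy_by_group` helper are all hypothetical, but the technique (comparing per-group accuracy to spot a system that works well for one population and poorly for another) is standard.

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, predicted_match, actual_match).
# All names and values here are illustrative assumptions.
records = [
    ("group_a", True, True), ("group_a", True, True),
    ("group_a", False, False), ("group_a", True, True),
    ("group_b", False, True), ("group_b", True, False),
    ("group_b", False, True),
]

def accuracy_by_group(records):
    """Return accuracy per group so uneven performance is visible."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += predicted == actual
    return {g: hits[g] / totals[g] for g in totals}

# A large gap between groups suggests unbalanced training data.
for group, score in sorted(accuracy_by_group(records).items()):
    print(f"{group}: {score:.0%}")
```

In this toy data the system is perfect on one group and wrong on every example from the other, which is exactly the failure mode the quoted facial-recognition example warns about.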

The SRA said AI systems needed to be trained in a way that followed the rules and was regularly monitored – “in many ways this is a similar task to that of introducing and supervising a trainee”.

It went on: “One advantage of AI is that checking and testing its workings can be an easier task than analysing human thinking, particularly if the AI has been set up well.

“While the AI systems used by lawyers can have biases, misunderstandings and errors in the same way as those who train and use them, it can be simpler to identify and correct them.”

Although the SRA said 40 of the top 100 law firms were already using AI, its use was still “relatively new” and ethical issues were “still emerging”.

As with other areas of innovation, solicitors would have to apply the SRA Principles and “their own ethical judgement” to resolve issues.

When training new AI systems, firms must still protect the confidentiality of client data and avoid conflicts of interest.

The SRA said decisions made by AI might not be accurate if a law firm used the wrong type of system or the system was trained using poor or incomplete data. Since AI systems could learn very quickly, “problems in their reasoning could also appear rapidly”.

Firms must have a “structured quality assurance programme” for their systems and should test them before rolling them out, for example piloting a chatbot or trialling automatic discovery on archived files.
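One way to structure such a pilot is to run the system over archived files that have already been reviewed by a human, measure agreement, and only approve rollout above an agreed quality bar. The sketch below is a hedged illustration, not anything prescribed by the SRA: the `run_pilot` helper, the 95% threshold and the sample answers are all assumptions.

```python
THRESHOLD = 0.95  # assumed firm-chosen quality bar, not an SRA figure

def run_pilot(system_answers, reviewed_answers, threshold=THRESHOLD):
    """Return (agreement_rate, approved) for a pilot on archived files."""
    if len(system_answers) != len(reviewed_answers):
        raise ValueError("pilot sets must be the same length")
    agree = sum(s == r for s, r in zip(system_answers, reviewed_answers))
    rate = agree / len(reviewed_answers)
    return rate, rate >= threshold

# Example: the system classifies 9 of 10 archived documents the same way
# as the human reviewer did, so it falls short of the assumed threshold.
rate, approved = run_pilot(
    ["relevant"] * 9 + ["irrelevant"],
    ["relevant"] * 10,
)
print(f"agreement {rate:.0%}, approved for rollout: {approved}")
```

Recording the agreement rate for each pilot run also gives the firm the audit trail the regulator asks for when it expects "algorithmic accountability and auditability".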

The regulator warned firms they could not “outsource” their responsibility to clients by using third-party companies, but in this situation “we are unlikely to take regulatory action where the firm did everything it reasonably could to assure itself that the system was appropriate and to prevent any issues arising”.

The SRA added: “We do not intend to impose any specific rules on the use of IT or AI. It is not for us to say which AI systems firms should buy.

“As with many new technologies, firms may have initial uncertainty around the use of AI. This may soon give way to a perception that it is less efficient or riskier not to use it.”

Paul Philip, chief executive of the SRA, said: “There is no doubt that new technology has already improved the way legal services work. Latest surveys show that 30% of legal work is now delivered online and the business use of emails has speeded up many tasks.

“Our report highlights the potential for technology to add further value in the workplace and we are looking further at how AI can enable the provision of high-quality legal services through the government Pioneer Fund award.

“Many firms are already exploring the possibilities and I would urge all law firms to consider how technology can help you and your business.”



