Clients demanding AI chatbots to “circumvent lawyers”


Jeffery: Regulatory burdens are not holding back AI in the law

Large law firms are facing “increasing demands from clients” for artificial intelligence (AI) chatbots to allow them to “self-serve and circumvent the need to direct all queries through lawyers”, the Law Society has warned.

The society also said law firms were unsure “as to whether the regulatory expectation is that all client data needs to be anonymised before being inputted into generative AI tools”.

The Law Society was responding to a call for evidence by the Department for Science, Innovation and Technology (DSIT) on the creation of an AI Growth Lab, which would act as a “pioneering cross-economy sandbox” and “oversee the deployment of AI-enabled products and services that current regulation hinders”.

The society said large law firms had outlined the “increasing demands from clients for the development and/or deployment of AI tools that would allow them to self-serve and circumvent the need to direct all queries through lawyers”.

It explained: “These demands are often in the form of requests for AI-powered chatbots, trained on the specific client matter data, that clients could directly interact with.

“Whilst firms are interested in exploring this idea, there is a lack of certainty as to whether the technology is yet advanced enough to provide the level of legal accuracy and guarantee of standards that clients require or expect from regulated law firms and solicitors.

“Law firms are grappling with how to create chatbots that could operate without the oversight of a qualified lawyer, whilst still complying with professional principles, but report a current lack of certainty on the regulatory position for such developments.”

The society warned of the further danger of chatbots for legal services being developed and deployed by companies that are not regulated legal service providers.

“This would include no guarantee of client confidentiality or protection against conflict of interest. The dangers of this to ordinary consumers of legal services who are likely unable to discern the quality of advice are a significant risk.”

The response said law firms had also raised concerns about the practicalities of inputting client data into generative AI (GenAI) tools that could then be used across the practice for other clients.

If anonymisation was required, law firms were concerned it would be “a burden that negates the efficiency gains” of using the tools.

However, law firms reported that client demand for confidentiality duties to be “modified or disapplied” was “incredibly low”, with large firms “more interested in the clarification of this duty” than in disapplying it.

The society said neither legal professional privilege nor client confidentiality more broadly should be curtailed within a legal services sandbox for the purposes of “enabling more expansive AI innovation to support economic growth”.

The society said it had heard concerns “from AI-developing and AI-adopting” law firms of a “growing disconnect between the government’s ambitions for innovation” and the ethical and regulatory obligations set by the Solicitors Regulation Authority (SRA).

“Solicitors are held to the highest standards in their practice and this extends to the deployment of AI within areas of work for which they are accountable.

“There is the potential that, through introducing environments in which regulation can be disapplied, this disconnect could be worsened, therefore creating uncertainty amongst practitioners and their clients.”

The society said its preferred sandbox model was an AI Growth Lab run by a lead regulator, with the SRA the “most suitable to fulfil this role”, rather than one run by central government with the support of sector regulators.

Law Society chief executive Ian Jeffery added: “AI innovation is vital for the legal sector and already has great momentum. The existing legal regulatory framework supports progress.

“The main challenges don’t stem from regulatory burdens, but rather from uncertainty, cost, data and skills associated with AI adoption.”



