Generative artificial intelligence (AI) will not replace lawyers, but lawyers who use it will replace those who do not, the head of lawtech at a leading City law firm has predicted.
Chris Tart-Roberts said the extensive testing of the technology that his firm, Macfarlanes, has carried out over the past year showed it was currently accurate about 50-60% of the time on average.
Speaking at a briefing on generative AI this week, he suggested that the hype around AI was actually “holding us back” because it was “scaring and confusing people”, as well as sometimes disappointing them when they used it.
Mr Tart-Roberts, who is also Macfarlanes’ chief knowledge and innovation officer, listed three lessons from the testing done to date – one of the pilots it ran involved 15% of the firm’s fee-earners.
“The reality with this technology is that it is a long way from being perfect. Anything you get out of it requires an awful lot of checking,” he said.
The firm’s assessment was that AI was “probably 50-60% accurate”.
“The second thing that becomes very clear when you start using this technology in anger is that it’s not sentient. It’s a really obvious point but clearly it does not have human experience, it cannot demonstrate empathy, it can’t quantify risk, it can’t understand commercial circumstances or the bargaining power of the sort of clients that we act for.
“When you see people using the technology in the context of their day-to-day work, you realise what a big part those things play in what lawyers actually do.”
The third finding was that, at the moment, the technology “is only useful in relation to certain, very specific things”.
Macfarlanes has found it “very good at summarising things”, such as documents, cases or pieces of legislation, “and it’s pretty good at doing some first cuts of emails and notes”.
It could “do a decent job at research” but not legal research. Mr Tart-Roberts explained that AI really struggled with technical legal analysis, where its accuracy fell well below the 50-60% average. For example, in analysing a court ruling, it could not differentiate between the decisions cited in a judgment, such as one at first instance and one on appeal.
“Admittedly the technology is developing at some pace and those areas where it can help will increase but right now it’s a little bit of a drop in the ocean,” he continued, saying that when the firm’s lawyers were asked at the end of a pilot how often they thought they would use this tool in practice, the most common answer was once a week.
Mr Tart-Roberts went on: “Right now, we have a technology that is incredibly impressive but has a long way to go and I would propose that the right way for us to think of it as a firm, as individuals and as a profession is that [it has] a huge amount of potential – but as a support tool. We really need to be clear about that.”
He observed that lawyers already have access to a lot of support technologies, such as know-how and precedents, and “those things have not replaced our lawyers, they’ve augmented them”. Generative AI “will be no different from that”, he forecast.
Firms needed to manage their lawyers’ understanding of the technology: “We need to frame it in that way so lawyers are not threatened by it, don’t misunderstand it and are not disappointed by it.”
At the same time, while it was difficult to predict what the future would bring, Mr Tart-Roberts listed three things “we can say right now with certainty”.
First was that generative AI is “here to stay”. He said: “When you get your hands on it, it is really, really impressive. The genie is out of the bottle. We will see, over probably the next three years, that this technology will become ubiquitous across the profession. We are already seeing the start of that.”
He pointed out how it was already being implemented by existing technology suppliers to the profession, like LexisNexis, Practical Law and Microsoft.
“We will also see some firms, like us, that will invest in developing technologies themselves and looking at how they can use these open-source models within their firm on their own datasets to do very specific things that the organisation needs.
“We need to start planning for those things now, exploring those use cases, understanding what the technology is good at and not good at, establishing our checks and balances, and establishing good governance.”
There also needed to be a discussion about wider issues, such as regulation, data and AI’s environmental and societal impacts.
He said legal regulators too would need to provide “more clarity and guidance”, particularly around client protection.
The second certainty was that the profession needed to widen its skillset. “We are going to need more legal technologists, more AI specialists, and more data and computer science specialists. We are going to need to build up our expertise in relation to things like prompt engineering, which six months ago nobody knew anything about.”
Third, he said: “Whilst I do not think that AI is going to replace lawyers any time soon, I do absolutely think that lawyers who use AI in the future will replace lawyers who don’t use AI.”