Artificial intelligence-backed lawtech has the potential to improve access to justice but also carries the danger that automated law will be put to harmful use, forcing regulators to step in, a global innovation charity has warned.
Nesta, which is working with the Solicitors Regulation Authority (SRA) to identify and support transformative AI legal technology, backed by a £700,000 government grant, said the lesson of technology developments elsewhere was that such innovations had a “dark side”.
For instance, the advent of cheap or free, anonymous internet communication had also brought with it a “toxic social media culture of abuse”.
The authors, Olivier Usher and Chris Gorst, both senior members of Nesta’s challenge prize centre, which oversees rewards for innovation, wrote in a blog: “When speech is free, all speech flourishes, including hate speech.”
From a lawtech point of view, they wrote, it could both enable access to justice and create a “less palatable future”.
They said it could involve “the silencing of #MeToo activists with an avalanche of libel lawsuits; honest tradesmen ripped off by an automatic lawsuit over every invoice; online bullies spinning up endless court cases against their enemies in order to intimidate them into submission; patent trolls automating their hunt for genuinely innovative companies to exploit”.
However, speaking to Legal Futures, the authors said they were hopeful that the practice of law, as a highly regulated profession, might escape such negative developments, provided regulators were vigilant and willing to act proactively where necessary.
Mr Usher said the risk of lawtech being used for ill would only come about if “the regulators completely wash their hands” of acting to stop it.
He highlighted the importance of ‘safe spaces’ for innovation to be tested, such as the Financial Conduct Authority’s (FCA) regulatory sandbox.
Earlier this year the SRA announced it would simplify its system for granting waivers to regulations in order to promote innovation, and formalise its ‘innovation space’ initiative, which is comparable to the FCA’s sandbox.
The space includes a guarantee that the SRA will take no enforcement action if innovations bring a firm into technical breach of its rules.
Mr Gorst said it was already possible to launch cases to harass people, and regulators had this in mind when setting conduct rules. “I wonder how new a problem this would be by virtue of the fact it could be somewhat more automated?” he asked.
He continued: “The really exciting opportunity is… technology can help people navigate their way through the system, help them avoid needing recourse to law where it’s not really necessary, help them with a guided pathway through the system… [and help] them to represent themselves in legal situations where the cost of a lawyer might be prohibitive.
“We think the space of opportunity seems really large, but we should also be mindful of the risks, and regulators need to be mindful of the risks as well.”
It is understood the SRA is in the process of tendering for its AI project.