A Supreme Court justice has called for the creation of an expert commission to act as “a sort of independent regulator” of algorithms.
Lord Sales said the commission, “staffed by coding technicians, with lawyers and ethicists to assist them”, would also be a public resource for government, Parliament, the courts and the public.
Lord Sales, who joined the Supreme Court in January this year, said models already existed in academia and civil society where tech experts and ethicists had been brought together, such as the International Digital Health and AI Research Collaborative, established last month.
“Contributions from civil society are valuable, but they are not sufficient. The issues are so large, and the penetration of coding into the life of society is so great, that the resources of the state should be brought to bear on this as well.
“As well as being an informational resource, one could conceive of the commission as a sort of independent regulator, on the model of regulators of utilities.
“It would ensure that critical coding services were made available to all and that services made available to the public meet relevant standards.”
Lord Sales went on: “More ambitiously, perhaps we should think of it almost as a sort of constitutional court.
“There is an analogy with control and structuring of society through law. Courts deal with law and constitutional courts deal with deeper structures of the law which provide a principled framework for the political and public sphere.
“The commission would police baseline principles which would structure coding and ensure it complied with standards on human rights.”
Delivering the Sir Henry Brooke Lecture for BAILII on Algorithms, Artificial Intelligence and the Law, Lord Sales said the government should carry out impact assessments on algorithms, just as it did in relation to the environment and diversity.
However, government may lack “the technical capacity to do this well”, particularly where coding and design of the systems involved had been contracted out.
The judge said an expert commission on algorithms could be given access to commercially sensitive code on strict conditions that its confidentiality was protected.
He outlined a further use for the algorithm commission – to provide neutral expert evaluation in cases involving code which needed to remain confidential.
Lord Sales called for “scope for legal challenges to be brought regarding the adoption of algorithmic programs, including at the ex ante stage”, something that appeared to be happening already.
“This is really no more than an extension of the well-established jurisprudence on challenges to adoption of policies which are unlawful and is in line with recent decisions on unfairness challenges to entire administrative systems.”
In these judicial reviews, the court “would have to be educated by means of expert evidence”, and to avoid the expense and time involved in experts giving evidence on both sides, a system would be needed to refer the code for neutral expert evaluation “by my algorithm commission or an independently appointed expert”.
Since it would not be possible to hold a judicial review in every case, the algorithm commission could proactively identify cases which raised “systemic issues” and bring them together “in a composite procedure, by using pilot cases or group litigation techniques”.
Lord Sales warned that the algorithm commission would have “its own dangers, in terms of an ‘expert elite monitoring an expert elite’”, but said the dangers could be mitigated by making its procedures as open and transparent as possible.
“All this is to try to recover human agency and a sense of digital tech as our tool to improve things, not to rule us. Knowledge really is power in this area.
“Coding is structuring our lives more and more. No longer is the main grounding of our existence given by the material conditions of nature, albeit as moulded by industrial society.
“Law has been able to operate effectively as a management tool for that world. But now coding is becoming as important as nature for providing the material grounds of our existence.”