Panel urges regulation of algorithms used in criminal justice system


Blacklaws: Consensus rooted in rule of law needed

A year-long study of the use of computer algorithms in the criminal justice system has recommended creating a national register to bring openness, expose built-in biases, and ensure public trust.

It joined a growing chorus of voices calling for the processing of ‘big data’ in sensitive areas of law involving matters of liberty to be brought under regulation.

The Law Society’s technology and law policy commission found that algorithms had been used by the police and others in a haphazard way to assist decision-making, and that oversight was essential to ensure the efficiency benefits of the technology were rooted in the rule of law.

It found an absence of transparency and centralised coordination in the use of algorithms to crunch data, which presented a danger to human rights.

The society formed the commission last June to examine the ethical and moral risks that flowed from the growing use of algorithms in criminal justice, for decisions such as where to deploy police, who to release from prison on parole, and which visa applications to scrutinise.

It interviewed more than 75 experts, reviewed more than 82 submissions of evidence and other documents, and held public evidence sessions.

Among several recommendations, the commission – whose members included society president Christina Blacklaws and academic professors – said data protection laws should be strengthened to cover algorithms, and their use “must allow for maximal control, amendment and public-facing transparency, and be tested and monitored for relevant human rights considerations”.

Accompanying the report, an interactive map of England and Wales showed the widespread use of algorithms by police forces.

Among instances from all over the country, it recorded an ongoing Metropolitan Police trial of predictive policing that used crime data to calculate which areas needed a higher police presence.

Other examples included facial recognition technology, individual risk assessment, case management, and algorithms used in visa processing.

It said many deployments of the technology were “in a pilot or experimental stage” but their potential for “unintended and undesirable side effects” merited proper oversight.

Rather than individual bodies and agencies deciding what was appropriate, “civil society, academia, technology firms and the justice system more broadly” should be involved, the report insisted.

The commission was particularly concerned that some of the systems now operating – such as facial recognition in policing – lacked a “clear and explicit lawful basis”.

It said that “high-stakes decisions and measures taken in the justice system demand extremely careful deployment”, yet it concluded: “At the most basic level [we have] found a lack of explicit standards, best practice, and openness or transparency about the use of algorithmic systems in criminal justice across England and Wales.”

It recommended a national register of algorithmic systems be created “as a crucial initial scaffold for further openness, cross-sector learning and scrutiny”.

Nonetheless, it concluded: “The [UK] has a window of opportunity to become a beacon for a justice system trusted to use technology well… in line with the values and human rights underpinning criminal justice.

“It must take proactive steps to seize that window now.”

In a statement, Ms Blacklaws said a “consensus rooted in the rule of law, which preserves human rights and equality, to deliver a trusted and reliable justice system now and for the future” was needed.

She added: “Police, prisons and border forces are innovating in silos to help them manage and use the vast quantities of data they hold about people, places and events.

“Complex algorithms are crunching data to help officials make judgement calls about all sorts of things – from where to send a bobby on the beat to who is at risk of being a victim or perpetrator of domestic violence; who to pick out of a crowd, let out on parole or which visa application to scrutinise.

“While there are obvious efficiency wins, there is a worrying lack of oversight or framework to mitigate some hefty risks – of unlawful deployment, of discrimination or bias that may be unwittingly built in by an operator.

“These dangers are exacerbated by the absence of transparency, centralised coordination or systematic knowledge-sharing between public bodies. Although some forces are open about their use of algorithms, this is by no means uniform.”

The report’s key recommendations were:

  • Oversight: A legal framework for the use of complex algorithms in the justice system;
  • Transparency: A national register of algorithmic systems used by public bodies;
  • Equality: The public sector equality duty should be applied to the use of algorithms in the justice system;
  • Human rights: Public bodies must be able to explain which human rights are affected;
  • Human judgement: There must always be human management of complex algorithmic systems;
  • Accountability: Public bodies must be able to explain how specific algorithms reach specific decisions;
  • Ownership: Public bodies should own software rather than renting it from tech companies.


