AI tools “too biased” for sentencing decisions

Report: algorithms no substitute for trials

Bias and inaccuracy render artificial intelligence (AI) algorithmic criminal justice tools unsuitable for assessing risk when making decisions on whether to imprison people or release them, according to a report by experts in the field.

They warned that while prosecuting authorities in some US jurisdictions increasingly relied on this technology, it was all the more urgent that software engineers and designers exercised caution and humility.

Other jurisdictions could come under pressure to adopt the technology as part of efforts to reduce cost, they argued: “Lessons drawn from the US context have widespread applicability in other jurisdictions, too, as the international policymaking community considers the deployment of similar tools.”

Many contributors to the report by the independent US-based Partnership on AI (PAI) – which has over 80 members, including AI, machine learning research, and ethics specialists – accepted that only individualised judicial hearings should inform decisions on detention to achieve just outcomes.

An “overwhelming majority” agreed that “current risk assessment tools are not ready for decisions to incarcerate human beings”.

The report adds weight to recent findings by a legal academic, who argued for transparency in AI, and a large-scale European study, which argued that a human-centred approach to AI development was vital to maintain public trust in the technology.

PAI recommended that developers of criminal justice risk assessment tools should adopt 10 minimum requirements that addressed issues such as technical accuracy, bias and validity, and the transparency and accountability of the systems in which life-changing decisions were made.

One expert, Andi Peng, AI resident at Microsoft Research, said: “As research continues to push forward the boundaries of what algorithmic decision systems are capable of, it is increasingly important that we develop guidelines for their safe, responsible, and fair use.”

Another, Logan Koepke, senior policy analyst at Upturn, an organisation which “promotes equity and justice in the design, governance, and use of digital technology”, said: “This report… highlights, at a statistical and technical level, just how far we are from being ready to deploy these tools responsibly.

“To our knowledge, no single jurisdiction in the US is close to meeting the 10 minimum requirements for responsible deployment of risk assessment tools detailed here.”

The PAI report was prompted by a California Senate bill, which would mandate the use of statistical and machine learning risk assessment tools for pre-trial detention decisions.

It concluded: “PAI believes standard setting in this space is essential work for policymakers because of the enormous momentum that state and federal legislation have placed behind risk assessment procurement and deployment…

“For AI researchers, the task of foreseeing and mitigating unintended consequences and malicious uses has become one of the central problems of our field.

“Doing so requires a very cautious approach to the design and engineering of systems, as well as careful consideration of the ways that they will potentially fail and the harms that may occur as a result.

“Criminal justice is a domain where it is imperative to exercise maximal caution and humility in the deployment of statistical tools.”

