
Assessing the Assessments

Lessons From Early State Experiences In The Procurement And Implementation of Risk Assessment Tools

Published by Christopher Bavitz, Sam Bookman, Jonathan Eubank, Kira Hessekiel, and Vivek Krishnamurthy

For state and local officials, considering the development, procurement, implementation, and use of Risk Assessment (RA) tools can be a daunting endeavor.

This report provides context for those making these decisions. It begins with brief case studies of four states (Kentucky, Wisconsin, California, and Pennsylvania) that adopted, or attempted to adopt, such tools early on, and describes their experiences.

It then draws lessons from these case studies and suggests questions that procurement officials should ask of themselves, of the colleagues who call for the acquisition and implementation of these tools, and of the developers who create them.

This report concludes by examining existing frameworks for technological and algorithmic fairness.

The authors offer a framework of four questions that government procurers should ask when adopting RA tools. Drawing on the experiences of the states studied, the framework offers a way to think about accuracy (the RA tool’s ability to accurately predict recidivism), fairness (the extent to which an RA tool treats all defendants fairly, without exhibiting racial bias or discrimination), interpretability (the extent to which an RA tool can be interpreted by criminal justice officials and stakeholders, including judges, lawyers, and defendants), and operability (the extent to which an RA tool can be administered by officers in police departments, pretrial services, and corrections).
