
Philadelphia Tests Automating the Bail Risk Assessment Process

by David M. Reutter

Philadelphia is using part of a $3.5 million grant to create a computerized bail-risk assessment tool. The effort is part of the city’s Reentry Project.

The MacArthur Foundation selected Philadelphia to take part in its Safety and Justice Challenge. According to Gabriel B. Roberts, spokesman for the First Judicial District of Pennsylvania, “The risk-assessment tool is just one of 19 initiatives funded by the MacArthur grant to safely reduce Philadelphia County’s jail population while also reducing racial and ethnic disparities.”

“The goal with complementing a new risk tool is to reduce or eliminate cash bail,” said Michael Bouchard, director of pretrial services for the First Judicial District. “Once we have a risk tool and once we have a model with numbers, we’ll be able to allocate our resources in the pretrial arena to provide those that are more suited for community supervision than pretrial incarceration.”

“As with other initiatives, every effort will be made to reduce racial and ethnic disparities,” Roberts said. “To that end, the model, which is still being developed, will not include any information concerning race or Zip Code.”

Some criminal-justice reform advocates, however, are concerned that this is not enough to protect black and brown people. While they acknowledge that the ultimate goal of eliminating cash bail is a good one, they worry that “computerized risk assessment tools could predict recidivism by weighing factors that serve as a proxy for race and socioeconomic status,” which would result in the incarceration of disproportionately more black and brown defendants than white ones.

Recently elected district attorney Larry Krasner agreed, noting: “There is a real danger that the components going into the risk assessment are proxies for race and for socioeconomic status.”

“If they’re in the process of designing a tool, part of that process should be directly working with people,” said Hannah Sassaman, policy director of the Media Mobilizing Project, who was awarded a fellowship to study risk assessment models. “Not just letting policy-makers make political and moral decisions, but also everyday people from the poorest big city in America should be a structural part of that conversation.”

Automating the processes in the criminal justice system where human prejudices lie seems like an admirable goal, but using models to do so is fraught with danger: a model’s output is only as good as the data and design choices its builders feed into it (in computer-science terms, garbage in, garbage out). Working with prominent data scientists, Philadelphia seeks to create a computerized tool that assigns defendants a label: low-, medium-, or high-risk. Bail will be based upon defendants’ classification.
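To make the concern concrete, here is a minimal hypothetical sketch of how such a scoring tool might work. It is not Philadelphia’s actual model; the inputs, weights, and thresholds are invented for illustration. It shows why critics worry that excluding race and Zip Code alone may not remove bias: the remaining inputs can still correlate with race and income.

```python
# Hypothetical illustration only -- not the tool being built for Philadelphia.
# A risk tool typically reduces a defendant's record to a numeric score,
# then maps that score to the low/medium/high label that drives the
# bail recommendation.

def score_defendant(prior_arrests: int, failures_to_appear: int) -> float:
    """Toy scoring rule: the inputs and weights are invented for illustration.
    Critics' concern: inputs such as arrest counts can act as proxies for
    race and socioeconomic status even when race and Zip Code are excluded."""
    return 0.5 * prior_arrests + 1.5 * failures_to_appear

def risk_label(score: float) -> str:
    """Map a numeric score to the low/medium/high labels described above."""
    if score < 3:
        return "low"
    if score < 7:
        return "medium"
    return "high"

if __name__ == "__main__":
    score = score_defendant(prior_arrests=4, failures_to_appear=2)
    print(score, risk_label(score))  # 5.0 medium
```

Because every step above is a design decision made by the tool’s builders, transparency advocates argue the code and the local training data should be open to public inspection.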

Activists warn that the system must be open source and completely transparent. “If my very liberty is being determined by a computer program that is invisible to me,” said Sassaman, “then I have every right to watch it, and I have every right to make sure it is trained on data that is local.” 

Source: billypenn.com
