by Ed Lyon
A computer-era dictum states, “To err is human, but to really foul things up requires a computer.” Nowhere does that folksy axiom ring so true as in the criminal injustice system and its growing dependence on AI and algorithms.
Not all algorithm-driven artificial intelligence (“AI”) programs are bad things. In recent efforts to reform cash bail systems across the nation, some well-thought-out and properly applied algorithm programs have assisted logical and just decisions as to who should be released from pre-trial detention and under what conditions.
In all areas related to or even entirely dependent upon computers for decision making, especially where a person’s life might well hang in the balance, the computer programming acronym GIGO should always be observed. Although “Garbage In, Garbage Out” has come to refer to computer users, the term and its GIGO acronym originated as a reference to bad programming. Just how bad that programming can be was recently discovered by tenacious attorney Rachel Cicurel, a public defender in the nation’s capital.
In 2017, Cicurel was defending a juvenile whose name and offense remain sealed due to his age. He was referred to publicly only as “D.” His offense must not have been all that serious, however, because the prosecutor readily agreed to allow D to enter into a plea agreement for a probated sentence. After the parties agreed, but prior to its acceptance in court, a computer algorithm-driven AI program stopped the deal.
Several input factors in D’s young life, some of them beyond his control, placed him in a category of “high risk” of committing future criminal activity. One of the happenstance factors was D’s residing in government-subsidized housing. Another was D’s “negative attitudes toward police.” This prompted government attorneys to renege on the deal and insist that D be sentenced to juvenile detention. Sure, give D plenty of positive reinforcement to support his already negative attitude toward police.
Cicurel and her team challenged this algorithm-driven AI program. Tracing it to its source, she found it was a thesis written by a 20-year-old graduate student. The paper had never been examined, validated, or accepted by any scientific community. In other words, the algorithm-driven AI program was Garbage In and was producing Garbage Out. When informed of this, the trial judge invalidated the test. Presumably, D was granted the originally offered probation. Not all defendants in all jurisdictions are as fortunate as D was, though.
Tool Prone to Inaccuracy
Paul Zilly was convicted of stealing a lawn mower in 2013 in Wisconsin. The parties had settled on a plea agreement. But when the presiding judge considered an algorithm-driven AI program similar to the one that nearly torpedoed D — this one called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) — he rejected the agreement and imposed a prison sentence twice as long as the parties had originally agreed upon.
COMPAS had predicted Zilly to be in a high-risk category to commit future crimes.
Journalist Julia Angwin of ProPublica investigated COMPAS, finding results similar to those discovered by Cicurel. COMPAS score results on 7,000 defendants in Broward County, Florida, revealed accurate future crime predictions in only one of five cases. Angwin also discovered COMPAS was twice as likely to falsely forecast future criminal activity by black defendants as by white defendants. Even more disturbing, these algorithms’ equivalents to source code are protected as proprietary, nearly ensuring that no one will be able to discover whether the garbage that was input into the program is causing the garbage it outputs.
For algorithm-driven AI programs to be truly beneficial in criminal justice, their source coding must be transparent and vetted by the scientific and criminal justice communities working together to ensure their applicability and reliability.