Robert D. DiDio & Associates - Criminal Defense

Recidivism prediction software used by courts may be biased

Jun 1, 2016 | Criminal Appeals

There is little doubt that bias – both conscious and unconscious – negatively impacts outcomes in the criminal justice system. America’s long-standing problems with race and class discrimination are perhaps the most pressing biases plaguing criminal justice.

Because it can be very difficult to detect and control for human bias, it might seem like a good idea, in theory, to leave major decisions (bail, sentencing, parole) to the cold logic of a computer. As it turns out, many parts of the country already use computer software to predict a defendant’s risk of recidivism. Unfortunately, there is evidence to suggest that even software can be racially biased.

The investigative journalists at ProPublica recently wrote about the increased use of “risk assessment” software in courts and correctional facilities nationwide. These programs look at a host of factors about each defendant (drawn from prior criminal records, questionnaires, etc.) and then assign that person a score. The higher the score, the more likely that person is to reoffend – at least according to the software.
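Because vendors do not disclose how their scores are actually computed, any concrete illustration has to be invented. The short Python sketch below shows the general shape of such a tool – weighted factors summed into a single 1-10 score – with entirely made-up factors and weights, not the real model behind any commercial product.

    # Hypothetical sketch only: real risk tools keep their factors and
    # weights secret, so every name and value below is invented.
    FACTOR_WEIGHTS = {
        "prior_convictions": 0.9,        # from criminal-record data
        "age_under_25": 1.2,             # from an intake questionnaire
        "unstable_employment": 0.7,      # from an intake questionnaire
        "prior_failure_to_appear": 0.8,
    }

    def risk_score(answers: dict) -> int:
        """Map a defendant's factor values to a 1-10 risk score."""
        raw = sum(weight * answers.get(factor, 0)
                  for factor, weight in FACTOR_WEIGHTS.items())
        # Clamp the weighted sum into the 1-10 band such tools report.
        return max(1, min(10, round(raw)))

    # Example: two prior convictions, under 25, one failure to appear.
    print(risk_score({"prior_convictions": 2, "age_under_25": 1,
                      "prior_failure_to_appear": 1}))  # prints 4

Even in this toy version, the opacity problem is visible: unless the weights and factors are published, a defendant has no way to know why the score came out the way it did.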

But there are serious problems with the use of such programs. For starters, many counties and states began using the software before it had been extensively tested; there was little data on the accuracy or reliability of these algorithms.

What testing did occur was often performed by the very individuals or for-profit companies that developed the programs. Moreover, some of these companies will not disclose how their software actually works (or what factors it considers), saying that the technology is proprietary.

When ProPublica conducted its own investigation into risk assessment software, it found that black defendants were often given higher risk scores than white defendants convicted of similar crimes.

The software proved to be wrong at roughly equal rates for black and white defendants, but in opposite directions. White defendants were more likely to be mislabeled as low risk (scored low but went on to reoffend), while black defendants were more likely to be mislabeled as high risk (scored high but did not reoffend).
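This distinction is easier to see with numbers. The sketch below uses invented figures, not ProPublica’s actual data, to show how two groups can have the same overall error rate while the errors run in opposite directions.

    # Invented numbers (not ProPublica's data). Each record is a pair:
    # (labeled_high_risk, actually_reoffended).
    group_a = ([(True, False)] * 40 + [(True, True)] * 30 +
               [(False, True)] * 10 + [(False, False)] * 20)
    group_b = ([(True, False)] * 10 + [(True, True)] * 30 +
               [(False, True)] * 40 + [(False, False)] * 20)

    def error_profile(records):
        """Count the two ways a risk label can be wrong."""
        false_high = sum(1 for hi, re in records if hi and not re)
        false_low = sum(1 for hi, re in records if not hi and re)
        overall = (false_high + false_low) / len(records)
        return overall, false_high, false_low

    for name, grp in (("Group A", group_a), ("Group B", group_b)):
        overall, fh, fl = error_profile(grp)
        print(f"{name}: {overall:.0%} wrong overall; "
              f"{fh} false high-risk, {fl} false low-risk")
    # Both groups come out 50% wrong overall, yet Group A's errors are
    # mostly false high-risk labels and Group B's mostly false low-risk.

A tool can therefore look evenhanded by one measure (overall accuracy) while systematically disadvantaging one group by another (who bears the cost of each kind of mistake).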

Risk assessment software is being used to make important decisions at every step in the criminal justice process, including setting bond amounts, determining sentences and granting parole. But should that be the case? And, if the software can be both wrong and biased, should we continue to let it be used as widely as it is?
