When Algorithms Determine Whether Criminal Defendants Will Spend Decades in Prison
Many people do not realize the significant role that criminal sentencing AI already plays in our criminal justice system. Algorithms that estimate whether a defendant is likely to commit a future crime now affect sentencing and other major decisions made by prosecutors and judges, such as the possibility of probation.
What is perhaps even more troubling is the underlying methodology: the algorithms and reports that these life-altering decisions rest on involve factors that are racially and otherwise biased, for example, whether a defendant has a “negative attitude towards the police” or lives in government-subsidized housing. In addition, much of this assessment technology has not been validated by any scientific or judicial organization; in some circumstances, defendants’ fates have been determined by a tool whose only validation was a college thesis paper.
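To see how factors like these can drive a score, consider a minimal sketch in Python. Every factor name and weight below is invented purely for illustration; COMPAS’s actual formula has never been disclosed, as discussed below. The point is only that a simple weighted sum can convert subjective or proxy questions into a single, official-looking number:

```python
# Hypothetical illustration only: the factor names and weights below are
# invented, since the real tool's formula is proprietary and not public.

# Assumed questionnaire answers for one defendant.
defendant = {
    "prior_arrests": 1,
    "negative_attitude_toward_police": True,  # subjective factor
    "lives_in_subsidized_housing": True,      # proxy for poverty and race
}

# Invented weights for the sake of the example.
WEIGHTS = {
    "prior_arrests": 1.0,
    "negative_attitude_toward_police": 2.0,
    "lives_in_subsidized_housing": 1.5,
}

def risk_score(answers: dict) -> float:
    """Sum each factor's value times its weight."""
    return sum(WEIGHTS[key] * float(answers[key]) for key in WEIGHTS)

print(risk_score(defendant))  # 4.5 -- driven mostly by the two proxy factors
```

Note that in this toy example, the defendant’s single prior arrest contributes less to the score than where he lives and how he answered a question about the police.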
The reality is that criminal risk assessment tools like this one are being used across the country, and it is fair to say that they play an often devastating role in our criminal justice system, shaping bail, policing, sentencing, and parole. In effect, a number of cities and states have placed Americans’ fates in the hands of algorithms that are, in many ways, simply mathematical expressions of underlying biases.
“Machine Bias”: Predicting Future Criminals
Take, for example, one such tool known as COMPAS, a risk assessment program created by a private company and the subject of a recent ProPublica exposé. Judges have consulted the program to conclude that defendants charged with minor crimes are at high risk of committing future violent crimes and, as a result, have rejected plea deals and imposed sentences that doubled a defendant’s prison time, all on the basis of this one program.
A closer look at COMPAS is nothing short of disturbing: comparing the risk scores assigned to 7,000 people arrested in one county with those same people’s criminal histories over the following years revealed that the scores were remarkably unreliable in forecasting crime. Only 20 percent of the people predicted to commit violent crimes actually did so, and the algorithm was twice as likely to falsely flag black defendants as future criminals as it was white defendants.
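What “twice as likely to falsely flag” means can be made concrete with a short Python sketch. The handful of records below are made up for illustration (the actual analysis used thousands of real arrest records); the false positive rate is simply the share of people who did not go on to reoffend but were nonetheless labeled high risk, computed separately for each group:

```python
# Toy stand-in for the real data. Each record is
# (group, predicted_high_risk, reoffended_within_two_years).
records = [
    ("black", True,  False), ("black", True,  True),
    ("black", True,  False), ("black", False, False),
    ("white", True,  False), ("white", False, False),
    ("white", False, True),  ("white", False, False),
]

def false_positive_rate(group: str) -> float:
    """Share of non-reoffenders in `group` who were flagged high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(group):.0%}")
# With this toy data: black 67%, white 33% -- the same two-to-one
# disparity reported for the real scores.
```

A measure like this also shows how a tool can appear reasonably accurate overall while still making its mistakes disproportionately against one group.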
Another concern with programs like COMPAS is the lack of transparency: to protect their proprietary interests, the companies that develop these tools are not required to share any information about how the scores are calculated. This means that neither the public nor the courts have any idea how a given score was computed.
Is There a Positive Side?
Still, some argue that these algorithms can play a positive role in the system. For example, some states have been willing to eliminate cash bail only when judges can first consult a risk assessment algorithm to determine whether a defendant poses a risk of committing future crimes. On this view, the algorithms give states an opportunity to reduce both incarceration and overall crime by distinguishing defendants who are likely to be more dangerous to the public from those who are not.
Contact Our Criminal Justice Attorneys
If you have been arrested, contact our experienced New City criminal defense attorneys at the office of Phillip J. Murphy to ensure that you are not the victim of bias in our criminal justice system.
Resources:
theatlantic.com/ideas/archive/2019/06/should-we-be-afraid-of-ai-in-the-criminal-justice-system/592084/
propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing