There has never been a more complex time to be accused of a crime. For good or for ill, advances in predictive analytics technology are playing a role in how judges set bail, how judges sentence, and whether parole will be granted.
Whether this is good news or bad news depends on your specific case, and how the technology is being applied.
In the Defendant’s Favor: Bail Algorithms
Bail reform is a hot topic here in New York. Overcrowded jails, clogged dockets, and skyrocketing bail amounts mean people can sit in jail for years without ever seeing the inside of a courtroom, despite their constitutional right to a speedy trial.
So algorithms that make granting and setting bail a "fairer and more even-handed process" could be incredibly important to restoring a little justice to the system. Algorithms that measure a defendant's risk of either skipping town before trial or committing another violent crime have allowed some cities to cut jail populations by up to 16% without moving the pre-trial crime rate at all.
In short, this is an algorithm that’s working well, and in favor of people accused of crimes.
Working Against the Defendant: COMPAS and Similar Programs
On the flip side are algorithms that measure how likely a defendant is to reoffend. COMPAS is one of the most popular of these programs, and its scores now inform both judges' sentencing decisions and the decisions of parole boards around the nation.
Use of the COMPAS score is proving to be far more problematic, perhaps even unconstitutional. COMPAS measures a defendant's likelihood of recidivism. In State v. Loomis, the defendant challenged the constitutionality of the algorithm on the grounds that its proprietary nature prevented him from challenging the accuracy of his score, that it denied him an individualized sentence, and that it improperly took gender into account.
The Supreme Court of Wisconsin ruled that the use of the score was constitutional, and the Supreme Court of the United States officially declined to hear the case on June 26, 2017.
Critics have also raised concerns about whether judges really understand the algorithms, or their limitations, well enough to be expected to use them in a just and fair manner.
The secretive nature of these tests and the inability to launch an effective appeal of their results is a huge concern, and one we expect will continue to play out in courts for a long time to come.
It is also worth noting that the test is known to look at past convictions. But given the assembly-line nature of plea bargaining across America, driven by overworked, underpaid, and overloaded public defenders, people are more likely to be convicted of something than acquitted, even if they are completely innocent. This means the COMPAS algorithm, and others like it, may already contain fatal flaws because they fail to account for these unfortunate realities of our legal system.
As the world of criminal law gets more complex, the need for strong private legal representation grows.
The introduction of big data into our courts gives you, as a defendant, real reasons to worry.
If you’ve been accused of a crime, we can help. Give yourself your best shot against overzealous law enforcement agencies, iffy algorithms, and persistent prosecutors by giving us a call today.