Justice-Focused Algorithms Need to Show Their Work, Experts Say

The Justice in Forensic Algorithms Act aims to ensure that when algorithmic analyses are used as evidence in court, defendants can learn how the tools reached their conclusions and can contest the results.

A panel of legal and artificial intelligence (AI) experts made the case today that algorithms used as evidence in court and to influence other criminal justice decisions should be opened up for scrutiny by the people they impact.

The idea is to ensure that criminal defendants can understand the workings of algorithms whose findings are used as evidence against them and question the strength and relevance of the algorithms’ reasoning and conclusions.

“With humans, we don’t just say ‘OK, this expert in this white lab coat, they’re pretty good, let’s just let them say anything and you don’t get to cross-examine them, you don’t get to talk to them before trial, you don’t get to find out why they’re saying what they’re saying,’… We subject them to a lot of adversarial scrutiny, even after a judge has deemed them reliable enough for the juries to hear them,” said Berkeley Law Professor Andrea Roth during a Brookings Institution panel discussion.

The same should be true for software, Roth argued.

Law enforcement agencies use so-called forensic algorithms to examine crime scene evidence, for example to compare DNA samples and fingerprints from a scene against those of potential culprits or to identify faces in photos. The technology aims at “improving the speed and objectivity” of this work, the Government Accountability Office (GAO) wrote in a 2021 report on the practices.

But the tools have limitations that are not always made clear. The GAO report said analysts and investigators have run into challenges around bias, “misuse” and “difficulty interpreting and communicating results.”

Forensic algorithms are also not required to meet any federal standards and are often owned by private companies that may wield trade secret privileges to block anyone from seeing how their proprietary algorithms work, said Rebecca Wexler, assistant professor of law and co-director of the Center for Law and Technology at Berkeley, during the discussion.

A federal bill aims to change that, however, and ensure defendants can see the algorithms used against them — even if they're proprietary.

THE MATH BEHIND THE CURTAIN


Algorithms may be used at various stages of the criminal justice system, from informing pretrial and parole decisions to providing evidence assessments used during trials.

Glenn Rodriguez is currently the program director for the Horizon Juvenile Center and co-director of Youth Services at the Center for Community Alternatives. He has firsthand experience confronting the opacity of criminal justice algorithms, having encountered one when he sought parole after serving a 27-year sentence for an offense committed when he was 16.

Rodriguez had spent the last 15 years of that sentence taking advantage of productive opportunities, including earning a college degree, becoming a certified dog trainer and counseling at-risk youth, he said. Then his bid for parole was rejected when an assessment algorithm called COMPAS ranked him as high risk of recidivating.

Without an explanation of the algorithm’s calculations, Rodriguez resorted to interviewing 100 other inmates about their results to estimate how the algorithm weighted different factors to arrive at its scores.
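
As a rough illustration of what that kind of reverse engineering entails, the sketch below fits a simple linear model to scores reported by peers and reads off approximate weights. The feature names and numbers are invented for illustration; the real COMPAS instrument relies on a proprietary questionnaire and scoring method.

```python
# Hypothetical sketch: approximating how an opaque risk score weights its
# inputs by fitting a linear model to scores observed across many people.
# All feature names and values below are invented for illustration.
import numpy as np

# Each row: [age at first offense, prior incarcerations,
#            years since last infraction, program completions],
# gathered by surveying peers about their own records.
surveyed_features = np.array([
    [16, 2, 10, 4],
    [22, 0, 3, 1],
    [19, 3, 1, 0],
    [30, 1, 7, 2],
    [17, 4, 12, 5],
    [25, 1, 5, 3],
    [18, 2, 2, 1],
], dtype=float)

# The risk score each person reported receiving (e.g., 1 = low, 10 = high).
reported_scores = np.array([7.0, 4.0, 9.0, 3.0, 6.0, 4.0, 8.0])

# Add an intercept column and solve score ≈ X @ weights by least squares,
# giving a crude estimate of how each input is weighted.
X = np.hstack([surveyed_features, np.ones((len(surveyed_features), 1))])
weights, *_ = np.linalg.lstsq(X, reported_scores, rcond=None)

names = ["age_at_first_offense", "prior_incarcerations",
         "years_since_infraction", "program_completions", "intercept"]
for name, w in zip(names, weights):
    print(f"{name}: {w:+.2f}")
```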

He was ultimately able to argue successfully against the algorithm’s determination and win parole.

“If it were up to COMPAS, I would still be in prison today,” Rodriguez said. “And here I am, five years later. I've been employed since two weeks out of prison; I received five promotions; I’m currently sitting in a senior leadership position at my organization … None of this would have been possible had it been [up to] COMPAS.”

Few individuals would be able to achieve this feat of reverse engineering, of course, and a federal bill aims to help defendants and their counsel get access to such information without being held back by companies’ reluctance to let outsiders see their algorithms and training data.

JUSTICE IN FORENSIC ALGORITHMS


Rep. Mark Takano, D-CA, spoke during the panel to explain and promote his Justice in Forensic Algorithms Act. That policy would block private companies and developers from “us[ing] trade secrets privileges to prevent defense access to evidence in criminal proceedings.”

The act would also restrict federal law enforcement, and the crime labs serving it, to using only algorithms that have been tested against standards to be created under the bill. Those standards would aim to ensure the tools are fair and accurate enough to be used for such purposes, including by assessing whether they produce disparate impacts on certain demographic groups.
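
As a hedged illustration only, and not a test specified in the bill, one common way such a check is done in practice is to compare the rate of adverse algorithm outcomes across demographic groups; a ratio well below 0.8 between groups is the traditional red flag under the so-called four-fifths rule. The sketch below uses invented data and group labels.

```python
# Illustrative sketch of a disparate impact check: compare the rate of
# adverse outcomes (e.g., "flagged high risk" or "reported as a match")
# across demographic groups. Data and group labels are hypothetical.
from collections import defaultdict

# Hypothetical records: (group label, adverse outcome?)
records = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])        # group -> [adverse, total]
for group, adverse in records:
    counts[group][0] += int(adverse)
    counts[group][1] += 1

rates = {g: adverse / total for g, (adverse, total) in counts.items()}
ratio = min(rates.values()) / max(rates.values())

print(rates)                                 # per-group adverse-outcome rates
print(f"impact ratio: {ratio:.2f}")          # below ~0.80 suggests disparity
```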

Courts also would only be able to admit forensic algorithm analyses as evidence if the tools had been vetted against these standards.

“My position is not that I’m against these algorithms … but we need to make sure that defendants in a court of law and their attorneys are able to exercise, under the Constitution, due process rights to a fair trial,” Takano said.

Trade secret privileges are intended to help companies keep their intellectual property out of the hands of their competitors. While defendants aren’t likely to fall into this category, it’s possible they’d solicit third parties to help them examine the forensic algorithms being used against them, sparking algorithm owners’ concerns, Takano said.

But there are ways to deal with that, Wexler said. Protective orders are used to protect trade secrets during civil cases and could be used in criminal cases as well, she said.

The stakes, she noted, are also not the same:

“At a minimum, somebody facing incarceration or death should get the same or better access to information than somebody that's defending in a contract dispute,” Wexler said.

A NARROW TOOL


Visibility into algorithms is essential in part because people tend to overestimate what forensic algorithms can do and what their conclusions mean, said Rediet Abebe, assistant professor of computer science at the University of California, Berkeley.

Where AI systems shine is in running calculations rapidly, she said. Developers define a very specific problem and determine the variables the tools consider and the data they draw on. That specification won’t capture the full context of the broader situation.

“There's this sort of assumption that just because algorithms are able to do something very narrowly specified, very quickly, that that makes them also very good at looking at a broader set of things,” Abebe said.

Today’s defendants may find themselves arguing against algorithm-issued recommendations with no clear idea of how those conclusions were reached or whether the algorithms have been tested for accuracy in cases like theirs.

A tool designed to compare DNA from a crime scene against that of suspects may not have been validated for situations in which the sample contains five or more individuals’ DNA, for example, Abebe said. Or the algorithm might be less accurate at identifying people over age 45 — something that juries ought to be made aware of.

“It could be the case that the software is like 95 percent accurate, right? But really, in my particular case, it's not — maybe it's 60 percent accurate in the case that you're using it,” Abebe said.
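
A small, hypothetical sketch of Abebe’s point: an aggregate accuracy figure can mask much weaker performance on the subgroup that matters in a particular case. The numbers and subgroup below are simulated for illustration.

```python
# Hypothetical sketch: a tool that looks ~95% accurate overall can be far
# less accurate on a hard subgroup (e.g., complex DNA mixtures or older
# subjects). All data here is simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# 10% of validation samples fall into a "hard" subgroup.
hard_subgroup = rng.random(n) < 0.10
truth = rng.integers(0, 2, n)                 # ground truth: match / no match

# Simulate a tool that is right ~99% of the time on routine cases
# but only ~60% of the time on the hard subgroup.
correct = np.where(hard_subgroup, rng.random(n) < 0.60, rng.random(n) < 0.99)
prediction = np.where(correct, truth, 1 - truth)

overall = (prediction == truth).mean()
subgroup = (prediction[hard_subgroup] == truth[hard_subgroup]).mean()

print(f"Overall accuracy:  {overall:.0%}")    # roughly 95%
print(f"Subgroup accuracy: {subgroup:.0%}")   # roughly 60%
```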