
Privacy, Justice Reform Advocates Voice Police Tech Concerns

Legal system reform advocates say new policing technologies such as decision-making algorithms and facial recognition can exacerbate problematic practices, making them more efficient as well as more opaque.

Privacy researchers, legal system reform advocates and others raised concerns about police surveillance tools and decision-influencing algorithms during a recent Georgetown University panel.

The panel — called Cop Out: Automation in the Criminal Legal System — discussed what’s at stake for communities subjected to these tools and whether new technology is simply exacerbating long-standing issues in the criminal justice system and reinforcing disparities. The public doesn’t always have insight into algorithms’ workings or surveillance tools’ deployments, either, which can make it difficult for communities to respond, some panelists said.

The conversation around rapidly evolving police technologies is ongoing, and such technologies have proven deeply controversial. California, for one, is currently considering how it wants to approach police use of facial recognition technology. Some lawmakers are pushing to restrict use, citing fears of privacy invasion and of misidentifications leading to arrest, while other lawmakers say it’s too useful a public safety tool to give up entirely.

Decision-influencing algorithms, too, have long been disputed. Pretrial risk assessment algorithms were intended to help judges make more considered decisions, but researchers have questioned the tools’ fairness and accuracy, and whether they unduly sway judges’ thinking.

As civic leaders and policymakers consider how best to approach these kinds of tools, it’s important to understand the risks and the arguments against them, including those presented by the panel.

WHAT’S AT STAKE


Assia Boundaoui is a journalist and filmmaker who has explored FBI surveillance of Chicago’s Arab and Muslim Americans. She said that discussions of police surveillance technology often spark privacy concerns, but that many affected communities are most worried about the loss of their liberty and livelihoods.

“[For] a lot of communities that are surveilled, privacy is not the primary concern,” Boundaoui said. “People are worried about criminalization, about deportation, about losing their work because agents show up at their workplace. There are all of these harms that are erased when we talk about this just within the framework of privacy.”

Police use of facial recognition technology has been particularly controversial, with the technology more likely to misidentify nonwhite faces. All five people who have been wrongfully arrested based on the tools’ misidentifications were Black men, said Meg Foster, who moderated the panel. Foster is a Justice Fellow at the Center on Privacy and Technology, a Georgetown Law research and advocacy think tank focused on privacy and surveillance policies.

Crime forecasting algorithms intended to identify hot spots also run the risk of reinforcing disparities, because they often train on historical data. If police historically centered their attention on certain neighborhoods, that would result in more crime data for those neighborhoods, in turn prompting the algorithm to recommend greater policing in these areas, creating a feedback loop. And if the data wasn’t accurate in the first place, that creates more problems.
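
As a rough illustration of that dynamic, the sketch below simulates a simplified forecasting loop in which next year’s patrols follow last year’s recorded incidents. It is a hypothetical toy model, not a representation of any vendor’s actual product, and the neighborhoods, rates and weights are invented.

```python
# Toy model of a predictive-policing feedback loop (hypothetical, for illustration).
# Every neighborhood has the same underlying incident rate, but recorded crime
# scales with patrol presence, so an initial skew in patrols reproduces itself.
import random

random.seed(0)

TRUE_RATE = 100                       # identical underlying incidents everywhere
patrols = {"A": 6, "B": 2, "C": 2}    # historically skewed allocation (10 units total)

for year in range(5):
    # Recorded incidents depend on how many patrols are present to observe them.
    recorded = {n: sum(1 for _ in range(TRUE_RATE) if random.random() < 0.05 * p)
                for n, p in patrols.items()}
    total = sum(recorded.values()) or 1
    # "Forecasting" step: allocate next year's 10 patrol units in proportion
    # to last year's recorded data -- the skew persists and can deepen.
    patrols = {n: max(1, round(10 * c / total)) for n, c in recorded.items()}
    print(f"year {year}: recorded={recorded} next_patrols={patrols}")
```

Run over several simulated years, the heavily patrolled neighborhood keeps generating the most recorded incidents, so it keeps receiving the most patrols, even though the underlying rate is identical everywhere.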

“Surveillance and algorithmic tools create feedback loops that trap targeted individuals and their communities in a cycle of criminalization,” Foster said. “ … These algorithms rely largely on historical policing data — data that reflects decades of biased, corrupt and even unlawful policing practices and that often incorporates inaccurate and falsified records.”

Nasser Eledroos (left) and Meg Foster (right)

She pointed to Chicago’s gang database errors as one example. In 2019, Chicago’s inspector general found the database to be “filled with outdated, unreliable and incorrect information on thousands of individuals who have often been listed as gang members with little or no reasoning,” reported WTTW. The vast majority of entries were for Black and Latino residents.

Freddy Martinez is a senior researcher at the Project on Government Oversight, a nonpartisan watchdog organization focused on government wrongdoing. He said the right answer may be to remove a technology entirely.

Martinez, who is from Chicago, said his organization helped provide community members there with information about ShotSpotter in 2021, after the controversial gunshot detection tool led police to respond to an area where an officer then fatally shot 13-year-old Adam Toledo.

After receiving ShotSpotter alerts about a neighborhood, police had hurried to the area where one of the officers shot the youth, “within five minutes of the initial alert,” per The Hill. A lawsuit from the youth’s family charges that the officer had no justification. The officer now faces potential firing, with the Chicago Police Board holding a status hearing on the matter in May 2023.

“We should not be afraid to say that, ‘As part of the movement for justice, we have to deal with the technology and we have to tear it up,’” Martinez said. “… Politicians hate hearing that … but that is the solution.”

ShotSpotter has since rebranded itself as SoundThinking. A spokesperson for the organization called Toledo’s shooting “a horrific tragedy” and said that the tool was only one source drawing police to the area.

“There were multiple 911 calls to report an active shooter in addition to the ShotSpotter alerts,” the spokesperson said in an email to GovTech. “ShotSpotter alerted the Chicago Police Department to the sound and precise location of gunfire, and that was the extent of our involvement.”

The spokesperson said the tool has a 97 percent accuracy rate, with a “.5 percent false positive rate across all customers between 2019-2021.”

In 2021, ShotSpotter hired economics and statistical data consulting firm Edgeworth Economics to audit its 2019 and 2020 data to assess accuracy rates, and later to audit its 2021 data. The consultancy reported finding a 97.59 percent accuracy rate for 2019-2020 and a 97.69 percent accuracy rate for 2021. Inaccuracies could include false positives (the tool mistakenly sending out a gunfire alert when no gunfire had actually occurred), failures to detect or send alerts about gunfire that did occur, or misidentifications of the location of the gunfire. The analysis found a 0.36 percent false positive rate from 2019-2021.
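
Purely for illustration, the arithmetic behind percentages like these could look something like the sketch below; the counts are invented, and the audit’s actual definitions and methodology may differ.

```python
# Hypothetical counts, invented to show how an accuracy rate and a false
# positive rate of the kind cited might be computed; not the audit's data.
alerts_total = 10_000            # all gunfire alerts the system sent
false_positives = 36             # alerts with no confirmed gunfire behind them
gunfire_incidents = 10_000       # confirmed gunfire the system should have flagged
missed_or_mislocated = 240       # incidents it missed or placed at the wrong location

accuracy = (gunfire_incidents - missed_or_mislocated) / gunfire_incidents
false_positive_rate = false_positives / alerts_total

print(f"accuracy: {accuracy:.2%}")                        # 97.60%
print(f"false positive rate: {false_positive_rate:.2%}")  # 0.36%
```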

FALSE AUTHORITY?


Algorithms — and their numerical outputs — can seem scientific, and thus more objective or authoritative than the tools really are, Foster said. But in reality, deciding what information to feed into an algorithm, and how it should weigh each variable, is effectively a policy decision.

Risk assessment algorithms might be used to influence decision-making in areas like setting bail, prison supervision and parole. But Foster said these tools often draw on biased data sets and consider details they shouldn’t.

Alongside weighing details about defendants’ criminal records or their alleged offenses, the tools might consider information about their life circumstances such as “education and employment histories, housing stability, family relationships or community ties,” Foster said. In doing so, the tools are “ultimately assigning numeric value to social, political and context-dependent categories of information, and thereby passing off policy decisions as scientific ones,” she said.
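
A minimal sketch of the kind of scoring Foster describes might look like the following. The factors, weights and cutoff here are entirely invented, which is the point: each of those numbers encodes a policy judgment rather than a neutral measurement.

```python
# Hypothetical pretrial risk score. Every choice below -- which factors count,
# their weights, and the cutoff -- is a policy decision dressed up as a number.
WEIGHTS = {
    "prior_convictions": 2.0,
    "pending_charge":    1.5,
    "unemployed":        1.0,   # life-circumstance factor the tool chooses to penalize
    "unstable_housing":  1.0,   # likewise: a value judgment encoded as a weight
}
HIGH_RISK_CUTOFF = 3.0          # equally a policy choice, not a measurement

def risk_score(defendant: dict) -> float:
    """Weighted sum of whichever factors the tool's designers decided to include."""
    return sum(weight * float(defendant.get(factor, 0))
               for factor, weight in WEIGHTS.items())

defendant = {"prior_convictions": 1, "unemployed": 1, "unstable_housing": 1}
score = risk_score(defendant)
print(score, "high risk" if score >= HIGH_RISK_CUTOFF else "lower risk")
```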

Similar concerns were raised by Nasser Eledroos, former technologist for the district attorney’s office in Suffolk County, Mass., and currently the managing director of Northeastern Law’s Center on Law, Innovation and Creativity.

“Outside of Massachusetts … I’ve seen before people being asked whether or not they have siblings or other relatives within their family who have been convicted for a crime [and] that being fed into a decision-making tool as a determining factor as to whether they are likely to be at risk,” in determinations around parole or bail, Eledroos said. But such information is “totally not important, not relevant to whatever case, whatever is happening at present.”
Panelists, left to right: Paromita Shah, of Just Futures Law; Puck Lo, of Community Justice Exchange; Assia Boundaoui, journalist; Freddy Martinez, of Project on Government Oversight; Nasser Eledroos, of Northeastern Law’s Center on Law, Innovation, and Creativity; and Meg Foster, of Georgetown Law’s Center on Privacy & Technology.

OLD PROBLEM, NEW FORM


The surveillance tools and decision-making algorithms may also be so widely — and quietly — adopted that it becomes difficult for those affected by the technologies to fully understand and question them.

“These tools are so subtly embedded into our environment as to remove them from the public eye, making it difficult to expose and critique not only the technologies themselves, but also the power asymmetries” they create, Foster said.

Modern technologies like these can make existing problematic practices more efficient as well as more opaque. Still, the root cause of most of these harms isn’t the tools themselves but rather certain underlying assumptions baked into the criminal system, said Puck Lo, research director at the Community Justice Exchange, which promotes abolishing the current prison and policing system.

“It's easy to misunderstand the target and say the problem is the algorithm,” Lo said. “ ... But what we miss in that targeting is the entire criminal legal system and, in the apparatus that it upholds, multiple kinds of state violence.”

Court systems that “structure their processes as punishment, before any conviction is even levied,” and parole assessments that are harsher toward people who have less financial stability, for example, are issues that persisted before modern technologies started being involved in these practices, she said.

“If we’re talking about paper checklists in the city of New York, for instance, that store whether or not you are houseless, whether or not you have a job — and these things all count against you if you want to get out of jail easier,” Lo said. “A lot of the algorithms that make use of this paper checklist are just fancier versions of that.”

Eledroos spoke similarly, saying technologies take existing problematic practices and make them faster and thus more damaging.

“Carceral technology … takes what is already happening underlying the justice system — absent the technology — and then, facilitates its expeditious, repetitive output and makes it quicker and more harmful in communities in which it operates,” he said. “At the end of the day, we're talking about the justice system … technology are ultimately tools by which we augment that existing system.”
Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.