Task Force Examines Whether Local Gov Algorithms Can Be Fair

If you have thoughts about how local governments can or should use the data that they collect about you and your fellow citizens, a task force at the University of Pittsburgh wants to hear from you.

(TNS) — If you have thoughts about how local governments should use the data they collect about you and your fellow citizens, a University of Pittsburgh task force wants to hear from you.

The group — convened by Pitt’s Institute for Cyber Law, Policy, and Security — is examining algorithms used by local governments for potential bias. A public meeting scheduled for this week has been postponed, though comments can still be submitted online.

Government agencies use algorithms to make predictions from past data and inform decisions, said Christopher Deluzio, policy director at Pitt Cyber and a task force member.

“We’re worried about locking in and perpetuating inequality and bias” if algorithms are used without proper oversight, he said, speaking at a public meeting of the task force last week.

For instance, neighborhoods that have historically been policed more heavily may show more recorded crime in the data simply because of the greater police presence.

At a meeting last week at the Homewood-Brushton YMCA, several dozen people heard from task force members, asked questions, and discussed three algorithms used by local governments: the Allegheny Family Screening Tool, predictive policing, and algorithms used in bail decisions.

The Allegheny Family Screening Tool, which the county’s Department of Human Services has used since 2016, helps screen calls reporting potential child neglect.

Predictive policing by Pittsburgh police uses a “hot spot” tool to predict where crime might occur, based on 911 and prior crime data.

For bail decisions, judges in Allegheny County use tools that draw on data such as criminal history, age, and driving record.

Much of the discussion last week focused on the uses and potential uses of algorithms in criminal justice settings, such as bail and policing. The crowd was generally skeptical that the tools could be applied without bias.

“We live in a world that’s, unfortunately, biased,” said Tricina Cash, who attended the meeting.

Several in attendance lamented that a person might not know when an algorithm is being used to make a decision affecting them.

“I think there needs to be more transparency,” Richard Morris said.

Black youth already face harsher treatment by police than their white peers, said Tim Stevens, chairman of the Black Political Empowerment Project.

“That’s why I have an issue with all this,” Mr. Stevens said.

More information is online at https://www.cyber.pitt.edu/algorithms.

©2020 the Pittsburgh Post-Gazette. Distributed by Tribune Content Agency, LLC.