A Pitt task force will study algorithms used by Allegheny County, Pa., to spot possible bias. This initiative aims to ensure that historical discrimination and inequalities are not reinforced.
(TNS) — A task force convened by the University of Pittsburgh's Institute for Cyber Law, Policy, and Security will examine algorithms used by Allegheny County, Pa., in human service and criminal justice settings for potential bias.
“Increasingly, algorithms are being used to facilitate efficient government. We need to ensure that historical discrimination and existing inequities are not reinforced,” said David Hickton, Pitt Cyber founding director and task force chairman, in a statement Wednesday announcing the formation of the task force.
“Pittsburgh should lead the way in effective and fair oversight of these systems. We can be a national model, ensuring algorithmic accountability and equity for all residents,” said Mr. Hickton, a former federal prosecutor.
Allegheny County's Department of Human Services has used an algorithm to aid in screening calls about possible child neglect since August 2016. County human service officials have studied and modified the tool since its rollout, and have said they believe it reduces racial disparities in the child welfare system.
The “Hello Baby” program, which officials hope to roll out later this year, also uses an algorithm to assess — almost as soon as children are born — potential risk for serious child abuse or neglect.
Algorithms are in use elsewhere in local government as well, such as a pretrial “risk assessment tool” that aims to aid judges in Pittsburgh Municipal Court in making bail decisions.
In addition to Mr. Hickton, members of the Pittsburgh Task Force on Public Algorithms include a number of academic, legal and foundation representatives. The task force will also have an advisory panel of Allegheny County and city of Pittsburgh officials.
While the group aims to make recommendations, Mr. Hickton said, such suggestions won't be binding for local governments.
The task force is being convened with support from The Heinz Endowments.
“As Pittsburgh develops into one of the world’s leading centers of research and deployment of artificial intelligence, machine learning, and other emerging technologies, it is imperative that we simultaneously develop a set of ethics, policies and procedures informed by people who will be impacted by these technologies,” Heinz Endowments Chief Equity Officer Carmen Anderson said in a statement. “It’s particularly crucial as algorithms are used by complex systems with histories of racism and bias, such as the criminal justice system.”
The task force can aid government by laying out protocols that should be followed or by suggesting an avenue for citizen complaints, said Erin Dalton, Department of Human Services deputy director for the office of analytics, technology and planning, who has spearheaded the Allegheny Family Screening Tool and Hello Baby efforts. She will be a member of the panel of local government officials advising the task force.
Algorithms are involved in everything from decisions about supervision levels of people on probation and parole to mortgage applications, and when not used carefully and transparently can replicate existing inequalities, said Hannah Sassaman, policy director at the Philadelphia-based Media Mobilizing Project.
“Algorithms are a part of how human beings get judged by systems,” said Ms. Sassaman, who is not a member of the Pitt task force but who has raised concerns about algorithms in pretrial decision-making in Philadelphia and one put forth by the Pennsylvania Commission on Sentencing.
Community outreach meetings for the task force are scheduled for March 10 and 19.
The task force aims to publish a full report of findings and recommendations by summer 2021.
More information is available about the task force at https://www.cyber.pitt.edu/algorithms.
©2020 the Pittsburgh Post-Gazette. Distributed by Tribune Content Agency, LLC.