
Opinion: Dallas Police Early Intervention Hinges on Good Data

(TNS) — In advocating for criminal justice reform, this newspaper has supported efforts to embrace an early intervention system that identifies police officers with problematic behaviors. We applaud the city of Dallas for making good on its commitment to adopt such a system.

Dallas police officials said last week that the department had chosen First Sign, an early intervention system from Benchmark Analytics, a company that developed its software in partnership with University of Chicago researchers who’ve studied police misconduct for several years. A team of more than 20 data scientists built the technology.

Early intervention systems, also known as early warning systems, are not new. First Sign is billed as an improvement over older systems that flagged officers using rigid thresholds (for example, three complaints in six months). First Sign looks at an officer’s historical data on use of force, complaints, arrests and other factors and compares those records with those of the officer’s peers, meaning other officers with similar experience, rank and assignment. While older systems spit out a binary assessment (the officer either is at risk or isn’t), First Sign produces risk scores and risk levels that can help departments decide who should be prioritized for intervention.
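To make the distinction concrete, here is a minimal sketch of the two approaches described above. This is purely illustrative and assumes nothing about Benchmark Analytics’ actual model: the threshold value, the z-score method and the sample numbers are all hypothetical.

```python
from statistics import mean, stdev

def threshold_flag(complaints_6mo: int, threshold: int = 3) -> bool:
    """Older-style binary assessment: flagged or not flagged.
    The 3-complaints-in-6-months threshold is a hypothetical example."""
    return complaints_6mo >= threshold

def peer_risk_score(officer_complaints: float, peer_complaints: list[float]) -> float:
    """Peer-comparison score: how many standard deviations the officer
    sits above the mean of a peer group with similar experience, rank
    and assignment (a simple z-score, used here only as illustration)."""
    mu = mean(peer_complaints)
    sigma = stdev(peer_complaints)
    return (officer_complaints - mu) / sigma if sigma else 0.0

# Two officers with the same raw complaint count can carry very
# different risk depending on their peer group:
peers_patrol = [4, 5, 6, 5, 4]   # hypothetical busy patrol assignment
peers_desk = [0, 1, 0, 1, 0]     # hypothetical low-contact assignment

print(threshold_flag(5))                           # True for both officers
print(round(peer_risk_score(5, peers_patrol), 2))  # near the peer average
print(round(peer_risk_score(5, peers_desk), 2))    # far above the peers
```

The point of the sketch: a rigid threshold treats both officers identically, while the peer comparison distinguishes an ordinary count for a busy assignment from an outlier in a low-contact one, yielding a graded score rather than a yes/no answer.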

Our note of caution: the public and the department leaders administering this system should recognize that an unsubstantiated complaint must be treated with care. Criminals are savvy enough to game the system with false complaints, yet bad cops can hide in a system that ignores raw complaints altogether.

Dallas police officials said officers who are flagged will be considered for support that helps them improve their performance and avoid future problems that can hurt the public and end careers. The system does not target them for disciplinary action.

Supervisors are expected to consider the context of flagged officers’ behavior, and they’ll have to meet with those officers to discuss a course of action. This can be coaching, additional training, counseling or more. Supervisors might conclude that no intervention is necessary. At any rate, Dallas police said supervisors’ recommendations will have to be approved by leaders up the chain of command.

City Council member Cara Mendelsohn raised concerns about the city not controlling the algorithm. We don’t see that as a problem. After all, rigorous research performed by experts should be the basis of this technology, not guesswork by city officials.

As is the case with other predictive software, the system will be only as good as the data fed into it. We urge Dallas to audit the system periodically and to brief the City Council and the public at least once a year on the outcomes. Is the number of officers being flagged within the expected range? How are officers being helped, and how have those interventions changed their performance?

An early intervention system is no replacement for good supervisors, who are in the best position to notice who is struggling. We are glad that the Dallas Police Department has a program in place that allows supervisors to initiate a referral to support an officer. Friends and family can also make confidential referrals for counseling and other services.

If the early intervention system works as intended, it will improve officers’ mental health and their interactions with residents. It will protect police and the public they serve. That’s an outcome we should all support.

© 2021 The Dallas Morning News. Distributed by Tribune Content Agency, LLC.