Exclusive: Predictive Policing Startup Publishes Code Online, Seeks to Address Bias

CivicScape, now in the pilot test phase, thinks it can answer concerns about predictive policing.

CivicScape, an early stage startup, wants to fight crime. And it wants to show you exactly how it does it.

Brett Goldstein — whose work has included stints with the Chicago Police Department, Argonne National Laboratory and the tech company OpenTable — has been working on something akin to CivicScape for about seven years now. And with six city-based pilot projects underway, he is ready to launch the predictive policing company that he hopes will address the controversial aspects of the practice.

The company is one of the first investments of Ekistic Ventures, of which Goldstein is a managing partner. In fact, Ekistic is effectively serving as the creator of CivicScape — on top of traditional venture capital-style investments, Ekistic wants to launch at least one startup per year.

Predictive policing, or the use of data to identify crime “hot spots” and deploy officers accordingly, is controversial. Ezekiel Edwards, director of the Criminal Law Reform Project at the American Civil Liberties Union, said the problems with the practice are manifold.

“Broadly speaking," he said, "the quality of the data that police are using to make predictions raises significant concerns, [as does] the lack of transparency that the predictive policing programs seem to embrace, such that we don’t really know how they’re being used and how they’re constructed."

That is, if police are already biased — say, by searching black people for drugs more frequently than white people — then using the data from their work will likely amplify that bias. To a computer program designed to look for patterns in policing, it could look like police should search for drugs in black neighborhoods more often. Researchers, however, have found that black and white people use illegal drugs at about the same rate.
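That feedback loop can be illustrated with a small simulation (the numbers and the allocation rule here are hypothetical, not CivicScape's model): two neighborhoods have identical drug-use rates, but one is searched twice as often, so it generates twice the arrests. A naive model that allocates the next round of searches in proportion to past arrests then locks in the disparity indefinitely, even though the underlying behavior is the same in both places.

```python
# Hypothetical illustration of the feedback loop Edwards describes.
# Both neighborhoods have the same true drug-use rate, but neighborhood A
# starts out searched twice as often as B. A naive "predictive" rule that
# allocates future searches in proportion to past arrests never corrects
# the 2:1 gap -- the biased starting point becomes a self-fulfilling pattern.
true_use_rate = 0.10                  # identical in both neighborhoods
searches = {"A": 1000, "B": 500}      # A searched twice as often at the start

for step in range(3):
    # Arrests track searches, because use rates are equal everywhere.
    arrests = {n: int(s * true_use_rate) for n, s in searches.items()}
    total = sum(arrests.values())
    # Naive allocation: split the next 1,500 searches by share of past arrests.
    searches = {n: round(1500 * a / total) for n, a in arrests.items()}
    print(step, arrests, searches)
```

Each round reproduces the same 2:1 split, so the arrest data "confirms" the original bias without any difference in behavior to justify it.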

CivicScape wants to do things a little differently. First, it wants to identify which data is most biased and then either remove that data from consideration or adjust for its bias. Second, it wants to use machine learning algorithms to constantly check for bias. Third, it wants to publish its algorithms online for researchers and the general public to see.

Or, rather, it has already published its algorithms online. They’re on GitHub.

“We don’t want to say, ‘Trust us, and we’re going to build an algorithm behind closed doors,’” said Anne Milgram, former New Jersey attorney general and chair of CivicScape’s board of directors.

The company makes no attempt to claim that it’s eliminated bias. Rather, its documents and representatives talk a lot about how to find bias where it exists and minimize its influence on outcomes. Then, by making its work public, it’s hoping that researchers will find ways to improve the tool over time.

CivicScape also approached the ACLU and other civil rights organizations while building the tool to get feedback on their predictive policing concerns and address them. Edwards said he thinks the company's transparency is a good move, and that other predictive policing schemes should follow suit.

“From my understanding, CivicScape is putting a premium on transparency,” he said. “They are doing something that no other major developer of predictive policing has done yet to my knowledge, which is to make the method, the algorithm [and] the weighting system transparent.”

According to Milgram and Goldstein, CivicScape has been able to achieve high accuracy rates while generating hot spot maps that take new data into account in real time. The tool puts its police-specific data alongside external data like weather patterns and 311 reports in order to create its location-specific risk scores. Many data-based policing tools run on three-year averages. According to Milgram, that doesn’t reflect the shifting nature of crime risk.
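As a rough sketch of what blending those inputs might look like (this is illustrative only, with made-up weights and thresholds, not CivicScape's published algorithm), a per-block, per-hour risk score could combine recent incident counts with external signals such as weather and open 311 reports, instead of averaging over multiple years:

```python
# Hypothetical sketch of a per-block, per-hour risk score that blends recent
# police incident data with external signals (weather, 311 reports), rather
# than relying on multi-year averages. Weights and caps are invented for
# illustration; this is not CivicScape's model.
def risk_score(recent_incidents, open_311_reports, temp_f,
               w_incidents=0.6, w_311=0.3, w_weather=0.1):
    """Return a 0..1 risk score from normalized, capped input signals."""
    incident_signal = min(recent_incidents / 10.0, 1.0)        # cap at 10/hour
    report_signal = min(open_311_reports / 50.0, 1.0)          # e.g. broken lights
    weather_signal = max(0.0, min((temp_f - 40) / 60.0, 1.0))  # warmer -> more activity
    return (w_incidents * incident_signal
            + w_311 * report_signal
            + w_weather * weather_signal)

# A block with 4 recent incidents and 20 open 311 reports on a 70-degree evening:
print(round(risk_score(4, 20, 70), 3))
```

The point of the 311 term is the one Milgram raises below: signals outside police data, like clusters of broken streetlights, can move a block's score and give departments a concrete reason to coordinate with other agencies.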

“Having the ability to do it within three blocks and one hour is game changing when you think about protecting residents of a community,” she said, adding that the 311 inputs are also valuable because they create avenues for the police department to work better with other local government agencies.

“[It enables] the ability of the chief to say, ‘We know you can’t fix all 20,000 broken lights tomorrow, but can you fix these 200 … because there’s a lot of community concern around them?’” Milgram said.

Though Edwards doesn’t have a firm number on how many of the roughly 18,000 police departments in the U.S. are using predictive policing, he thinks the practice is becoming more common.

“Most major police departments I think are either using or seeking to use some form of predictive policing," he said, "and it would be a matter of time and access and development for most smaller departments to do the same."

Though Edwards applauds CivicScape’s transparency, he’s not sure the company has solved all of predictive policing’s problems just yet — particularly, the bias problem.

“It seems they’re aware of that concern, and similar to their openness on transparency, I think they’re open to ideas and suggestions on how to generate a fairer tool, but I’m not convinced that their tool will be necessarily more equitable than other tools before it,” he said. “That remains to be seen.”

Goldstein said he expects to publicly announce which cities CivicScape is working with in a matter of weeks.

Ben Miller is the associate editor of data and business for Government Technology. His reporting experience includes breaking news, business, community features and technical subjects. He holds a bachelor's degree in journalism from the Reynolds School of Journalism at the University of Nevada, Reno, and lives in Sacramento, Calif.