The move puts ShotSpotter into a competitive and contentious space.
ShotSpotter, a gunfire detection system for police departments, signaled a move toward proactive enforcement this week by acquiring HunchLab and its crime-prediction software, according to a news release.
Now called ShotSpotter Missions, HunchLab’s program applies statistical models and machine learning to historical crime data, along with current and forward-looking indicators, to forecast crime at specific places and times. According to the news release, Missions divides a city into 250-square-meter “cells” and offers guidance on when to patrol each cell, what sort of crime is likely to occur there and what patrol tactics to use. It tailors its recommendations to each patrol shift based on factors such as seasonality, time of day, day of week, socioeconomic trends and upcoming events.
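HunchLab’s actual models are proprietary, but the grid-and-time approach the release describes can be illustrated with a toy sketch: bin incidents into square cells (here assumed to be 250 meters on a side, one reading of the release’s “250-square-meter” figure) and tally them by cell and hour of day. All coordinates, incidents and function names below are hypothetical.

```python
# Illustrative sketch only; not HunchLab's actual method.
# Bins hypothetical incidents into a grid of 250 m x 250 m cells and
# tallies counts by (cell, hour) -- the kind of spatio-temporal
# aggregation a forecasting system might start from.
from collections import Counter

CELL_SIZE_M = 250  # assumed cell edge length

def cell_for(x_m, y_m):
    """Map planar coordinates (in meters) to a grid-cell index."""
    return (int(x_m // CELL_SIZE_M), int(y_m // CELL_SIZE_M))

# (x meters, y meters, hour of day) for made-up incidents
incidents = [(120, 480, 22), (130, 470, 23), (900, 900, 14)]

counts = Counter((cell_for(x, y), hour) for x, y, hour in incidents)
# A real system would feed aggregates like these, plus the seasonal and
# socioeconomic indicators the release mentions, into statistical models
# rather than directing patrols on raw counts.
```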
“We believe our investment will democratize the sharing of important intelligence with patrol officers who currently have limited direct access to crime analysts,” said Ralph Clark, president and CEO of ShotSpotter, in the statement. “ShotSpotter Missions provides officers in the field with their own personal virtual crime analyst to plan the most strategic and efficient patrol missions in an effort to reduce crime.”
ShotSpotter Missions will be available as both a standalone application and an add-on to the company’s gunfire detection system. By integrating Missions with its own gunfire data, ShotSpotter said it will be able to update predictive models and patrol missions in real time. The company plans to invest in more research and development to continually improve Missions and advance the use of AI technology in its other products.
According to the news release, ShotSpotter used existing cash on hand to fund the acquisition. By stepping into an arena with other predictive policing technologies like PredPol and CivicScape, ShotSpotter also potentially opens itself up to criticism from civil rights groups, which have raised concerns that the technology reinforces over-policing of minority neighborhoods. The ACLU released a statement in 2016, co-signed by 16 other organizations, arguing that these technologies are built on data generated by biased enforcement and will therefore perpetuate that bias.
“Systems that are engineered to support the status quo have no place in American policing,” the statement said. “[Predictive policing] concentrates existing law enforcement tactics, and will intensify stringent enforcement in communities of color that already face disproportionate law enforcement scrutiny.”
Another gov tech company, the recently created CentralSquare, is also moving into predictive policing.