Santa Cruz, Calif., Police Department uses algorithm to determine crime hot spots to assist with predictive policing.
The Santa Cruz, Calif., Police Department implemented a six-month predictive policing pilot project, which began July 1, to help officers predict certain types of crime in the city before they happen.
Through the predictive model, officers will patrol areas that weren’t previously receiving enough of a police presence with the goal of deterring crime.
The project uses an algorithm that is similar to what’s used for predicting earthquake aftershocks. “There’s a belief that certain crime types — in this case, burglaries and vehicle thefts — can be predicted in the same way,” said Zach Friend, the Santa Cruz Police Department’s press information officer and principal management analyst.
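In general terms, aftershock-style models of this kind treat each crime like a seismic event that temporarily raises the risk of further events nearby. A common way to express this idea — a sketch of the general approach, not necessarily Mohler's exact formulation — is a self-exciting point process, where the predicted rate of crime at a place and time is a background rate plus decaying contributions from each past incident:

```latex
\lambda(t, x, y) = \mu(x, y) + \sum_{i \,:\, t_i < t} g\bigl(t - t_i,\; x - x_i,\; y - y_i\bigr)
```

Here \(\mu(x, y)\) is the long-run background rate of crime at a location, each \((t_i, x_i, y_i)\) is a past reported incident, and \(g\) is a triggering kernel that decays with elapsed time and distance — so a recent nearby burglary raises the predicted risk more than an old or distant one. The specific symbols and kernel are illustrative assumptions.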
The algorithm was developed by George Mohler, an assistant professor in the Department of Mathematics and Computer Science at Santa Clara University in California. The Santa Cruz Police Department reached out to Mohler after reading about the algorithm in the Los Angeles Times.
The Police Department worked with Mohler for six months starting in October 2010 to develop the project for real-world implementation. Since the model had already been created through grant funding, the department didn’t have to pay to use it.
For the six-month pilot, the Police Department pulls data every day from its records management system, which tracks crime reported in the city. The data is entered into a spreadsheet, geocoded and then run through Mohler's Web-based algorithm.
The result is 10 maps outlining Santa Cruz’s crime hot spots, which are distributed to police officers, who then can patrol more efficiently based on that information.
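The daily workflow described above — geocode the incidents, score locations by recent crime, and hand officers the top hot spots — can be sketched as follows. This is a hypothetical illustration of the general technique, not the department's actual system; the grid size, decay rate, and function names are all assumptions.

```python
from collections import defaultdict
from datetime import date
from math import exp

# Illustrative parameters, not real system values.
CELL = 0.005   # grid cell size in degrees (roughly 500 m)
DECAY = 0.1    # per-day decay of a past crime's influence on risk

def cell_of(lat, lon):
    """Snap a geocoded incident to a grid cell."""
    return (round(lat / CELL), round(lon / CELL))

def hot_spots(incidents, today, top_n=10):
    """Score each grid cell by recency-weighted past incidents and
    return the top_n cells -- one per map handed to patrol officers.

    incidents: list of (date, lat, lon) tuples pulled from the
    records system and geocoded.
    """
    scores = defaultdict(float)
    for day, lat, lon in incidents:
        age = (today - day).days
        if age >= 0:
            # Recent crimes raise the score more, echoing the
            # aftershock-style idea that risk decays over time.
            scores[cell_of(lat, lon)] += exp(-DECAY * age)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Example: three burglaries, two of them in the same grid cell.
reports = [
    (date(2011, 8, 1), 36.974, -122.030),
    (date(2011, 8, 20), 36.974, -122.030),
    (date(2011, 8, 20), 36.990, -122.060),
]
top = hot_spots(reports, today=date(2011, 8, 22), top_n=2)
```

Because the scores are recomputed from scratch against the full incident history each day, rerunning the pipeline every morning is what gives the model its daily recalibration.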
Because of the daily calibration, Friend said, generating the maps is currently somewhat cumbersome. But he believes the predictive model will become more user-friendly as the computer program improves.
In the nearly two months of use, the pilot has garnered positive results. Since the pilot’s deployment, the model has correctly predicted 40 percent of the crimes that it was aiming to predict, and the Santa Cruz Police Department has seen a reduction in the types of crime that it’s been addressing.
In addition, the Police Department saw a 27 percent decrease in the number of reported burglaries in July compared with July 2010. Friend said the department won’t know how successful the model is until it’s been running for at least three months.
Friend said since the model’s implementation, the Police Department hasn’t changed other aspects of its operations, such as the number of officers patrolling the city or what shifts they work.
Although Santa Cruz's model is new, it isn't the first example of hot-spot, analytics-based policing. For example, Compstat, a tool similar to Santa Cruz's, came into use in the mid-1990s to help track more serious crimes in New York City.
Friend said Santa Cruz’s predictive policing project differs from Compstat and other hot spot-based policing tools because it is calibrated every day. Compstat and other tools aren’t constantly recalibrating the data to give patrol officers more exact times and locations for when and where they should patrol.
Santa Cruz’s model also removes potential biases officers may have about a particular area they patrol, according to Friend. If an officer has patrolled a certain neighborhood for a few years and knows of problem homes whose inhabitants have a history of drug use or criminal activity, that officer may feel inclined to spend additional time going by those locations.
“The model normalizes the information. It doesn’t look at people, it simply looks at crime,” Friend said. “[The model] may reinforce that you should go back to the [problem] area, but maybe only twice that week as opposed to all four days that you work your shift.”