The city's partnership with insurer Allstate to leverage data analysis as a means to improve city services could represent a new means for public CIOs to take advantage of private-sector expertise.
In the past, public-private partnerships have mostly involved the delivery of services, much like the way Chicago teams with Comcast to bring the Web to low-income families through the Internet Essentials partnership.
Recently, though, the city has found a new way to partner with the private sector, joining with insurer Allstate to leverage data analysis as a means to improve city services. In an era driven by big data and data analytics, such efforts could represent a new means for public CIOs to take advantage of private-sector expertise.
The city has just 42 health inspectors to monitor some 15,000 food establishments. The best way to tackle this Herculean task is to start by checking in on those restaurants most likely to be in violation of code. But how to know where to begin?
Chicago IT officials paired their extensive open data with Allstate's analytic muscle to come up with a solution, leveraging outside talent to augment internal skills. At a time when workers with deep analytical skills are in short supply, the partnership gave the city access to capabilities it might not otherwise have been able to acquire easily.
“We have been looking for ways to use analytics and data to improve the quality of life for residents, or to improve the efficiency of city operations,” said Chicago Chief Data Officer Tom Schenk. To that end, city IT managers and Allstate volunteers looked at a wide range of variables: Does the establishment serve alcohol? How long has it been since the last inspection? Have there been local burglaries, sanitation complaints or unusual weather? In all, some 600 data sets were considered.
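Risk scoring of this kind can be sketched in a few lines. To be clear, the feature names, weights and threshold below are hypothetical stand-ins for illustration; the article does not describe the actual variables or coefficients in Chicago's model.

```python
import math

# Hypothetical feature weights for a simple logistic-style risk score.
# Neither the features nor the weights come from Chicago's real model;
# they only illustrate the general shape of such a scoring function.
WEIGHTS = {
    "serves_alcohol": 0.8,
    "days_since_inspection": 0.004,   # per day since last visit
    "nearby_burglaries": 0.15,        # per incident in the area
    "sanitation_complaints": 0.3,     # per complaint on file
}
BIAS = -2.0

def risk_score(establishment):
    """Return a 0-1 score estimating the chance of finding a critical violation."""
    z = BIAS + sum(WEIGHTS[k] * establishment.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Rank a couple of made-up establishments so inspectors visit the riskiest first.
places = [
    {"name": "A", "serves_alcohol": 1, "days_since_inspection": 400,
     "nearby_burglaries": 3, "sanitation_complaints": 2},
    {"name": "B", "serves_alcohol": 0, "days_since_inspection": 90,
     "nearby_burglaries": 0, "sanitation_complaints": 0},
]
ranked = sorted(places, key=risk_score, reverse=True)
print([p["name"] for p in ranked])  # → ['A', 'B']
```

In practice the weights would be fit from historical inspection outcomes rather than set by hand, but the ordering step at the end is the point: the model's only job is to put likely violators at the front of the inspection queue.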
“All of that information is available on the city’s open data portal, so it was essentially a matter of gathering what was already there and pulling it all together,” said Gavin Smart, Allstate’s quantitative research director.
Those data produced results in an initial two-month trial, with the city proving that data analysis could help inspectors zero in on potential problems much faster than in the past.
In September and October 2014, the health department found 1,637 violations in eating establishments. Of the critical violations, 55 percent showed up in the first month and 45 percent in the second. That's about what the department would expect under its usual scheduling.
At the same time, the city ran its own model, using data analytics to determine the most efficient hypothetical routes for inspectors. In the simulation, 69 percent of violations turned up in the first month. Data-driven inspections were more timely and more effective, meaning fewer people would have potentially gotten sick. That’s exactly the outcome project planners were looking for. “We want to catch the violations earlier,” Schenk said. “That is what reduces the exposure of patrons to unsanitary restaurant conditions.”
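The comparison boils down to a simple early-detection fraction: of all critical violations found across the two months, what share surfaced in month one? The 55 and 69 percent figures come from the article; the lists below merely reproduce those proportions, since the underlying per-inspection records aren't given.

```python
def early_detection_rate(violation_months):
    """Fraction of violations caught in the first month of a two-month window."""
    first = sum(1 for m in violation_months if m == 1)
    return first / len(violation_months)

# The Sept.-Oct. 2014 trial: 55% of critical violations surfaced in month one
# under the usual schedule, versus 69% under the data-driven ordering.
status_quo = [1] * 55 + [2] * 45
data_driven = [1] * 69 + [2] * 31

print(early_detection_rate(status_quo))   # → 0.55
print(early_detection_rate(data_driven))  # → 0.69
```

The 14-point gap is the whole result: same inspectors, same violations, but more of them caught weeks earlier simply by reordering visits.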
Why did the data-driven model show such strong results? A number of factors likely came into play, including the productive teaming of the city CTO’s three data scientists with the Allstate volunteers, as well as a quantitative approach that’s just beginning to take hold in public management processes.
The Chicago simulation worked because data works. As public CIOs are learning, the rise of big data and analytics has profound real-world implications.
In this case, the city’s access to data-savvy talent came about partly thanks to the Civic Consulting Alliance, a nonprofit group that helps to broker public-private alliances in Chicago. The use of data is a powerful new tool in these partnerships, said alliance CEO Brian Fabes.
“The challenges have changed and now the tools have changed, in the evolution of technology and the ability to dive in and use data and analytics to find new ways to do things,” he said. “This is a great, very specific example of that. And we are starting to see lots of other examples like this all around the country, with big data and data analytics providing a window on what is happening and generating ideas about what is to be done.”
Allstate, for its part, saw the project as a way to drive positive social action from the kinds of data-driven modeling it already has been doing in the insurance arena.
“That is the beauty of working with data and mathematical models, that you can see the impact of the things you are doing, you can see how well they are performing,” Smart said. “When you can quantify things, it is easier to see how well you are doing and what the opportunities are.”
Allstate engaged with the city through its Project Lightbulb, a company program that gives each employee four hours a week to pursue work-related interests outside their daily responsibilities. Sometimes this includes professional enrichment; sometimes employees use the time to work on theoretical models and tools. In this case, three to four analysts used their time to help the city solve its restaurant problem.
“This is a great opportunity to give some of our folks a chance to learn from people in different fields, as well as help those folks make some progress in their particular areas,” said Smart. “And it’s a great way to be able to give back to the community.”
To that end, Allstate volunteers worked primarily on developing data sets and building mathematical models based on available data. That information is sprawling — 600 data sets were put into play. This raises a question: With big data capable of doing so much, how do analysts know when to stop? If they are counting everything from the weather to local robbery statistics, is there anything that doesn’t go into the equation?
“Whenever you are approaching analytic-type projects or problems, you can almost go on forever looking at data sources that might be predictive,” Smart said. The solution is to work within constraints — how fast do we need a result? — and also to accept that any improvement, however imperfect, still counts as a win.
“If we can get something done, get a Version 1 model completed within some reasonable amount of time that shows some reasonable improvement, we can always look to other people to move it forward,” he said.
In the push to put data into play, Schenk noted that there are pros and cons to the kind of public-private arrangement that arose here. While the project did give the city access to data pros at a time when such talent is pricey and hard to come by, it’s also true that any engagement with an outside partner can bog down the public CIO in contractual paperwork.
At the same time, the move toward opening up access to public data can help overcome such bureaucratic roadblocks. “When the data is already publicly available, there is no nondisclosure agreement we need to sign,” said Schenk. “It lets us sidestep months of work, since they already have the ability to access that data very easily.”
Still, that very openness merits a word of caution from Smart, who reminds CIOs that analytics will forever be a moving target.
“As the analytic capability grows, people need to be mindful of how to maintain a lot of this. From our own experience in predictive modeling, we know that if you have a model that has been in production for quite a while and then trends start to change, it can produce some deterioration,” he said. “You need to have the resources and capabilities to keep refreshing and rebuilding those models.”
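Smart's caution can be made concrete with a simple production check: track how often the model's top-ranked inspections actually find a violation, and flag the model for retraining when that hit rate sinks well below its baseline. The window size and tolerance below are illustrative choices, not anything Allstate or the city described.

```python
from collections import deque

class DriftMonitor:
    """Flag a production model whose recent hit rate falls below a floor.

    A hypothetical sketch: the 0.69 baseline echoes the trial's first-month
    figure, while the window and tolerance are arbitrary illustrative values.
    """
    def __init__(self, baseline_rate, window=50, tolerance=0.10):
        self.baseline = baseline_rate
        self.tolerance = tolerance
        self.recent = deque(maxlen=window)

    def record(self, hit):
        """Record whether a predicted-risky inspection actually found a violation."""
        self.recent.append(1 if hit else 0)

    def needs_refresh(self):
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough recent data to judge drift yet
        rate = sum(self.recent) / len(self.recent)
        return rate < self.baseline - self.tolerance

monitor = DriftMonitor(baseline_rate=0.69, window=10)
for hit in [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]:  # 4 hits in 10: well below baseline
    monitor.record(hit)
print(monitor.needs_refresh())  # → True
```

The point is less the mechanism than the discipline: a model that was validated once, in 2014 conditions, needs an ongoing signal that tells its owners when those conditions have moved on.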
For now, the partnership between the city and Allstate is continuing beyond food inspection analysis. In their latest joint venture, analysts have set their sights on prioritizing elevator inspections based on available data. The hope is to isolate those most likely to fail in order to maximize the benefits of city inspections and best serve the cause of public safety.