Editor's note: The Digital Communities Special Report, which appears twice a year in Government Technology magazine, offers in-depth coverage for local government leaders and technology professionals.
The New York City Fire Department (FDNY) has gone from reacting to fires to anticipating them. No, it doesn’t have a crystal ball, but it does have data, algorithms and predictive analytics on its side. The FDNY has taken paper-based information from building inspections and built a database that a risk-scoring algorithm analyzes to flag buildings at elevated risk of fire. The predictive tool works because the data is clean and because the system draws on information from other departments.
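A risk-scoring approach of this kind can be sketched as a weighted sum over inspection-derived features, with buildings ranked by the result. The features, weights and sample buildings below are hypothetical illustrations, not the FDNY's actual model:

```python
def risk_score(building, weights):
    """Combine inspection-derived features into a single risk score."""
    return sum(weights[k] * building.get(k, 0) for k in weights)

# Hypothetical feature weights (illustrative only).
WEIGHTS = {
    "open_violations": 0.5,
    "building_age_decades": 0.2,
    "prior_fire_incidents": 0.8,
    "sprinklers_missing": 0.6,
}

# Hypothetical buildings assembled from inspection records.
buildings = [
    {"id": "A", "open_violations": 3, "building_age_decades": 9,
     "prior_fire_incidents": 1, "sprinklers_missing": 1},
    {"id": "B", "open_violations": 0, "building_age_decades": 2,
     "prior_fire_incidents": 0, "sprinklers_missing": 0},
]

# Rank buildings so inspectors visit the highest-risk ones first.
ranked = sorted(buildings, key=lambda b: risk_score(b, WEIGHTS), reverse=True)
```

The payoff of this structure is the ranking itself: inspection capacity is limited, so the department can work down the list from the top rather than inspecting buildings in arbitrary order.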
City fire departments aren’t the only agencies taking advantage of analytics. The technology can be found in police departments, economic development efforts, public works, permitting, utilities and public transit, to name a few.
Fueling the growth is the explosion in data collection taking place in cities, whether it’s data from sensors or from unstructured sources, such as Web forms and video. At the same time, cities are under pressure to release the data and make it available to the public, creating new opportunities to analyze what was once hidden from view. Just as important, analytics technology is becoming cheaper to own, faster to use and better at complex problem solving, making it more valuable.
“If you are talking about being a smart city, you are talking about information technology, which is about data,” said Jennifer Robinson, director of local government solutions at SAS. “And data only becomes information when it can be digested. That’s why analytics is the backbone of any smart city solution.”
So far, many cities are pilot testing analytics to tackle discrete problems. But the goal is for cities to use analytics across the government enterprise. That requires a data infrastructure that can pull information from many different sources, similar to how the FDNY is using data from the city’s building department to predict potential fire hot spots. In Chicago, data sharing is helping the city predict rodent infestations and food safety problems. In the latter case, the city prioritizes food safety inspections by statistically classifying food establishments according to their estimated probability of a violation.
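Prioritizing inspections by estimated violation probability can be sketched with a simple logistic model. The coefficients, features and establishments below are made up for illustration and are not Chicago's actual model:

```python
import math

# Hypothetical model coefficients (illustrative only).
COEF = {"prior_violations": 0.9, "days_since_inspection": 0.004, "complaints": 0.7}
INTERCEPT = -2.0

def violation_probability(est):
    """Logistic model: estimated probability of a violation at this establishment."""
    z = INTERCEPT + sum(COEF[k] * est[k] for k in COEF)
    return 1 / (1 + math.exp(-z))

# Hypothetical establishments.
establishments = [
    {"name": "Cafe X", "prior_violations": 2, "days_since_inspection": 400,
     "complaints": 3},
    {"name": "Diner Y", "prior_violations": 0, "days_since_inspection": 100,
     "complaints": 0},
]

# Build the inspection queue: highest estimated probability first.
queue = sorted(establishments, key=violation_probability, reverse=True)
```

In practice the coefficients would be fit from historical inspection outcomes rather than written by hand; the sketch shows only how a fitted probability turns into an inspection priority.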
But these cities are the exception when it comes to enterprise analytics. Most entities struggle with getting an analytics project off the ground. According to Gartner, more than half of analytics projects either fail or don’t deliver the expected results. Part of the problem is bad or unclean data, which produces poor results. Also, the data needs to be integrated. Despite years of discussion around the value of data sharing, many public-sector agencies are still unwilling to share with other departments.
But the good news is that analytics is getting better. New advances allow data to be analyzed before it is stored. Given how much data government is collecting and how much more it will have to handle as the Internet of Things matures, this could be a game-changer for analytics. “This technology is going to help make sure the right data is collected and analyzed appropriately, so that the non-relevant data is dumped,” said Robinson. The process is known as “analytics at the edge,” she said. “It’s a way to make sure data collection doesn’t become overwhelming.”
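The idea of filtering at the edge can be sketched as a device-side pass over a sensor stream that forwards only out-of-range readings for central storage. The sensor names, values and thresholds are hypothetical:

```python
def edge_filter(readings, low=10.0, high=30.0):
    """Yield only readings outside the expected band; discard the rest at the edge."""
    for r in readings:
        if r["value"] < low or r["value"] > high:
            yield r

# Hypothetical sensor stream (e.g., temperature readings).
stream = [
    {"sensor": "s1", "value": 21.5},
    {"sensor": "s1", "value": 48.2},
    {"sensor": "s2", "value": 19.9},
    {"sensor": "s2", "value": 3.1},
]

# Only the anomalous readings are retained for central analysis.
kept = list(edge_filter(stream))
```

Dropping in-range readings at the device keeps the central data store from being swamped by routine values, which is the point Robinson makes about data collection not becoming overwhelming.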
When the President’s Council of Advisors on Science and Technology released its 2016 report, Technology and the Future of Cities, it made an important point regarding the role of IT: “The urban ecosystem can benefit from the integration of a wide array of technologies that have been evolving rapidly, including systems to increase energy efficiency, renewable energy technologies, connected and autonomous vehicles, water and wastewater management systems, communications technologies to enhance connectivity, and new ways to do farming and manufacturing.”
The report looks at the entire urban ecosystem and presents a variety of ways that the federal government can help cities collaborate when it comes to advancing technology in a cost-effective way. Similarly, this series has looked at five key technologies that every city government should have if it’s to become a so-called smart city. On their own, each of the technologies — broadband, GIS, CRM, open data, analytics — provides a benefit to city operations and services. But the true impact comes when they are treated as part of an integrated system, rather than as singular solutions.
Along with having an enterprise vision when it comes to technology, government needs to have a coherent and sensible set of strategies and policies if it wants to maximize the smart city impact. That means having effective policies around privacy, security and open data sharing. Governments will need to craft creative initiatives to attract the talent needed to develop and run smart city solutions. And they will have to be willing to invest in the core technologies described in this report, but in a way that’s strategic and has enterprise objectives. In other words, the days of siloed solutions need to end.
For decades, cities have faced a host of challenges that have tested their ability to function. Today, new ideas and answers are emerging that have the potential to help them cope with growth and also to transform into sustainable and resilient places to live and work. At the core of this transformation will be information technology. Best to get it right.