Navigating the Data Flood in an Emergency

There’s another type of flood to deal with in the midst of a natural disaster: a flood of information. From 311 submissions to sensors to social media, city technology teams are often inundated with needs across data collection, distribution and decision-making — often in coordination with other agencies.

The most unpredictable times, however, are when cities lean hardest on their systems for gathering information, implementing responses and communicating effectively, both internally and with the public. Gathered at the Summit on Data-Smart Government, chief data officers from across the country discussed their challenges in emergency preparation and response, and the strategies they’ve adopted to meet them.

Standardize, Standardize, Standardize

The first time you’re running a process or pulling a report shouldn’t be in the face of a natural disaster, noted Maksim Pecherskiy, chief data officer for the city of San Diego. Cities need standardized systems for preparing and distributing reports across departments, as well as clear coordination structures in place to ensure a seamless response.

In New York City, these systems are put to the test through “data drills,” which expose any gaps in the availability or quality of data and provide insight into the gap between planned and actual response times for various tasks. Governance structures vary, with some cities opting for more centralized data storage and others encouraging departments to own and govern their own data.
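As a rough illustration, part of a data drill can be automated as a script that sweeps a dataset inventory for staleness and missing values before an emergency exposes them. Everything here, from the dataset names to the thresholds, is a hypothetical sketch rather than any city's actual tooling:

```python
from datetime import datetime, timedelta

# Hypothetical inventory a drill might audit; names and figures are invented.
datasets = {
    "311_requests":     {"last_updated": datetime(2024, 3, 1),  "missing_pct": 0.02},
    "shelter_capacity": {"last_updated": datetime(2024, 1, 15), "missing_pct": 0.30},
}

MAX_AGE = timedelta(days=7)   # drill threshold: data older than a week is "stale"
MAX_MISSING = 0.10            # drill threshold: >10% missing fields is a quality gap

def run_drill(inventory, now):
    """Flag datasets that would fail an emergency-readiness check."""
    gaps = []
    for name, meta in inventory.items():
        if now - meta["last_updated"] > MAX_AGE:
            gaps.append((name, "stale"))
        if meta["missing_pct"] > MAX_MISSING:
            gaps.append((name, "incomplete"))
    return gaps

print(run_drill(datasets, datetime(2024, 3, 2)))
```

Running the check on a schedule, rather than during an actual emergency, is the point of the drill: the gaps surface while there is still time to close them.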

Predict When Possible

Cities have applied predictive analytics to a wide range of disaster preparedness and response efforts — from identifying areas where the elderly in Chicago were at greatest risk of heat stroke to finding buildings with the greatest fire risk in Pittsburgh to pinpointing homes in New Orleans least likely to have a fire alarm. The opportunities to innovate with predictive models are numerous, for instance in estimating the effect of a disaster on a city’s municipal services system. With heavy snow or a flood, as South Bend Chief Data Officer Santiago Garces noted, a city can reasonably expect a strain on its other operations, such as missed trash pickup. Asking questions about how to predict these side effects, inform residents and help them access the services they really need can vastly improve response during an emergency.
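Garces’ snow example can be made concrete with a toy model: fit snowfall from past storms against missed trash pickups, then forecast the strain an incoming storm might cause. The figures below are invented for illustration, and a simple least-squares line is a deliberate simplification of real predictive work:

```python
# (snow_inches, missed_pickups) pairs from hypothetical past storms
history = [(2, 40), (5, 110), (8, 190), (12, 300)]

def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

a, b = fit_line(history)
forecast_snow = 10  # inches expected in the coming storm
print(f"Expected missed pickups at {forecast_snow} in.: {a * forecast_snow + b:.0f}")
```

Even a crude estimate like this lets a city pre-position crews and warn residents about likely service delays before the storm hits.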

Find the Needle in the Data Haystack

Typical procedure for a city’s emergency response team is to solicit images or reports from residents on social media to gauge areas of high need. According to Barney Krucoff, chief technology officer in Washington, D.C., that data can be extremely valuable (if anecdotal) and also foster a relationship of greater trust with citizens. However, data that is unstructured or not geo-tagged can also be overwhelming or even misleading. Organizing the plethora of data inputs and identifying key points for escalation and intervention is another emerging challenge in cities across the U.S.
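One way to sketch that organizing step: keep only the geo-tagged reports and bucket them into coarse grid cells so clusters of need stand out. The sample reports and grid size are assumptions for illustration, not real data:

```python
from collections import Counter

# Resident reports arrive with and without coordinates; only the geo-tagged
# ones can be mapped to areas of need. All sample reports are invented.
reports = [
    {"text": "street flooded near the park", "lat": 38.900, "lon": -77.040},
    {"text": "water rising fast!!"},                      # no geo-tag: hard to act on
    {"text": "basement flooding", "lat": 38.901, "lon": -77.041},
    {"text": "tree down, road blocked", "lat": 38.950, "lon": -77.010},
]

def grid_cell(lat, lon, size=0.01):
    """Bucket a coordinate into a coarse grid cell for hotspot counting."""
    return (round(lat / size), round(lon / size))

hotspots = Counter(
    grid_cell(r["lat"], r["lon"]) for r in reports if "lat" in r and "lon" in r
)

# The cell with the most reports is a candidate for escalation.
print(hotspots.most_common(1))
```

The un-tagged report drops out of the count entirely, which is exactly the problem Krucoff describes: a vivid message with no location attached is hard to route to responders.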

The confusion this type of information inundation can elicit isn’t limited to Twitter, either. Pecherskiy relayed an experience of soliciting data from a wide range of housing organizations, only to find that it arrived in such a variety of formats that analysis, and even the identification of duplicates, was nearly impossible. Well before an emergency strikes, cities need to think carefully about what data they might need and organize it so that it’s actionable in a real disaster scenario.

When it comes to disaster preparedness, it obviously behooves cities to be prepared. The systems can’t simply be in place; they must be tested and improved routinely, not saved for a rainy day.