During a disaster, life-saving decisions are often made based on the most current information about a situation and past experience in similar circumstances. While that’s a tried-and-true approach, the availability of complex, computer-generated data streams is changing the game for some emergency managers.

Large volumes of data sets — commonly referred to as big data — derived from sophisticated sensors and social media feeds are increasingly being used by government agencies to improve citizen services through visualization and GIS mapping. In addition, big data is enabling responders to react to disasters more efficiently.

Volunteers at Splunk, an operational intelligence software provider, are involved in a project that culls data from Twitter feeds. By analyzing keywords along with time and place information, they can unearth patterns of activity in a particular area.

The idea was used during Superstorm Sandy. FEMA created an innovation team composed of public agencies and private companies. One of the participants was Geeks Without Bounds, a nonprofit humanitarian project accelerator, which partnered with Splunk’s charity arm, Splunk4Good, to apply the social media analysis.

Team members working on the project looked at hashtags and words in Twitter feeds as well as Instagram photos related to Sandy, evacuation rates in specific areas and other keywords about resources, such as power, food, fuel and water. Using that data, the team plotted out locations where supplies might be most needed and got a finger on the pulse of a community’s sentiment about available resources.
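The article doesn’t detail how the team’s analysis worked under the hood, but the approach it describes — tallying resource-related keywords by location to flag where supplies may be needed — can be sketched in a few lines. This is an illustrative example only; the field names, keyword list and sample tweets are assumptions, not Splunk4Good’s actual pipeline.

```python
from collections import Counter

# Hypothetical keyword list based on the resources named in the article.
RESOURCE_KEYWORDS = {"power", "food", "fuel", "water"}

def tally_resource_mentions(tweets):
    """Count resource-related keyword mentions per (area, keyword) pair.

    Assumes tweets have already been collected as dicts with an "area"
    tag (e.g. from geotags or place metadata) and a "text" field.
    """
    counts = Counter()
    for tweet in tweets:
        words = set(tweet["text"].lower().split())
        for keyword in words & RESOURCE_KEYWORDS:
            counts[(tweet["area"], keyword)] += 1
    return counts

# Invented sample data for illustration.
tweets = [
    {"area": "Rockaway", "text": "No power still, need fuel for generator"},
    {"area": "Hoboken", "text": "Water distribution at city hall"},
    {"area": "Rockaway", "text": "Power out since Monday"},
]
print(tally_resource_mentions(tweets))
```

High counts for a given area-and-keyword pair would point responders toward neighborhoods where that resource is most in demand; a real system would layer on time windows, sentiment scoring and map visualization.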

“You can imagine the ways it can be used in real time for response during an emergency,” said Stephanie Davidson, director of federal civilian sales for Splunk. “It’s really helpful for where to allocate those resources to the places that need them most.”

Government agencies have been using social media data for sentiment analysis and public relations for a while. But according to Art Botterell — associate director of the Disaster Management Initiative at Carnegie Mellon Silicon Valley — practical use by emergency management agencies for response, recovery and preparation activities is fairly new.

Botterell called current efforts of emergency managers using social media a period of rich experimentation, where decision-makers must determine whether big data derived from Twitter and Facebook should be further incorporated into practical emergency situations, or used simply as a communication tool.

“This is an area that has been technology- and concept-driven, which is how most innovation happens, but now we’re getting to the point where it all falls under the big data tent [and] how do we know what is more useful and less useful,” Botterell said. “This is a conversation that I haven’t heard emergency managers having.”

The Challenges

While computer-generated data has long been a staple of decision-making for government and emergency personnel, big data takes the volume and complexity to another level. As the data has expanded, so has the ability of companies and individuals to analyze it and apply the findings.

Theresa Pardo, director of the Center for Technology in Government at the University at Albany, State University of New York, said the extent to which emergency management organizations can embrace big data depends on the culture within those agencies. If resources allow analysts to spend time combing through data and producing usable presentations of it, high-volume data can be an asset.

But Pardo admitted that’s an ideal situation that’s likely not present in most emergency management agencies nationwide.

“That perfect model doesn’t really exist everywhere,” she said. “If we think about the adoption of big data, we also have to look at the maturity of … the data use environment generally within any emergency management community or agency.”

Ted Okada, chief technology officer of FEMA, agreed, saying that emergency agencies and the industry as a whole are still in the very early stages of using big data. As a community, he said, emergency managers need to learn how to extract the right bits of information at an early stage during a disaster.

GIS was one of the first forays into complex data streams for FEMA. The agency works closely with a variety of organizations such as the National Oceanic and Atmospheric Administration and the U.S. Geological Survey to access their real-time data and create predictive models containing high-resolution maps and sensor data to help FEMA prepare for storms and other events.

Brian Heaton was a writer for Government Technology magazine from 2011 to mid-2015.