How Emergency Managers Can Benefit from Big Data

Large volumes of data sets derived from sophisticated sensors and social media feeds are increasingly being used by government agencies.

During a disaster, life-saving decisions are often made based on the most current information about a situation and past experience in similar circumstances. While that’s a tried-and-true approach, the availability of complex, computer-generated data streams is changing the game for some emergency managers.

Large volumes of data sets — commonly referred to as big data — derived from sophisticated sensors and social media feeds are increasingly being used by government agencies to improve citizen services through visualization and GIS mapping. In addition, big data is enabling responders to react to disasters more efficiently.

Volunteers at Splunk, an operational intelligence software provider, are involved in a project that culls data from Twitter feeds. By analyzing keywords along with time and place information, they can unearth patterns of activity in a particular area.

The idea was used during Superstorm Sandy. FEMA created an innovation team composed of public agencies and private companies. One of the participants was Geeks Without Bounds, a nonprofit humanitarian project accelerator, which partnered with Splunk’s charity arm, Splunk4Good, to apply the social media analysis.

Team members working on the project looked at hashtags and words in Twitter feeds as well as Instagram photos related to Sandy, evacuation rates in specific areas and other keywords about resources, such as power, food, fuel and water. Using that data, the team plotted out locations where supplies might be most needed and got a finger on the pulse of a community’s sentiment about available resources.
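
To make the approach concrete, a minimal sketch of that kind of keyword-and-location analysis is shown below. The posts, keyword list and grid size are invented for illustration and are not the Splunk4Good team’s actual code.

```python
# Hypothetical sketch: tally resource-related keywords in geo-tagged posts and
# surface the grid cells where requests cluster. The keywords, data layout and
# cell size are illustrative assumptions, not the actual Splunk4Good project.
from collections import Counter

RESOURCE_KEYWORDS = {"power", "food", "fuel", "water"}
CELL_SIZE = 0.05  # degrees; roughly neighborhood-scale bins

def to_cell(lat, lon, size=CELL_SIZE):
    """Snap a coordinate onto a coarse grid so nearby posts group together."""
    return (round(lat / size) * size, round(lon / size) * size)

def hot_spots(posts, top_n=5):
    """posts: iterable of dicts like {'text': ..., 'lat': ..., 'lon': ...}."""
    counts = Counter()
    for post in posts:
        words = {w.strip("#.,!?").lower() for w in post["text"].split()}
        if words & RESOURCE_KEYWORDS:  # mentions at least one resource need
            counts[to_cell(post["lat"], post["lon"])] += 1
    return counts.most_common(top_n)

# Example with made-up posts:
sample = [
    {"text": "No power or water on our block #Sandy", "lat": 40.58, "lon": -73.95},
    {"text": "Gas stations out of fuel here", "lat": 40.58, "lon": -73.96},
    {"text": "Beautiful sunrise over the harbor", "lat": 40.70, "lon": -74.00},
]
print(hot_spots(sample))  # cells with the most resource-related mentions
```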

“You can imagine the ways it can be used in real time for response during an emergency,” said Stephanie Davidson, director of federal civilian sales for Splunk. “It’s really helpful for where to allocate those resources to the places that need them most.”

Government agencies have been using social media data for sentiment analysis and public relations for a while. But according to Art Botterell — associate director of the Disaster Management Initiative at Carnegie Mellon University, Silicon Valley — practical use by emergency management agencies for response, recovery and preparation activities is fairly new.

Botterell called current efforts of emergency managers using social media a period of rich experimentation, where decision-makers must determine whether big data derived from Twitter and Facebook should be further incorporated into practical emergency situations, or used simply as a communication tool.

“This is an area that has been technology- and concept-driven, which is how most innovation happens, but now we’re getting to the point where it all falls under the big data tent [and] how do we know what is more useful and less useful,” Botterell said. “This is a conversation that I haven’t heard emergency managers having.”


The Challenges


While computer-generated data has been a staple in decision-making processes for government and emergency personnel in the past, big data takes the volume and complexity to another level. As the data has expanded, so has the ability of companies and individuals to analyze it and apply the findings.

Theresa Pardo, director of the Center for Technology in Government at the University at Albany, State University of New York, said the extent to which emergency management organizations can embrace big data depends on the culture within those agencies. If resources allow analysts to spend time combing through data and producing usable presentations of it, high-volume data can be an asset.

But Pardo admitted that’s an ideal situation that’s likely not present in most emergency management agencies nationwide.

“That perfect model doesn’t really exist everywhere,” she said. “If we think about the adoption of big data, we also have to look at the maturity of … the data use environment generally within any emergency management community or agency.”

Ted Okada, chief technology officer of FEMA, agreed, saying that emergency agencies and the industry as a whole are still in the very early stages of using big data. As a community, he said, emergency managers need to learn how to extract the right bits of information at an early stage of a disaster.

GIS was one of the first forays into complex data streams for FEMA. The agency works closely with a variety of organizations such as the National Oceanic and Atmospheric Administration and the U.S. Geological Survey to access their real-time data and create predictive models containing high-resolution maps and sensor data to help FEMA prepare for storms and other events.
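
As an illustration of the kind of real-time public feed involved, the sketch below pulls the U.S. Geological Survey’s earthquake GeoJSON summary and lists recent events above a magnitude threshold. The feed URL and field names follow USGS’s published format; FEMA’s own pipelines are not described in the article and are not reproduced here.

```python
# Illustrative sketch: read a public real-time sensor feed (USGS earthquakes)
# and list recent events above a magnitude threshold. This is not FEMA's
# tooling, just an example of consuming one such open data stream.
import json
import urllib.request

USGS_FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson"

def significant_quakes(min_magnitude=4.0):
    """Return (magnitude, place, lat, lon) tuples for recent large events."""
    with urllib.request.urlopen(USGS_FEED, timeout=30) as resp:
        feed = json.load(resp)
    events = []
    for feature in feed["features"]:
        props = feature["properties"]
        lon, lat = feature["geometry"]["coordinates"][:2]
        if props["mag"] is not None and props["mag"] >= min_magnitude:
            events.append((props["mag"], props["place"], lat, lon))
    return sorted(events, reverse=True)

for mag, place, lat, lon in significant_quakes():
    print(f"M{mag:.1f}  {place}  ({lat:.2f}, {lon:.2f})")
```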

For example, during Sandy, FEMA accessed more than 150,000 geo-tagged photos from the Civil Air Patrol, which helped the agency perform assessments and make better decisions.

“All that big data helped us very quickly come to a very definitive answer on how many people were affected,” said FEMA Geospatial Information Officer Chris Vaughan. “It helped us determine who was exposed and where there were structural damages so we could do a better job of providing assistance to disaster survivors faster than we have ever done before.”
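
A rough sketch of how such geo-referenced inputs can feed an exposure estimate appears below. The grid cells, damage flags and population figures are invented for illustration and are not drawn from FEMA’s workflow.

```python
# Hypothetical sketch: overlay damage observations (e.g. derived from
# geo-tagged aerial photos) on a population table to estimate how many
# people were affected. All inputs here are invented.
def estimate_affected(damaged_cells, population_by_cell):
    """Sum the population of every grid cell flagged as damaged."""
    return sum(population_by_cell.get(cell, 0) for cell in damaged_cells)

# Made-up inputs: cells keyed by (lat, lon) of their southwest corner.
population_by_cell = {
    (40.6, -74.1): 12000,
    (40.6, -74.0): 9500,
    (40.7, -74.0): 15000,
}
damaged_cells = {(40.6, -74.1), (40.6, -74.0)}

print(estimate_affected(damaged_cells, population_by_cell))  # -> 21500
```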

Social media is a different story. Ole Mengshoel, associate research professor of electrical and computer engineering for Carnegie Mellon University, Silicon Valley, said restrictions on the availability of public data on social media sites could slow progress in using it as a reliable tool in the big-data arena. Users who protect their tweets and make their Facebook postings private limit the amount of data available and therefore impact the data’s scope and dependability.

From an academic point of view, Mengshoel said it “would be a pity” if big data’s potential based on social media data streams went unrealized because the companies were too protective of it. Although there are privacy and proprietary concerns with sharing some of that information, Mengshoel said that for emergency managers to truly harness the power of social media data, they’ll need the ability to sample or access it.

GIS and sensor data may be easier to come by, but presenting that data in a useful form can be a daunting task. Vaughan said it is “insane” how many layers of information can be embedded on a Web-based map. The real challenge, said Vaughan, lies in putting the data in an easily understood format for emergency managers.

“The faster we can provide imagery to the right person or group of people with the right assessments, it helps us streamline and make better decisions,” he said.



Looking Ahead


Despite the challenges, Pardo feels the attention on big data will eventually benefit the industry. She believes that because there’s so much new data being generated, decision-makers will get more confident leveraging analytical information in policy development, program evaluation and delivery.

Pardo called big data’s exposure in the last few years a mutually reinforcing process that draws attention to the need for a higher level of capability to use data more generally in the emergency management community, be it big or small.

Event simulation is one area that Pardo felt big data could help improve. She said that as a major part of responders’ preparation activities, disaster simulations can at times suffer from a lack of statistical information to fuel predictive models.

So when responders train for earthquakes, hurricanes or even shoreline erosion events, large-volume data sets could help increase the accuracy and reliability of those models.
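
One plausible illustration, not taken from Pardo’s remarks: resampling a historical record of event intensities to estimate how often an exercise scenario’s threshold would be exceeded. The surge heights and threshold below are invented.

```python
# Illustrative sketch: bootstrap a historical record of storm surge heights to
# estimate how often an exercise scenario's threshold would be exceeded.
# The historical values and threshold are invented for this example.
import random

historical_surge_m = [0.8, 1.1, 1.3, 0.9, 2.4, 1.7, 3.1, 1.2, 2.0, 1.5]

def exceedance_probability(record, threshold_m, trials=100_000, seed=42):
    """Fraction of resampled events that meet or exceed the planning threshold."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.choice(record) >= threshold_m)
    return hits / trials

print(f"P(surge >= 2.0 m) is roughly {exceedance_probability(historical_surge_m, 2.0):.2f}")
```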

“We’re in the phase right now where there’s a lot of very obvious and relatively straightforward ways to use these large-volume data sets,” Pardo said. “But we’re just beginning to develop new analytical tools and techniques to leverage that data.”

Although Splunk4Good has made some inroads, Botterell said, improving efficiency using big data could take some time. Actual emergency situations aren’t the best times to test the quality of data and run experiments because lives are usually at stake, he explained.

Exposing people to large data sets doesn’t mean decision-making will be more accurate, Okada said. Overlooking even a small fraction of a larger set of trends, he said, can lead to a bad decision during a disaster.

Instead of relying solely on data, Okada said a three-pronged approach can help protect decision-makers from the pitfalls of information overload. He referenced a principle from Robert Kirkpatrick, director of the Global Pulse initiative of the United Nations Secretary-General, a program that aims to harness the power of big data, as one way to prevent mistakes.

Kirkpatrick advocates using the power of analytics combined with the human insight of experts and leveraging the wisdom of crowds.

“That kind of data triangulation can help protect us going forward,” Okada said.

Brian Heaton was a writer for Government Technology and Emergency Management magazines from 2011 to mid-2015.