For example, during Sandy, FEMA accessed more than 150,000 geo-tagged photos from the Civil Air Patrol, which helped the agency perform assessments and make better decisions.

“All that big data helped us very quickly come to a very definitive answer on how many people were affected,” said FEMA Geospatial Information Officer Chris Vaughan. “It helped us determine who was exposed and where there were structural damages so we could do a better job of providing assistance to disaster survivors faster than we have ever done before.”

Social media is a different story. Ole Mengshoel, associate research professor of electrical and computer engineering at Carnegie Mellon University's Silicon Valley campus, said restrictions on the availability of public data on social media sites could slow progress in using it as a reliable tool in the big-data arena. Users who protect their tweets and make their Facebook postings private limit the amount of data available, which in turn limits the data's scope and dependability.

From an academic point of view, Mengshoel said it "would be a pity" if big data's potential based on social media data streams went unrealized because the companies were too protective of that data. Although there are privacy and proprietary concerns with sharing some of that information, Mengshoel said that for emergency managers to truly harness the power of social media data, they'll need the ability to sample or access it.

GIS and sensor data may be easier to come by, but presenting that data in a useful form can be a daunting task. Vaughan said it is "insane" how many layers of information can be embedded on a Web-based map. The real challenge, he said, lies in putting the data in an easily understood format for emergency managers.

“The faster we can provide imagery to the right person or group of people with the right assessments, it helps us streamline and make better decisions,” he said.

Looking Ahead

Despite the challenges, Pardo feels the attention on big data will eventually benefit the industry. Because so much new data is being generated, she believes decision-makers will grow more confident in leveraging analytical information for policy development, program evaluation and delivery.

Pardo called big data’s exposure in the last few years a mutually reinforcing process that draws attention to the need for a higher level of capability to use data more generally in the emergency management community, be it big or small.

Event simulation is one area Pardo felt big data could improve. She said that as a major part of responders' preparation activities, disaster simulations can at times suffer from a lack of statistical information to fuel predictive models.

So when responders train for earthquakes, hurricanes or even shoreline erosion events, large-volume data sets could help increase the accuracy and reliability of those models.

“We’re in the phase right now where there’s a lot of very obvious and relatively straightforward ways to use these large-volume data sets,” Pardo said. “But we’re just beginning to develop new analytical tools and techniques to leverage that data.”

While Splunk4Good has made some inroads, Botterell said, improving efficiency using big data could take some time. Actual emergency situations aren't the best times to test the quality of data and run experiments because lives are usually at stake, he explained.

Exposing people to large data sets doesn't mean decision-making will be more accurate, Okada said. A small fraction of a larger set of trends can be overlooked, he said, and that oversight can lead to a bad decision during a disaster.

Instead of relying solely on data, Okada said, a three-pronged approach can help protect decision-makers from the pitfalls of information overload. He referenced a principle from Robert Kirkpatrick, director of Global Pulse, a United Nations Secretary-General initiative that aims to harness the power of big data, as one way to prevent mistakes.

Kirkpatrick advocates combining the power of analytics with the human insight of experts and the wisdom of crowds.

“That kind of data triangulation can help protect us going forward,” Okada said.

Brian Heaton  |  Senior Writer

Brian Heaton is a senior writer for Government Technology. He primarily covers technology legislation and IT policy issues. Brian started his journalism career in 1998, covering sports and fitness for two trade publications based in Long Island, N.Y. He's also a member of the Professional Bowlers Association, and competes in regional tournaments throughout Northern California and Nevada.