Data can be a fundamental tool in disaster preparedness, but the insights aren’t always heeded. This was the observation of three emergency management experts from academia, government and the private sector in an exchange last week on natural disaster data.
The trio spoke about data use for city resilience at the Atlantic CityLab Summit in Los Angeles on Sept. 29. They said analysis of the data shows an overwhelming need for infrastructure improvements, but that states and cities typically take short-term savings over long-term protections against catastrophe.
Lucy Jones, a seismologist at the U.S. Geological Survey (USGS), is collaborating with Los Angeles to draft a seismic-resilience plan. She said the city is a prime example of what happens when there is an abundance of data and an absence of investment in disaster preparation. About 85 percent of the city's water supply is delivered by aqueducts that cross the southern San Andreas Fault, a fault line the USGS estimates will generate a major earthquake sometime in the next decade or so. The danger centers on indications that the city's aqueducts will break, leaving residents only a six-month supply of water reserves, she said.
Put in context, those reserves are dismally inadequate: the aqueducts would require an estimated 18 months to repair, leaving a one-year gap without water.
"When the San Andreas earthquake happens in Southern California — and that's the most likely big earthquake in the U.S. — we know, for all of the transportation lifelines, the electric systems, the water systems, the gas lines that cross the San Andreas Fault, exactly where they'll break and what will happen when they break," Jones said. "[Yet] that hasn't gotten anybody to do anything about them."
Jones hopes the seismic-resilience plan will help city officials find data-driven remedies and propose new ordinances, such as expanding the city's capacity for water reserves.
Among the state's homeowners, procrastination on earthquake retrofits mirrors the inaction from officials. Jones questioned whether such construction, which can run thousands of dollars per housing unit, should be left to residents or whether government should be an enabler. Judging by GDP figures, a financial investment in retrofits and other infrastructure is likely to pay off when earthquakes or other natural disasters hit.
In an analysis of data from New Orleans after Hurricane Katrina, direct losses from the devastation totaled an estimated $80 billion, according to Jones. The lion's share of the loss, however, is ongoing: in the seven years since the hurricane hit, the city has lost $105 billion in GDP and continues to lose GDP at a rate of $15 billion per year, she added.
“The consequences and benefits of retrofitting really go to a lot of other people and to the whole community,” she said.
The tendency to defer investment in emergency management tools and infrastructure stems from two mistaken beliefs: that such expenditures are unjustifiable because they don't serve immediate needs, and that large emergencies are infrequent.
Brian Wolshon, a professor of civil engineering at Louisiana State University and a consultant focused on road and highway infrastructure, pointed out that evacuations of 1,000 people or more happen every two weeks somewhere in the U.S. Further, emergency enhancements don't have to be costly standalone projects; they can be woven into a wide range of city projects, from roads to facilities to civic services.
"We can look to see how we can integrate this overall idea of community resilience into that overall framework," Wolshon said. "That's not something that costs a lot of money; that's just changing mindsets."
Data, and especially open data (government data published online for citizens), was billed as an emergency management tool easily repurposed outside large-scale disasters. Brian Fishman, a representative of Palantir Technologies, a vendor of data management platforms for disaster relief and humanitarian work, pointed out that city data systems can be leveraged for insight on challenges both inside and outside of emergencies.
"Data is just information combined together so that it informs decisions,” Fishman said. “When you put that information into a common format you're going to use it in a million ways not just for resilience. You're going to use it to find water pipes that need to be replaced. You're going to use it to survey buildings for code violations and not just susceptibility of earthquakes.”
In disasters, such data investment assists on numerous fronts: saving lives, improving department efficiency and stimulating economic recovery, he said.
One example of data put to use for resilience is the recent public-private partnership between the U.S. Federal Emergency Management Agency and tech startup Appallicious. On July 29, the two unveiled the Disaster Assessment and Assistance Dashboard (DAAD) to aid citizens seeking services during and after major catastrophes. The basic platform, free to cities and upgradable at scaled costs, supports local economies by linking recovery efforts to residents and businesses, with skills, services and equipment all posted and mapped for use.
In another example using open data, Oakland, Calif.'s civic tech brigade OpenOakland, part of Code for America's volunteer network of technologists, recently launched SoftStory, a Web app that maps soft-story buildings (structures with weak first floors that are especially vulnerable in earthquakes) to educate officials and citizens about known vulnerabilities.
Whatever technology's expanding role in natural disasters, the three underscored that data is only as valuable as the actions that follow it.
"What we've really learned, and what we've seen over and over again, is the old adage that 'an ounce of prevention is worth a pound of cure' and 'failing to plan is really planning for failure,'" Wolshon said.
Jason Shueh is a former staff writer for Government Technology magazine.