Can Technology Prevent the Next Galveston?

Smartly deployed, useful and actionable information systems, informed by readily available data, can help alleviate the harmful effects of any disaster.

At the end of the 19th century, the Gulf of Mexico port city of Galveston was home to the largest port in Texas and was a booming center of international trade. By dawn on September 9, 1900, the day after the deadliest natural disaster in American history struck the city, Galveston had almost entirely ceased to exist.

Had it hit only a few years later, a hurricane like the one that toppled Galveston would not have had such an effect; the scale of the 1900 storm's impact resulted from a misunderstanding of how hurricanes would affect the city, a reliance on ships to forecast the weather, and a lack of mass communication. Even by 1904, the necessary knowledge and technology were available: a deeper understanding of weather systems, the use of kites and balloons to probe the upper atmosphere, and widespread wireless telegraphy. But for Galveston, it was too late.

In the past two years, the United States has been ravaged by natural disasters rivaling the strength, if not the actual impact, of the Galveston hurricane. In the spring of 2011, flooding turned urban centers along the Mississippi River into swamps, even as tornadoes wreaked havoc on Joplin and other cities in the Midwest. Massive wildfires scorched southern and western states, a major drought and heat wave baked the middle of the country, and Hurricane Irene and Superstorm Sandy left thousands of people without power, or far worse off, across the Northeast. In 2011 alone, natural disasters carried a $60 billion price tag and a human cost of more than 1,000 dead, 8,000 injured, and many thousands more displaced.

Unlike in 1900, however, the technology to mitigate the impact of these disasters exists today.

If we are to protect our economies and save lives and livelihoods, the current approach to dealing with disaster, one that relies heavily on clunky processes based on intuition and human observation, is no longer adequate. Governments have the opportunity to forge a new path instead, one that would reduce human error and maximize precision by turning readily available data into useful and actionable information systems. That data already exists: city-owned figures such as neighborhood demographics and the locations of service requests, citizen information such as social media postings, and external records such as satellite imagery and environmental data archives. Smartly deployed, these systems can inform a range of government initiatives and programming that can alleviate the harmful effects of any disaster.

We know this approach works. The devastating 2010 earthquake that struck Haiti marked one of the first times such data analytics were applied to disaster management. The Haitian government, working with NGOs like Direct Relief and businesses like the data management company Palantir, relied heavily on public, social, and private data to manage the distribution of aid in the wake of the earthquake. By tracking the locations and names of collapsed buildings, the sites of camps for internally displaced people (IDPs), the content of SMS messages, and more, the government and its partner organizations were able to address a range of issues, from determining which administrative sectors had the most requests for food and how that food could be distributed most efficiently, to sorting out which collapsed buildings contained hazardous materials and whether IDP camps were close enough to those sites to warrant relocation.
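To make that aggregation step concrete, the sketch below shows, in Python, the kind of tally described above: counting aid requests per administrative sector from a feed of SMS-style reports. The data layout, sector names, and request categories are hypothetical illustrations, not the actual structures used in Haiti.

```python
from collections import Counter

# Hypothetical SMS-style aid reports: (administrative sector, requested item).
# In a real deployment these would be parsed from incoming message streams.
reports = [
    ("Sector A", "food"),
    ("Sector B", "water"),
    ("Sector A", "food"),
    ("Sector C", "medicine"),
    ("Sector A", "water"),
    ("Sector B", "food"),
]

# Tally food requests by sector so responders can rank where to send supplies first.
food_requests = Counter(sector for sector, item in reports if item == "food")

for sector, count in food_requests.most_common():
    print(f"{sector}: {count} food request(s)")
```

The same pattern, a filter followed by a group-and-count, scales from this toy list to millions of records once it is backed by a streaming pipeline or a database query.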

After addressing these most pressing issues, the government and its partners went on to map the risk of flooding during Haiti's upcoming rainy season, as well as the risk of cholera outbreaks.

More recently, data analytics have been employed to help the New Orleans community rebound from some of the long-lingering effects of Hurricane Katrina. Still dotted with homes abandoned during the 2005 storm, the city this past year rolled out software, built with data from 311 calls, city department spreadsheets, and public hearings, that collates information about these blighted properties. Working with fellows from Code for America, the government in October 2012 unveiled BlightStatus, an app that allows citizens to identify blighted properties and track the city's progress toward demolishing, revitalizing, or otherwise addressing the damaged structures. The platform has given citizens a stronger voice in how such properties are dealt with, promoted more and better interaction between the city and its residents, and created accountability for the local government. It has been declared a major success on all fronts: in November 2012, New Orleans reported nearly three times as many hearings on blighted property as had taken place in September, just two months prior.

Superstorm Sandy cost an estimated $50 billion in damage.

Other uses of data analytics in disaster management and response are just getting underway. Smart meters that track resource use in real time, now a common feature of many utility networks, provide users with a constant stream of data about their consumption of water or electricity. In times of disaster, these meters can alert utility companies to system outages or irregularities, even down to a single affected building. A variety of smartphone apps now create a two-way disaster alert system, warning users of storms or other hazards heading their way and allowing them to report back what they see on the ground. Meanwhile, researchers are exploring how analysis of mass social media content could give health services departments a means of predicting the spread of disease even before health care providers start to notice a trend.
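To illustrate the smart meter scenario, here is a minimal sketch, in Python, of how a utility might flag suspected outages from a simplified feed of electricity readings. The building identifiers, data layout, and zero-reading threshold are assumptions for illustration, not any utility's actual system.

```python
# Hypothetical recent electricity usage samples (kWh) per metered building.
# A real utility feed would arrive as a timestamped stream from the meter network.
readings = {
    "building-101": [1.2, 1.1, 0.0, 0.0, 0.0],
    "building-102": [0.9, 1.0, 1.1, 0.8, 0.9],
    "building-103": [2.0, 1.8, 0.0, 0.0, 0.0],
}

WINDOW = 3  # consecutive zero readings before an outage is suspected (assumed)

def suspected_outages(readings, window=WINDOW):
    """Flag buildings whose last `window` readings are all zero."""
    flagged = []
    for building, usage in readings.items():
        recent = usage[-window:]
        if len(recent) == window and all(r == 0.0 for r in recent):
            flagged.append(building)
    return flagged

print(suspected_outages(readings))  # ['building-101', 'building-103']
```

Grouping the flagged meters by feeder or neighborhood would then localize the fault, which is how a meter-level signal supports the single-building granularity described above.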

These efforts demonstrate an encouraging and dynamic response to a critical issue: modernizing our government's emergency response mechanisms.

In disaster management, an area in which citizens look almost exclusively to the public sector for guidance and support, policymakers must act smartly, decisively, and with data-driven, scientifically reasoned backing. The Galveston hurricane of 1900 produced a death toll of between 6,000 and 12,000. Katrina, in 2005, killed roughly 1,800. When disaster strikes next, we know we can do better.
 
This story was originally published by Data-Smart City Solutions.
