In 2012, Hurricane Sandy rocked the East Coast in a way the region was largely unprepared for. While the Gulf states have long had plans in place for withstanding hurricanes, the Northeast dedicated significantly fewer resources to hurricane preparedness. And it showed: The storm knocked out power to more than 8 million homes as far west as Michigan, and the cascading effects were felt "downstream."
And while the transportation, energy, water and communications sectors understand the consequences of a failure within their own systems, what happens “downstream” — to interconnected infrastructure sectors — is not as clear, according to Fred Krimgold, director of the Disaster Risk Reduction Program at Virginia Tech's Advanced Research Institute.
“Up in New Jersey and New York after Sandy, a big problem was fuel shortage, not because they didn’t have fuel but because they didn’t have electric power,” he told Emergency Management, sister publication to Government Technology, adding that all the fuel was in tanks in the ground and had to be pumped electrically.
Part of that understanding and finding solutions to those cascading infrastructure issues falls to Brandon Wales, director of the Office of Cyber and Infrastructure Analysis (OCIA) at the Department of Homeland Security. “Our responsibility is to really understand how infrastructures operate, how they work together, what are the connections, the dependencies, the interdependencies between infrastructure," he said, "and ultimately to understand what happens when those infrastructures fail, are disrupted or attacked ...”
One of the most effective ways of anticipating different types of malicious and non-malicious events to both physical and digital infrastructure is through computer modeling and data visualization tools. Building interactive tools, Wales said, provides significantly higher returns than a static sheet of numbers.
Although interactive data visualizations are still in the developmental stages, Wales said he believes they are the future of data dissemination.
“I think this is the way people are used to consuming information in the 21st century, being able to interact with it,” he said. “Not just stare at a PDF file, but test it, understand it, figure out what's really most important to them and drive in on those kinds of issues."
And that, Wales added, is much more difficult to do in a 100-page PDF document.
“There is an art to displaying complex information in a way that is honest, but is also insightful and easy to use and easy to understand,” said Charles Rath, CEO of Resilient Solutions 21 (RS21), which creates interactive data visualizations from models built on layered data. The team behind RS21 includes gamers and Hollywood special-effects veterans who have worked to make the company's models as interactive and as interesting as possible.
Making the online tool accessible and easy to use was a conscious decision, explained Kameron Baumgardner, visual informatics lead for RS21. The more interesting something is to look at, or the more fun it is to use, the more likely people are to engage. And getting public employees to buy in is a lot easier once results can be derived and explained with visualization.
The big-data visualization tools, however, are only as valuable as the amount and quality of the data that goes in, Wales said.
“Ultimately, data is the lifeblood of the analysis and modeling that we do,” he said. “Without it, we are not going to be successful, so the more the data is available, the more that it is accessible to us, the better that we're going to be able to do in fulfilling our mission.”
It is a two-way street, Wales explained. “The more data you share, the higher fidelity analysis that we can provide.”
This sometimes becomes a sticking point across agencies that are concerned with oversharing sensitive data. One project OCIA took on was to create an infrastructure model in northern New Jersey near the state turnpike. The area’s water systems were a mixture of state- and investor-owned utilities that did not want their data publicly accessible, potentially revealing certain vulnerabilities within the system.
However, once OCIA built the regional water model — showing what would happen should any individual water system be disrupted, and how that disruption would propagate to neighboring systems — the utilities could see vulnerabilities they were not prepared for. Once the value was understood and the transactional nature of data sharing and its returns were felt, the utilities became more trusting and willing to hand over data.
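The core idea behind that kind of dependency model can be illustrated in a few lines. The sketch below is hypothetical — it is not OCIA's actual model, and the system names and dependency graph are invented — but it shows how a single failure propagates to every downstream system that relies on the disrupted one:

```python
from collections import deque

# Hypothetical dependency graph (invented for illustration): an edge
# "SystemA" -> "SystemB" means SystemB draws supply from SystemA.
dependencies = {
    "SystemA": ["SystemB", "SystemC"],
    "SystemB": ["SystemD"],
    "SystemC": [],
    "SystemD": [],
}

def cascade(failed_system, graph):
    """Return every system disrupted, directly or downstream, by one failure."""
    disrupted = {failed_system}
    queue = deque([failed_system])
    while queue:
        current = queue.popleft()
        for downstream in graph.get(current, []):
            if downstream not in disrupted:
                disrupted.add(downstream)
                queue.append(downstream)
    return disrupted

print(sorted(cascade("SystemA", dependencies)))
# → ['SystemA', 'SystemB', 'SystemC', 'SystemD']
```

A failure at the top of the chain disrupts systems that never touch it directly — exactly the "downstream" effect the utilities had not planned for until the model made it visible.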
OCIA also runs the Protected Critical Infrastructure Information (PCII) program, which allows infrastructure owners and operators to share data in a protected way. Information shared through PCII is exempt from FOIA and other public sunshine laws.
RS21 is working to build a reputation for itself and create examples of how useful these models can be. “Many times we go into cities and we know more about the city from a data perspective than the city does,” said Rath. “We can say, ‘Look at your data. This is what it's showing you.’ It's really empowering for the community.”
One project from RS21 was to create a flood inundation model of the Charleston peninsula in South Carolina. By adding in all the layers of data publicly available, they showed the National Oceanic and Atmospheric Administration (NOAA) which buildings would be flooded in the event of different categories of hurricanes. “That is very real and visceral and powerful for people to see,” said Rath. “The likelihood of them taking action when they see that is drastically increased.”
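In its simplest form, that kind of inundation comparison amounts to checking each building's elevation against an assumed surge height for a given storm category. The sketch below is a toy illustration only — the buildings, elevations and surge heights are all invented, not Charleston or NOAA data:

```python
# Assumed storm-surge height in feet by hurricane category (illustrative values,
# not NOAA figures).
SURGE_FT = {1: 4, 2: 6, 3: 9, 4: 13, 5: 18}

# Invented buildings with made-up ground elevations above sea level, in feet.
buildings = {
    "City Hall": 7.0,
    "Hospital": 12.5,
    "Waterfront Pier": 2.0,
}

def flooded(category, elevations):
    """Buildings whose ground elevation sits below the assumed surge height."""
    surge = SURGE_FT[category]
    return sorted(name for name, elev in elevations.items() if elev < surge)

for cat in (1, 3, 5):
    print(f"Category {cat}: {flooded(cat, buildings)}")
```

Real models layer in far more data — terrain, drainage, tide stage — but even this toy version shows why seeing which specific buildings go under at each category is "visceral" in a way a table of surge heights is not.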
Visualization and modeling tools are not reserved solely for disaster preparedness; they have also been used by city planners and government architects. RS21 has also created models and data visualizations to help reduce youth violence in Mexico. “I think any massive socio-economic behavioral issue that we're facing in humanity can benefit from the better visualization of data, and at a higher fidelity,” said Baumgardner.
Another data visualization player has helped Chicago map both its above-ground infrastructure and underground assets. An online tool from Cityzenith allows users to “point and click on any building and retrieve the entire history in one consolidated place,” said Chairman and CEO Michael Jansen.
The program has been used to log 311 calls by residents, consolidating information in order to see long-term trends and problematic areas. The underground mapping provides planners with a sophisticated 3-D visualization of their underground assets paired with complex data. While the company has pivoted to focus on architecture, engineering and construction firms working in smart cities, many tools are provided to the cities for free.
Data visualization and modeling is also a game changer for sustainable infrastructure, explained Dominique Davison, CEO of PlanIT Impact. The online tool creates models based on federal, state and local data about potential infrastructure being built with regard to energy, water use, stormwater drainage, greenhouse gas emissions, proximity to public transportation and more.
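One common way tools like this condense several environmental metrics into a single comparable number is a weighted score. The sketch below is a generic illustration of that technique, not PlanIT Impact's actual method — the metric names, weights and scores are all assumptions:

```python
# Assumed weights for each 0-100 metric score (invented for illustration;
# not PlanIT Impact's actual weighting).
WEIGHTS = {
    "energy": 0.3,
    "water_use": 0.2,
    "stormwater": 0.2,
    "ghg_emissions": 0.2,
    "transit_proximity": 0.1,
}

def impact_score(metrics):
    """Weighted average of per-metric scores; weights sum to 1.0."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# Hypothetical scores for a proposed building.
proposal = {"energy": 80, "water_use": 60, "stormwater": 70,
            "ghg_emissions": 50, "transit_proximity": 90}
print(impact_score(proposal))  # → 69.0
```

The single number makes two design alternatives easy to compare at a glance, while the per-metric breakdown behind it shows where a proposal falls short.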
Data visualization is where the information market is headed, Wales reiterated, as the potential for these technologies is hard to overstate.
“We think that really it's a way to engage with our customers and our stakeholders in a way that just can't be done staring at PDF files,” he said. “If we have this conversation in a year, I'll hopefully be telling you a lot of success stories about the feedback that we're getting from folks about what we're deploying in the future.”
Ryan McCauley was a staff writer for Government Technology magazine from October 2016 through July 2017, and previously served as the publication's editorial assistant.