State leaders gathered to discuss the challenges and next steps their respective states face in the cybersecurity landscape during the winter meeting of the National Governors Association (NGA) in Washington, D.C., on Feb. 25.
Virginia Gov. Terry McAuliffe, who also chairs the NGA cybersecurity committee, facilitated the roughly hour-long panel conversation, saying that states increasingly handle more data than federal agencies and that the need for improved coordination has never been greater. According to the governor, Virginia saw more than 86 million cyberattacks in 2016, as well as direct attacks on his own state-issued email account.

“As you know, the governors of our nation actually have more data than the federal government. When you think of all the data we have, through our state tax returns through the Medicaid and health-care programs we provide, department of motor vehicles, we have a wealth of information that every single day people are trying to get in and get our information through cyberthreats and cybercriminals,” McAuliffe said.

In addition, he called for better coordination and the establishment of basic cybersecurity protocols among the 50 states. Among the areas where states need to improve, McAuliffe pointed to bolstering critical infrastructure protections and vulnerability assessments; ensuring timely, consistent and useful briefings; aligning with National Institute of Standards and Technology standards; and taking a long-term view of goals and response plans.

“If Virginia is in great shape and does a great [job] with cybersecurity, it is absolutely meaningless if some other state doesn’t do anything about cybersecurity,” he explained. “If they have the same health-care provider, they will use that smaller state and go through that health-care provider to get a back door into the commonwealth of Virginia.”

In Arkansas, Gov. Asa Hutchinson said several efforts are underway to secure the infrastructure and systems that the government and citizens rely on.
In addition to a third-party cybersecurity risk assessment, the governor said staff are working to consolidate data centers and enterprise architecture under a single agency, the Department of Information Services.

After dealing with various cyberattacks against state assets, including a denial-of-service attack that downed the state’s website, Hutchinson said the potential loss of constituent data is a constant concern. He also said the vulnerabilities present in the infrastructure space warrant careful consideration and protection efforts, in both the public and private sectors.

“There is significant worry on a governor’s part if the energy grid goes down because that impacts our response, our cost to the state. And so, there is a regulatory challenge to us to make sure that our private sector, that is regulated, that they are investing as they need in cybersecurity and protection as well.”

The 2014 cyberattacks launched against Oregon’s campaign finance and business registry websites gave then-Secretary of State Kate Brown some up-close experience to draw from as governor. The response produced “stronger walls,” but repairing the damage done came at a significant cost.

“Since that cyberattack in 2014, my state has taken a number of steps to address system deficiencies and increase our IT security posture,” she said. “I initiated an audit that uncovered numerous structural security gaps, and then as governor, I issued an order to unify responsibility and upgrade Oregon’s capabilities.”

Brown said she is currently supporting a state legislative effort to create a cybersecurity center of excellence and is participating in a policy course through the NGA dedicated to further enhancing the state’s cybersecurity standing.

“I believe that we have to build tools that the public can put their confidence in even when doing something as simple as buying a fishing license — and we like to buy fishing licenses in Oregon,” Brown said.
Former Assistant Attorney General John Carlin spoke more broadly about the existing cyberthreat space, saying that no one is quite where they need to be when it comes to the cybersecurity landscape.

“What we’ve found is that we are just at the beginning of this conversation where people are really treating it like the risk that it is,” Carlin said. “We did so systematically, across the board, making those decisions without adequately thinking through what the risks were. So now, as a country, and really much of the world is playing catch up, knowing that the ability to cause harm way outweighs our ability to protect ourselves.”

The influx of relatively new technologies, like the Internet of Things and autonomous vehicles, has broadened the opportunities for bad actors while increasing the areas cybersecurity professionals must watch. While Carlin said the increasingly prevalent tools will bring a host of benefits, he cautioned that “we can’t make the same mistake again of not building in security-by-design on the front end.”

As Internet pioneer and panelist Vinton Cerf asserted, the main problem with cybersecurity to this point has been the software itself, and patches are not always provided when issues are discovered.

“The root of all of this problem: It’s the software, stupid,” he said. “We don’t know how to write software that doesn’t have bugs. We’ve been trying for 70 years, since computers have been available.”

One potential fix could be the development of tools that alert a coder during the development stages that a problem may be present, bridging the gap between intent and execution. This type of tool, Cerf said, could be an opportunity for state education institutions and government to partner. The Internet expert also argued for more widely implemented two-factor authentication.
Though he acknowledged the method comes with some inconvenience, the result, he said, is ultimately more secure systems.