Report: What is the U.S. Government’s Role in Cybersecurity?

Mounting threats, new technology and the Wild West mentality of the Web spur discussion on whether the government should assume more of a leadership role.

As the Internet evolves, so do the threats lurking within it. But this ongoing evolution raises a larger question: Who is really responsible for bolstering private and public defenses? Is the Wild West approach to the Web going to cut it in the long run, or will someone have to take the lead when it comes to a national cybersecurity game plan?
 
As it stands, it could be argued that cybersecurity in the United States boils down to an “every man for himself” approach. For the most part, companies operate independently of one another and of the government, and the same is true in reverse. But has this approach been effective? Recent large-scale data breaches of the Office of Personnel Management and the IRS have prompted renewed consideration of the topic.
 
Alan Webber is the lead researcher behind an International Data Corp. (IDC) 2015 report, Business Strategy: Defining the U.S. Government Role in Cybersecurity, which tackles whether the government needs to drive modernization into the next generation of national cybersecurity.
 
“Basically cybersecurity has gone beyond being just a government problem or a private-sector problem,” Webber said. “For a long time we thought if we built a big enough wall, we could keep [intruders] out, and that’s not the case anymore.” 
 
An IDC research director specializing in global public safety and national security, he argues that government leadership is responsible for some of the world’s more innovative technological advancements. He uses the technology that came out of public-private cooperation and the Apollo space program as a fitting example. 
 
“We had a specific goal of getting somebody on the moon. So, what does our cybersecurity goal look like?” he said. “We know for a fact that we can’t keep everyone out. … Then what does our goal look like beyond that? We don’t have one and that’s a problem.”
 
The ever-changing cyberlandscape presents the monumental challenge of defending expansive networks 24/7 from a host of potential intruders who only have to be right once to cause damage to financial markets, public infrastructure and even military assets.
 
From Webber’s perspective, simply “building a wall” to keep the intruders out has proved to be an ineffective approach to a much larger problem, a problem that he said needs to be addressed now more than ever.
 
“What government really needs to push forward, both within government and outside of government, is a different way of thinking and looking at this. That’s to the point that we know they’re going to be in the network, we know they’re most likely already there,” he said. “So, how do we continue to operate safely knowing they are already there?”
 
In many cases, Webber said the problem isn’t so much about developing the next technology as it is about using what is already available more effectively. 
 
Bruce Schneier is a well-known cybersecurity author and fellow at the Berkman Center for Internet and Society at Harvard Law School. He agrees with the premise that the government is well positioned to spur cybersecurity innovations and believes it should.
 
“There’s a lot of room for governments to step in and solve problems the markets can’t,” he said.
 
The approach to prompting security advancement could come in many forms, like incentivizing research, setting standards and stimulating the economy, Schneier said. By his count, any of the positive ways governments drive collective action would be fitting in the realm of cybersecurity.
 
While the government may have the know-how and tools to unleash state-of-the-art innovations in cybersecurity, Schneier said the matter of the technology being classified puts a damper on its release to the mainstream for further innovation.
 
“There’s a lot of really valuable work being done in security in the U.S. government which never helps any of us because it’s secret inside the [National Security Agency],” Schneier said. “Getting that [technology] out would be of enormous value.”   
 
From Webber’s point of view, the unwillingness to fail is one of the greatest stumbling blocks in the way of government innovation on cybersecurity.
 
Rigorous testing and retesting often stall technological advancements in the public sector and make agencies slow to implement cutting-edge tools and protections.  
 
“There is no culture that allows for failure that says, ‘OK, we can innovate fast,’ because part of innovating fast is the acceptance of a certain level of failure. That doesn’t exist in government. So, to expect government to be the solution to this is just not a viable solution,” Webber said. “Government needs to be the innovation center and push people forward, but it needs to do it in a smart way.”
 
Among the solutions outlined in his report, Webber said he sees three top issues that must be considered for better national cybersecurity. The first is re-examining our definitions of cybersecurity. He likens this to the shift from “conventional warfare to unconventional warfare.” 
 
The second is the government as the innovative driver. And lastly, Webber said, is the need to outline liabilities and reporting requirements. Too often, private corporations do not report breaches to government, a practice Webber said needs to change.
 
Because of the intertwined nature of business and government, Webber said it is essential that the federal government and private industry make the effort to improve the overall cybersecurity posture of the United States.
 
The researcher said the disruption to systems that control economic markets could easily translate to problems with effectively governing the country.  
 
“It’s one thing to lose money; it’s another thing to lose the potential integrity of your government,” he said.
 
The Obama administration has made clear the importance of enhancing cybersecurity through similar strategies. In February 2015, the White House announced the creation of the Cyber Threat Intelligence Integration Center (CTIIC) under the Office of the Director of National Intelligence. The CTIIC was not created as a cyberintelligence collection point, but rather as an analytical center for the data collected at other regional centers around the country.
 
Three days later, President Obama called on private industry to participate in information-sharing on cyberthreats at the White House Summit on Cybersecurity and Consumer Protection, an event attended by the likes of Apple CEO Tim Cook.
 
"Government cannot do this alone,” Obama said at the time. “The fact is, the private sector cannot do this alone either, as government has the latest information on threats."
 
The president went on to announce his plans for the formation of Information Sharing and Analysis Organizations (ISAOs), which would be authorized by the Department of Homeland Security to share classified information across business sectors that could help thwart cyberattacks.
 
While these efforts are largely in keeping with Webber's recommendations, it is uncertain whether they will fully address the evolving cybersecurity threat in the United States.
 
Eyragon Eidam is the web editor for Government Technology magazine, after previously serving as assistant news editor and covering such topics as legislation, social media and public safety. He can be reached at eeidam@erepublic.com.