April 28, 2005 By Shane Peterson
Spafford, who also serves on the President's Information Technology Advisory Committee (PITAC) and acts as security adviser to more than a dozen federal agencies and major corporations, believes strong cyber-security policies not only benefit information assurance and trust in cyber-space, but can also act as a bulwark against terrorist actions.
But Spafford -- who chairs the U.S. Public Policy Committee of the Association for Computing Machinery, a committee that advises legislators and regulators about the impact of policy on computing technology and vice versa -- is worried about the ongoing lack of support for fighting this growing problem. He took time to speak with Government Technology's Public CIO about his concerns, the nature of cyber-security, protecting information systems against intrusion and training security professionals.
Cyber-security seems to get sufficient attention, but are there one or two less obvious threats to cyber-security that people aren't talking enough about? I can't say I have a firm handle on what all government CIOs are doing, but the kinds of things I've seen most often overlooked generally are a result of people looking outward for threats. They're worried about viruses getting in. They're worried about firewalls being breached. They're worried about intrusions coming in.
But the insider threat is often overlooked -- putting appropriate controls in place against insider abuse suffers as a result, and in particular, partitioning internal networks to contain failures and limiting access on a need-to-know basis. Those measures are well understood in some government agencies, but I'm not sure how well they're understood in all government agencies.
A second thing that comes from that same mindset is a failure to appropriately manage the physical security aspects of the enterprise. This includes not only appropriate inventory and control over computer system equipment, but also things having to do with printouts, CD-ROMs, USB disks -- other kinds of media and auxiliary devices -- to make sure they're accounted for or protected appropriately. In cases where you have control over how either programs come in or information goes out, things like USB disks and the like are being used to circumvent those controls.
That's all in the operational and physical security arena.
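The need-to-know partitioning Spafford describes can be illustrated with a minimal sketch. This is not any agency's actual access model; the zone names, roles and user assignments below are hypothetical, and it simply shows the deny-by-default stance he is arguing for:

```python
# Minimal sketch of need-to-know access control for a partitioned
# internal network. Zone names, roles and users are hypothetical.

NETWORK_ZONES = {
    "payroll": {"hr-staff"},            # roles allowed into each zone
    "case-files": {"investigators"},
    "public-web": {"hr-staff", "investigators", "contractors"},
}

USER_ROLES = {
    "alice": {"hr-staff"},
    "bob": {"investigators"},
    "carol": {"contractors"},
}

def may_access(user: str, zone: str) -> bool:
    """Grant access only when one of the user's roles is explicitly
    allowed into the zone -- deny by default."""
    allowed = NETWORK_ZONES.get(zone, set())
    return bool(USER_ROLES.get(user, set()) & allowed)

print(may_access("alice", "payroll"))   # True: hr-staff needs payroll
print(may_access("carol", "payroll"))   # False: no need-to-know
```

The point of the sketch is the default: an unknown user or an unlisted zone yields no access, so a failure in one partition does not grant reach into another.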
There is a third thing. Too often, systems that are provided for use -- because managers are sometimes overly insistent on using mass-market, COTS [commercial off-the-shelf] products at the cheapest price -- are deployed with lots of software and options enabled that aren't needed, and are actually pathways for abuse over a network or in person.
The attempt to make it easier by using a large cookie cutter leaves systems more vulnerable than they should be. Patching doesn't necessarily catch that because patches only fix flaws in what's installed; patching doesn't limit what gets installed.
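The distinction between patching and limiting installations can be sketched as a simple baseline audit. The package names and the baseline below are hypothetical; in practice the installed list would come from the platform's package manager:

```python
# Minimal sketch of auditing a deployed image against an approved
# software baseline. All package names here are hypothetical.

APPROVED_BASELINE = {"os-core", "web-browser", "office-suite", "av-client"}

def unneeded_software(installed: set) -> set:
    """Anything installed beyond the baseline is a candidate for
    removal -- patching alone never shrinks this set."""
    return installed - APPROVED_BASELINE

installed = {"os-core", "web-browser", "office-suite",
             "av-client", "ftp-server", "remote-admin-tool"}
print(sorted(unneeded_software(installed)))
# ['ftp-server', 'remote-admin-tool']
```

Even if every package on the machine is fully patched, the two extras flagged here remain pathways for abuse until they are removed or disabled.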
You've spoken about an overemphasis on standards and how standards can breed cyber-attacks because what you call a monoculture is created. One position that often appears in media coverage of cyber-security is the argument that Linux and other types of operating systems are needed to avoid that monoculture. The problem is a little more complex than that. It's not that I'm opposed to standards, per se, but the kind of standards and the reason they're adopted are important. For instance, building Web pages to be accessible by any standards-compliant browser is a good thing because if you build Web pages that can only be accessed with Internet Explorer, you're forcing people to use a very buggy piece of software. Standards there are a good thing.
Standardizing on a particular vendor or a particular software platform simply because it's currently the cheapest or other such reasons locks you into a cycle -- a cycle that doesn't necessarily provide the appropriate incentives for that vendor to improve or offer better products. We have to be careful about what we talk about when we talk about "standardizing."
I don't think Linux is the answer for security -- for a number of reasons -- but it does provide an alternative in some environments, and in fact, it may be a more appropriate choice for some kinds of solutions. But the choice is not between those two systems. There are other choices.
You've spoken publicly about how people trust the private sector more than government when it comes to safeguarding personal information. The scenario you just described and the mindsets you spoke of earlier -- are those the primary issues government needs to address to regain some of the trust it has lost? Or is it a different exercise to recover that trust? To say to constituents, "We're as good as your bank as far as securing your information." There's more to it. Simply securing one's systems is not sufficient to engender trust. There has to be more proactive behavior -- a willingness to take extra care and sometimes even spend extra funds to build and run systems that respect privacy, even if that means [they are] a little less efficient than they might otherwise be because maybe you have some additional masking of information.
There have to be well-identified, public procedures for people to ask questions and raise complaints that are actually listened to, rather than an attitude of, "We're the government. We know better." Some individuals in some agencies certainly come across that way. Trust is a complex series of things that requires working together.
We want it to be not simply a case of "I'm not worried about you accidentally hurting me." I would actually rather be in the mindset of, "You're actively working to help me." That's a difference in attitude.
Can you elaborate on that? It's a very subtle distinction, and I'm wondering where the line is. It's going to be different for any organization or agency. But from people I've talked to, the kind of focus you hear out of government agencies is collecting lots of information without people necessarily understanding how that information is used or stored, until they hear about it being leaked.
Government organizations that focus on anti-terrorism, which is security-related, rather than on primary mission areas [that] carry out programs, enhance the quality of life or improve the efficiency of government -- there's a real difference there.
Is it reactive that we're worried about hackers and viruses and terrorists? Or is it that we're building an enterprise that carries out our mission, and it's strong enough that we don't have to worry about those things because we made the right choices? That's where CIOs need to be. That's where the government infrastructure, and the private-sector infrastructure, needs to be. It's, "I have confidence in the controls I've put in place. I have confidence in the quality of goods and services I've employed."
So if there's a new threat, it's not going to panic me into changing direction. That's not overt, but a lot of people are worried because here we've had some incidents and everybody is running around scared. That doesn't breed confidence that good choices have been made all along.
That seems to lead to your observations that some lack of confidence stems from a confused or misdirected federal policy. You've noted that the federal government seems to be overly worried about physical terrorism, but not cyber-crime. As a result, there's no clear or cohesive dictum coming from on high, such as, "Here's what we need to do to fight computer intrusions and related cyber-crime at a federal, state and local level." Right. I would also say if we took good, proactive measures against any kind of cyber-crime, fraud or identity theft, that would catch any terrorist actions as well. If your systems are protected against garden-variety crooks, then you're also going to protect against the bigger threats. You're going to find them.
Really, what we're doing with terrorism is saying, "Those al Qaeda foreigners," and everybody else is getting by. That's the wrong attitude, particularly in the cyber-world, where you can't even tell who's at the other end. Organized crime, disorganized crime, vandalism, spur-of-the-moment fraud -- all of these are big problems, and every time they occur, they eat away at people's confidence in the systems.
We could be focused on the fundamental issue of protecting that information, protecting those systems.
From your perspective, is the lack of cohesion on fighting cyber-crime because it is nebulous and hard to define for the common person? If you say "9/11" or "the twin towers," those engender a lot of reaction in people -- concrete reaction. But when you say cyber-crime, do people's eyes glaze over? No, actually a lot of the people I've talked to don't understand it fully, but they're worried about it. They're worried about identity theft. They know about that, and more people are finding out about it because it's such a rapidly growing crime. When we start having bigger incidents like ChoicePoint and the SAIC theft, people see that and begin to get a little more worried.
It doesn't have the same tangible effects, and maybe that's why. Currently we don't have the tools, technology and training in place to do as good a job on this as we'd like. There will be a report coming out sometime in the next couple of weeks from the President's Information Technology Advisory Committee on the terrible shortfall in the government's funding of basic research in this area. Almost all of the efforts have gone into near-term and primarily military research.
For instance, the DHS has a more than $1 billion budget for research and development for homeland security, but only $18 million of that is for cyber-security. The National Institute of Justice has only $7 million a year to invest in cyber-crime research for better tools and prosecution technology.
There's a terrible shortfall in research funding to support development of better products and approaches. In particular, the majority of federal funding is going into services and products, such as antivirus and firewall technologies, instead of into research. This is not advancing us beyond the current platforms that require those add-on products to be used with a modicum of safety.
Is it fair to say Congress doesn't understand what they need to do, or is that not the right tack to take? A lot of things don't have an easily identified voter bloc. It does require a bit more understanding to see the threat, and it's occurring when there are many hard choices to be made because of economic troubles.
I don't mean to make light of the threat from terrorism. There are significant threats there, though in the cyber-realm, I don't see them as being as significant as those posed by chemical, biological or other means. Those areas need funding. We also have many of our men and women in the armed forces engaged in harm's way, and that certainly takes a lot of attention.
All of those things make it difficult to understand the subtleties of cyber-security, and most people believe the security problems we have are just a matter of fixing a few bugs and getting a better virus-detection program in place, and if only we could apply patches in time, everything would be great.
That attitude has been cultivated by many software vendors and others in the community who either don't understand the real problem or stand to benefit from that view. Security is not an add-on. It's not an after-the-fact kind of thing. If we really want to have more secure systems, they need to be designed from the beginning in that way.
I presume you saw the stories from late February covering Singapore's decision to spend $23 million over three years to create a National Cyber-Threat Monitoring Center responsible for constant detection and analysis of computer virus threats. Singapore is clearly one of the most Internet-connected countries in the world, and the structure of Singapore's government allows it to make a unilateral decision for the entire country, rather than the federated nature of the U.S. government, which doesn't allow for such mandates. What would it take for the United States to build something like this? Could the United States build something like this? Well, supposedly we have the US-CERT [United States Computer Emergency Readiness Team] and the folks at DHS, but you've identified one of the major differences -- the nature of the government/private enterprise relationship is very different in Singapore than in the United States.
We have active resistance by many large players in the ISP and computing space to any kind of government interference or oversight -- very, very strong. That means it would be very unlikely we would be able to do something like Singapore has done.
Does that active resistance come from the players themselves? Or is it an extension of the Big Brother-type fear the U.S. population might have if President Bush came out and said, "We're going to build a cyber-threat monitoring center. We're going to watch everything that goes across the wires in this country." We don't have the technological capability to do that. Yes indeed, that would certainly get people upset if it were phrased that way. I'm surprised there hasn't been more outcry about the Real ID Act -- the one that mandated that there be a national standard on drivers' licenses. Furthermore, the bill basically states that the database will be made available to Canada and Mexico.
Having a database of everybody in the United States with all their personal information and biometrics made available to the authorities in Mexico and Canada doesn't make me feel very good. Every time I fly, the absolutely ridiculous things TSA personnel do at airports don't fill me with confidence that the people setting the policies know what they're doing.
U.S. companies fear, and perhaps with some real reason, that new government oversight or regulations will be both misinformed and economically damaging. The general public has varying levels of fear of invasion of privacy depending on how threatened they feel currently.
If you had to put a ranking on the efforts of the DHS and US-CERT to fight cyber-crime, on a scale of one to 10, would those efforts come in at a five? If I were to be assigning a letter grade to what they have accomplished so far, it would be no higher than a "D."
Wow. We've talked about some factors that go into that. It's not simply a matter of them not wanting to do their job. Part of it is that the job isn't well defined. They don't have the authority to do some things that need to be done, or the cooperation. I'm not sure which is most important.
We don't have the manpower with appropriate training. We don't have the tools. We don't have the international agreements in place to get cooperation overseas from many of the parties we need. Domestically we don't have the attitude that computer crime should be reported, so a lot of companies let it go by the boards, or they try to handle it internally.
There's a whole range of issues here, and they're not all attributable to those central organizations.
There's been a lot of talk that the United States needs more emphasis on graduate degrees in IT security, and we need to encourage more course offerings by schools and more interest among students in going down such a study path. What's your take on how to make that happen? Can it happen? It can happen. It's going to be a little difficult to bootstrap because we don't have that many well trained educators in the field already. Typically, to breed a new round of educators, you need them to get appropriate experience and education, and there have been limited opportunities for that.
We have a lot of people who teach elements of information security but don't really understand the field. At many, many schools, you can learn about elements of cryptography, but there aren't many places that understand how it's used in the real world and what the implementation difficulties are -- and very few places indeed where you can go and not only learn about how to set up a firewall along with the cryptography, but also about the threats to physical and personnel security that go with it.
There's a difficulty there in that we don't have the infrastructure to turn out the number of people we need with the right training. We don't have the support for those efforts.
The center that I run is currently the largest in the United States. I've never been able to find consistent federal support of any kind for running the center. There is no program, because unless I'm committed to basic research on a particular problem, they will fund that problem but not the center. That makes it difficult to build any kind of environment.
This year, we're going to graduate probably about 20 percent of the Ph.D.s in the field in the United States, but that's only about 15 graduate students. That gives you an idea of the magnitude of difficulty.
It is a real problem. There are other parts of it too, such as, where are these programs going to fit in the academy? Information security is a lot more than simply computers. For many institutions, it's difficult to know where to classify these programs.
This is another problem with why the field is kind of small. There aren't many institutions that have provided a nurturing environment to build this kind of enterprise. We've been very fortunate here.
We definitely need to put more effort into this, and another aspect I'll add to this is that the Internet is global. There are no borders, and everybody has a homeland. This is an area where we need a global effort, a global awareness.
The attitude that has sprung up in many government agencies is that this is too sensitive. We can't let anybody from any other country learn anything about it. They classify everything, and they make it difficult for students from other countries to get in, but we're not getting the U.S. enrollment that we should be.
This doesn't help the world situation. Most of the organized crime for extortion is coming into the United States from China, Chechnya, the former Yugoslavia, Nigeria and other places from around the world.
We need to have well trained, cooperative individuals in many places around the world if we're going to make any headway against the problem.