In IT Doesn't Matter, published in the May 2003 issue of the Harvard Business Review, Nicholas G. Carr examines the evolution of information technology in business, arguing that IT follows a pattern strikingly similar to that of earlier technologies such as railroads and electric power.

For a brief period, as they were being built into the infrastructure of commerce, these "infrastructural technologies," as he calls them, opened opportunities for forward-looking companies to gain strong competitive advantages. But as their availability increased, their cost declined. As they became ubiquitous, they evolved from profit-boosting proprietary resources to simply the cost of doing business. From a strategic perspective, they became virtually invisible -- they no longer mattered.

Carr, a former executive editor of the Review, created a storm of debate and controversy with the article, and expanded his analysis in his book, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage, published earlier this year.

In the book, he challenges many of business's most deeply held assumptions about IT's role in achieving strategic success. In discussing this work with Government Technology's Public CIO, Carr offers a fresh view of the real challenges facing IT executives in government.

Q: In your book Does IT Matter?, you make an essential point: The assumption that IT's strategic importance increases as its power and ubiquity increase is a mistake. Rather, what makes a business resource truly strategic, and what gives sustained competitive advantage, is not ubiquity but scarcity. Can you explain this more fully?

A: As you know, there has been an enormous amount written and said about computer systems and information technology. But I felt that most of what's been written and said looks at the broad effects of IT -- how it changes entire industries, the way it changes common business processes, and how it affects industrial productivity in general.

But there hadn't been much written about its effect on individual companies or organizations and their ability to distinguish themselves from competitors. It is only by doing something different, something valuable to customers, that companies can gain a competitive advantage and, in turn, superior profitability.

As I explored the question and looked back in history at other examples of broadly adopted technologies, I noticed a pattern. As technologies mature, and get cheaper and more standardized, they lose their power to distinguish one organization from others. Thus, they lose their strategic importance. This is what I think has been going on with information technology.

Q: And the key point here that some of your detractors perhaps miss is the difference between the effect for an individual company and the broader social effect. When you look at these earlier technologies, they reshaped society. But it was a broad reshaping. So what you are addressing here is not that the Internet or electricity are not broad technological forces of change, but rather that the advantage for the individual company adopting the technology becomes less and less significant.

A: Right. In fact, I think technologies only have their greatest broad impact on economies and societies when they lose their strategic importance, their ability to differentiate one company from the others. It is only then that they become part of the general infrastructure businesses and people use. During the times when technologies are immature and companies can use them in different ways, they have yet to become part of the general infrastructure. Thus their effect on productivity or their usefulness as a platform for new consumer products and services is limited. It is only when you move toward a more standardized infrastructure that you get your biggest broad effects.

Q: In the book, and you've also mentioned it here, you describe how the pattern for the adoption of other technologies, such as electricity, is very similar to the pattern of adoption for IT. Can you elaborate on that?

A: I call these technologies infrastructural technologies because they tend to become part of the infrastructure. Although they are different from each other -- railroads are completely different from electric power, and both of those are very different from computer systems -- they show a similar pattern in the way they are adopted and used by business.

Look at electricity, for instance. When electrification first started, companies that wanted to tap into that power source essentially had to build their own generators next to their plants and create their own power. But within a couple of decades, power had become a utility supplied by outside third parties over a common infrastructure. You see a similar thing happening with information technology. It used to be that companies built their own individual IT architectures that were maintained as little islands, whether it was a mainframe system or your own local area network hooking up your minicomputers. Now, increasingly, IT capabilities are provided like utilities. In effect, companies tend to go to vendors to purchase their IT infrastructures. More and more, IT is supplied over what is essentially a public network -- the Internet. This is a natural process. It is not a bad process, but it means your ability to do things differently goes away. You trade away the ability to distinguish yourself in order to get the technology more reliably and more cheaply through a common infrastructure.

Q: So the process of standardization in the IT industry, including standards for enterprise architecture, is an essential part of the pattern you are talking about?

A: Right. I think if you look back over the history of IT, it goes from proprietary systems and architectures created, designed and maintained by individual companies steadily toward more and more open systems with broader networks. In the beginning, you had your stand-alone mainframes. That moved steadily toward local area networks, originally held within corporate walls; to wide area networks, again maintained by individual companies; and finally to Internet-based systems that require the breaking down of walls between individual company systems so all organizations can share information, applications and so on. But that process requires ever-increasing levels of standardization and homogenization of IT components -- whether it is hardware or software. By definition, standardization erases differences. So companies increasingly move toward more uniform IT capabilities.

Q: It is easy to misread the point you are making. You also suggest in the book that although infrastructural technologies lose much of their power to provide competitive advantage as they mature, they don't lose their power to destroy it. In other words, the fact that you are not gaining competitive advantage does not mean you can turn away from IT or get lax about adopting and using it intelligently.

A: Right. That statement actually means two things. First, that this very process of standardization can destroy traditional advantages companies held. For instance, suppose your organization was superior at carrying out some particular kind of transaction. As soon as that transaction is automated, all companies begin to adopt the same IT platforms.

Your ability to carry out that transaction better than your competitors will erode because you are all moving toward the same software, or whatever it is that automates that transaction. So on the one hand, the introduction of a common IT infrastructure can destroy old competitive advantages.

The second meaning is that if you are bad at managing this important infrastructure, this can put you at a competitive disadvantage. If you spend more money than your competitors on the IT infrastructure, if your failure rate on IT projects is higher or if you have poor security, the vulnerabilities are still there even if the opportunities fade away.

Q: In talking about the benefits, you quote Michael Porter's article, Strategy and the Internet, which essentially says the benefits provided by the Internet are more difficult for companies to capture as profits. At the same time, you talk about the benefits transferring to the customer in many cases. So the view that the Internet is empowering the user is completely compatible with your view of IT and competitive advantage for a single business.

A: Yes. Much of the discussion about IT over the last 20 years has essentially looked at whether it improves industrial productivity. Even though there is still quite a bit of mystery about how exactly IT influences productivity, it is pretty clear now that it generally does. But there is another question: What happens to those productivity gains? And I think a lot of companies have assumed they will fall to their bottom line. They think, "We'll lower costs, and that will increase our profitability." But productivity gains don't necessarily fall to an individual company's bottom line. If they are broadly shared through an industry, they end up in the pockets of consumers, not in greater profitability, because they are competed away. And there is nothing wrong with that. An economist would say that's great. That's what you want to happen -- the benefits end up with consumers rather than companies. But if you are a company and you have assumed this is going to be your route to competitive advantage and higher profitability, you may be in trouble.

Q: So in terms of a government sphere and the empowerment of citizens, what you are saying here does not negate that at all. Ultimately governments may need to redefine their relationship to citizens because IT can give citizens more access to more information.

A: Right. In some ways, governments are approaching this from a different perspective than companies because the rules of competitive advantage and superior profitability don't apply directly to the government space. So pure gains in productivity -- being more efficient and serving the needs of citizens -- are an end in themselves in a way they aren't in business.

Q: You also talk about the CIO's changing role. For instance, you describe how many companies in the early adoption of electricity had vice presidents of electricity, and as electricity became ubiquitous and easily available, these jobs disappeared. Does a similar fate await the CIO?

A: It may well be the ultimate fate. I think we are a long way from that, but steadily moving toward it. What happens is the new technology becomes so much a part of the general infrastructure that what used to require a technical specialist -- a CIO -- no longer does. His or her functions begin to break apart and move into more traditional parts of the business or organization. So IT becomes an essential component of marketing, finance or logistics, just as electricity is -- but you don't need a separate organization to run it. Now as we know, we are quite a long way from that point. Organizations still have enormous trouble using this resource efficiently. But in some ways, you could say the goal of the CIO should be to make himself or herself obsolete, to make the IT infrastructure so stable, so efficient, so taken for granted that you no longer need the CIO position. You could say that is when the CIO has achieved the ultimate success.

Q: And given that, the priorities start shifting perhaps. You quote Paul Strassmann: "From now on, it's economics -- and the role of the CIO is to make money. Technology has to be taken for granted." Doesn't this, in one sense, make the CIO's role more strategic?

A: Just the reverse. The role of the CIO or the IT organization actually becomes less strategic. So in other words, you are not looking to IT to provide a competitive advantage. IT is essential. It is going to be a component of virtually all elements of your operation. So just like electricity or the telephone, you couldn't get competitive advantage without it. But it is not the basis of that advantage or your strategic position because competitors can copy the technology very quickly. So your barrier to competition doesn't lie in the technology. As I see it, that means we are moving from the point where you want a CIO who is kind of your strategic visionary about the future of technology, to a person who is a very good manager and can get you the IT capabilities you require at the lowest possible cost with the lowest possible risk.

Q: In mentioning risk, you also talk about when to innovate, and basically you advise doing so only when the risk is low.

A: Right. There are some dangers that go along with being a technological innovator. This is particularly true of IT.

One is that you are always going to spend more because the cost of any kind of computing capability goes down so quickly. So if you are out on the cutting edge, you are going to pay a heck of a lot more for any given level of functionality than the organizations behind you. And of course, it is always riskier to be out on the edge because you don't know where standards are going. You don't know which technologies are going to take hold and which are going to become obsolete. So you pay more and you take on more risk to be an innovator.

That can pay off if there is a reasonable chance you are going to get a fairly decent and durable competitive advantage. But if we've moved into a stage where any kind of advantage vanishes rapidly -- because innovation is rapidly diffused throughout an industrial or government sector -- then the argument for being on the cutting edge gets weaker and weaker. More and more companies are going to peel back from the cutting edge and follow rather than lead. There is nothing wrong with that. Organizations have to be careful in choosing where they innovate. In a lot of areas, they don't want to be innovators. They want to keep up with the competition, but there is no strategic reason to be out ahead.

Q: Especially in government, there is a built-in caution, which is well justified when you aren't trying to get a competitive advantage in the first place.

A: Right. I would call these trends a good thing for government users and buyers of technology. As this commoditization trend continues, one thing you get is a shift in market power from the vendor or supplier of the technology to the buyer and user of technology. If you don't need to be on the cutting edge, you can back up. You get many more options, whether it is cheap commoditized hardware components, cheaper software or open source software. You get more choices. You get more power. You get more leverage over vendors. So there are big opportunities today for companies or governmental organizations to spend less on IT, but get more.

Q: As part of that, you suggest that CIOs and senior IT executives must lead the way in adopting a new sense of realism about the strengths and limitations of IT. If you have a better understanding of where all this is going and of the trends you're talking about, then can purchasing and deployment decisions be made more safely and reliably?

A: Yes. But to get there -- and this is one of the toughest challenges they face -- they need to temper their drive to be creative and to be innovators. It is a natural desire for any organization to be an innovator, to be creative. Yet we have moved to a time when trying to do something distinctive, something different, something proprietary is often counterproductive for IT. It increases your risk. It increases the risk that you will be isolated from the general infrastructure, from open systems. More and more, organizations want to move toward more generic, more commonplace, more standardized solutions and avoid the customized, tailored route. That is going to be a tough challenge because it is a natural human desire to do something new and unique.

Q: At the same time, the general role of innovation in society as a force that moves the society forward is pretty well understood.

A: Certainly. I'm not arguing that innovation in IT is coming to an end. I'm just arguing that it is moving from the user to the vendor. I mean the big IT suppliers are competing like crazy to get even a little edge over their competitors. They are continuing to innovate at a rapid pace. Users are going to get the benefits of that.

Q: IT innovation, at least for the last 20 years, has mainly fallen to the private sector. But as IT innovation becomes less important for companies, does that open the door for government to become more of the innovator than the private sector?

A: I don't think so. In some ways, I think it is a cost issue. Obviously government agencies and governments generally struggle even more than companies with cost constraints. On the other hand, that can lead to some kinds of innovation, especially innovation focused on driving down costs by using new potential solutions. So you could argue that governments are by necessity leading the way in the exploration of open source operating systems and applications, because they can't afford the license fees of some of the big software suppliers. And even more than in the U.S., you see this in the public sector of developing countries, where they have no choice but to look for cheaper solutions. They are pioneering some open source and other technologies that may become more commonplace throughout government and industry. So at that level, you might argue they are innovators. But that is more being creative to control costs than being creative to push the envelope of functionality, power or capability.

Q: Would it be reasonable to expect open source to proliferate more and more?

A: I think it will continue to grow in importance as companies look for cheaper solutions that fulfill their needs. Open source is just another example of how the options of the users of technology are growing.

Q: What else might government executives gain from this vantage point on technology?

A: I already talked about this to some degree. In the way I'm talking about it, governments don't have to worry about competitive advantage -- although you could argue that different localities, or different states or countries, are competing for business and citizens. In general, though, they don't have the profit motive companies have. So governments are in a position where this commoditization trend can be a very good thing. They can capitalize on commoditization to drive down the cost of the IT projects they want to do, to create more efficiency in the way citizens interact with government. But as I said, that means moving to a mindset of embracing standardized, homogenized technology, rather than trying to do something customized and unique. The way I see it, that's the challenge for IT organizations in government -- to move from a "we need to be innovators" mindset to being creative managers of the resource, not necessarily trying to be out ahead and do proprietary, unique things, because that is going to backfire more often than not.

Q: I'm speculating here, but if open source is here to stay, perhaps there is something for governments to learn from this. After all, government is supposed to involve citizens as participants at some level.

A: You are speculating there. One thing we haven't talked about, though -- and I'm not an expert on it -- is [the digital divide], which governments have to keep in mind because a lot of what they are doing involves pushing services over the Internet. We have to remember that still less than half of the households in the U.S. have any sort of fast, broadband Internet connection. So one of the challenges for government is to push ahead technologically, but not leave behind a large part of the citizenry who can't tap into that capability.

This is one point in particular that governments always have to keep in mind: technology races ahead faster than average citizens feel the need -- or have the economic resources -- to use it. There are still a heck of a lot of people using dial-up modems, and not even 56K dial-up modems, but older versions. Sometimes that's economics, and sometimes it is simply that what they have is good enough.

Q: That's another key point you address concerning technology, and particularly IT because it is moving so fast -- the capabilities start outstripping what businesses actually need, or what individuals or citizens need.

A: Right. There has been an assumption about IT, particularly from IT businesses but also from IT organizations, that if we build it, they will use it.

But in fact, as with all technologies, you reach a point where the power and the capability gets out ahead of what users need. They are not necessarily going to follow you to that next power level. They are not going to upgrade to the next chip, or go from version seven to version eight of your software. You always have to keep in mind that you can get out ahead of customers' or citizens' needs when it comes to technology.

Blake Harris  |  Contributing Editor