What Is the Digital Economy?

The second in our series of excerpts from Digital Prosperity, a new report from the Information Technology and Innovation Foundation. Here the authors seek to define the digital economy, noting that IT has been the key factor responsible for reversing the 20-year productivity slowdown and today is driving our robust productivity growth.


For most people, the digital economy refers to the economy conducted on the Internet, but it is much broader than that. The digital economy represents the pervasive use of IT (hardware, software, applications and telecommunications) in all aspects of the economy, including the internal operations of organizations (business, government and non-profit); transactions between organizations; and transactions between individuals (acting both as consumers and citizens) and organizations.

Just as 100 years ago the development of cheap, hardened steel enabled a host of tools to be made that drove economic growth, today information technology enables the creation of a host of tools to create, manipulate, organize, transmit, store and act on information in digital form in new ways and through new organizational forms (Cohen, Delong, Weber, and Zysman 2001).

The technologies underlying the digital economy also go far beyond the Internet and personal computers. IT is embedded in a vast array of products, and not just technology products like cell phones, GPS units, PDAs, MP3 players, and digital cameras.

IT is in everyday consumer products like washing machines, cars, and credit cards, and industrial products like computer numerically-controlled machine tools, lasers, and robots. Indeed, in 2006, 70 percent of microprocessors did not go into computers but rather went into cars, planes, HDTVs, etc., enabling their digital functionality and connectivity. Connecting these IT tools is a robust and growing wireless and wireline telecommunications network. Moreover, the technology is anything but static.

As IT keeps getting cheaper, faster, better, and easier to use, organizations find new and expanded uses for it every day, as the recent emergence of YouTube illustrates. As some keen observers of the digital economy point out, "At each point in the last 40 years the critical step in the transformation of technological potential into economic productivity has been the discovery by IT users of how to employ their ever greater and ever cheaper computing power to do the previously impossible." (Cohen, Delong, Weber, and Zysman 2001) Cataloging even one-tenth of the new applications being created today in a wide array of application areas and sectors would be a monumental task.

Why has IT become so ubiquitous and so central to growth and innovation? Certainly, a number of economic, social and political factors played critical roles, but the short answer is that IT prices have plummeted, performance has exploded, and usability has vastly improved. If just one of these had happened, the digital revolution would have been stillborn.

If prices had fallen without performance improvements, the result would be cheap but not very effective technologies. If performance had improved without price declines, IT would have proven too expensive to put into everyday devices and applications. If both happened but the technology remained hard to use, adoption rates would be significantly lower. Luckily, all three happened.

In 1965, Intel co-founder Gordon Moore observed that as transistors got smaller, the number of transistors that fit onto an integrated circuit grew exponentially. He "challenged" the semiconductor industry to continue this exponential growth, a challenge to which the industry has risen time and again. Each doubling requires innovation, capital expenditure, and risk. In practical terms, the result has been that the computing power of a chip doubles roughly every 18 months, a prediction that has held true for over 40 years.
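To give a rough sense of what an 18-month doubling period compounds to, the short Python sketch below projects relative computing power over several time spans; the multipliers are illustrative back-of-the-envelope figures, not numbers taken from the report.

```python
# Illustrative only: what an 18-month doubling period compounds to over time.
DOUBLING_PERIOD_YEARS = 1.5  # "computing power of a chip doubles every 18 months"

def relative_power(years: float) -> float:
    """Computing power relative to the starting point after a given number of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (10, 20, 40):
    print(f"After {years} years: roughly {relative_power(years):,.0f}x the starting power")
# Over 40 years the multiplier works out to roughly 100 million times.
```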

In 1978, the price of Intel's 8086 processor was $480 per million instructions per second (MIPS). By 1985, the cost of the 386 processor had fallen to $50 per MIPS. Ten years later the Pentium Pro cost just $4 per MIPS. In 2003 the Itanium 2 processor cost half that, at $2 per MIPS. We can see this trend by examining the growth in the number of transistors on Intel processors.
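For readers who want the implied rate of decline, here is a small sketch, using only the per-MIPS prices and years cited above, that works out the compound annual drop from the 8086 to the Itanium 2.

```python
# Compound annual rate of decline implied by the per-MIPS prices above.
start_year, start_price = 1978, 480.0  # Intel 8086, dollars per MIPS
end_year, end_price = 2003, 2.0        # Itanium 2, dollars per MIPS

years = end_year - start_year
annual_change = (end_price / start_price) ** (1 / years) - 1
print(f"Implied change: {annual_change:.1%} per year over {years} years")
# Works out to roughly a 20 percent price drop per year, a 240-fold fall over 25 years.
```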

This exponential progress is continuing across many core IT technologies (memory, processors, storage, sensors, displays, and communication). The real price of servers fell approximately 30 percent per year between 1996 and 2001 (Van Reenen 2005). Hard drive storage capacity has doubled every 19 months while the cost of a stored megabyte of data has fallen 50 percent per year.
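As a similarly rough sketch, the snippet below shows what a 50 percent annual decline in the cost of a stored megabyte compounds to; the rate comes from the paragraph above, while the time spans are chosen purely for illustration.

```python
# What a 50-percent-per-year decline in storage cost compounds to over time.
ANNUAL_DECLINE = 0.50  # cost per stored megabyte falls by half each year

def remaining_cost_fraction(years: int) -> float:
    """Fraction of the original cost per megabyte remaining after a given number of years."""
    return (1 - ANNUAL_DECLINE) ** years

for years in (5, 10, 20):
    print(f"After {years} years a stored megabyte costs {remaining_cost_fraction(years):.6f} of the original")
# After 20 years the cost is roughly one millionth of where it started.
```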

As a result, the cost of storing one megabyte of information fell from $5,257 in 1975 to 17