When the dot-com boom went bust, Silicon Valley wasn't the only high-tech community to suffer. Back east, the Route 128 corridor outside Boston went through a similar slump.

But just as there were survivors in Sunnyvale, Cupertino and San Francisco, a number of Internet-based firms in Maynard, Cambridge and Boston pulled through as well. One of them is Akamai Technologies Inc., once known as a content delivery provider for Web-based firms. Today Akamai is doing something different with its more than 14,000 servers deployed across some 1,100 networks around the world.

In addition to content delivery, Akamai provides its customers, which include major government agencies and universities, with network infrastructure and processing power on demand. Customers use the infrastructure as needed, paying for what they use when they use it, and nothing more. Kieran Taylor, Akamai's director of product management, described it as having "apps on tap."

Akamai is one of a growing number of IT companies investing in new products and services some believe will be the next big wave in the IT world. The idea is that computer companies will operate like power plants, supplying processing power and applications to customers when they need them, much as utilities supply electricity, and customers will pay only for what they use. The concept borrows from the grid model: hundreds, even thousands, of servers provide processing and storage, and serve up large applications. The work is distributed to many computers for processing, then the results are reassembled and delivered to the customer.
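In rough terms, that scatter-gather-and-meter pattern looks something like the sketch below. It is a minimal illustration only: the function names, worker count and metered rate are invented for this example and do not describe Akamai's or any other vendor's actual service.

```python
# Minimal sketch of the grid idea described above: work is split into small
# pieces, fanned out across many workers, reassembled, and then metered so
# the customer pays only for what was consumed. All names and rates here are
# illustrative, not any vendor's API or pricing.
from concurrent.futures import ProcessPoolExecutor

RATE_PER_UNIT = 0.0001  # hypothetical price per unit of work processed


def process_chunk(chunk):
    """Stand-in for one small, independent piece of a larger job."""
    return sum(x * x for x in chunk)


def run_on_grid(data, chunk_size=1000, workers=8):
    # Break the job into small, manageable parts.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(process_chunk, chunks))  # scatter
    result = sum(partials)                                # gather / reassemble
    # Meter the work actually performed rather than billing for idle capacity.
    charge = len(chunks) * RATE_PER_UNIT
    return result, charge


if __name__ == "__main__":
    total, charge = run_on_grid(list(range(100_000)))
    print(f"result={total}, charge=${charge:.4f}")
```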

It's heady stuff, even when compared with the go-go dot-com years that promised so much before imploding. But this is not another crazy idea from college grads. IBM said it will invest nearly $10 billion -- $800 million in 2003 -- in utility computing. Larry Ellison, CEO of Oracle, called utility computing the new alternative to the 40-year-old big-server approach to technology and refashioned his firm's signature database product to take advantage of it. Forrester Research called it the third major computing revolution, after mainframes and the Internet.

But dig beneath the buzzwords and the blizzard of press releases on the subject this year, and you'll be hard-pressed to find customers in the private or public sector who pay for their computing needs using the software equivalent of a utility meter. "A lot of what has to do with utility computing is still a concept, a vision," said Michael Isaac, senior program director for Saugatuck Technology, a Connecticut-based consulting firm. "The technology is not completely there."

Infrastructure with Power

The concept of turning on data and applications like an electrical switch has been around for at least 10 years, but the current broad shift by vendors toward utility computing rests on a more pervasive change taking place in the computing industry.

Steady advances in microchips, memory, storage and bandwidth, along with the ever-growing Internet -- now surpassing 1 billion users worldwide -- and an industry that has shifted from proprietary to common standards, mean technology matters less and the people and businesses who use it matter more, according to Irving Wladawsky-Berger, IBM's general manager for e-business on demand.

"When you put all these advances together, you have a far more powerful information infrastructure available to everybody to do their work," he said. "Organizations such as government no longer have to adapt themselves to the rigidity of technology," he added. "There is no reason you can't bring technology to support business in a flexible manner today."

Wladawsky-Berger points to the Internet as the premier example of how technology has shifted from a rigid, proprietary-based technology to one that is flexible, running on industry standards and low-cost computing parts. "You can go wherever the hell you want on the Internet, and the technology takes you there," he said. "If you change your mind, it will go where you want to go. Ten years ago, you couldn't do that."

That flexibility, Wladawsky-Berger said, is leading businesses and governments to insist on more flexible ways to pay for computing. Until now, computing has required huge capital investments in networks, servers and application software, not all of which are used in the most efficient manner.

Numerous industry reports show organizations use, on average, as little as 8 percent to 20 percent of their servers' capacity, and often no more than 35 percent of their data centers' overall capacity. The same is true with software licenses. Organizations make enormous expenditures for hundreds or thousands of seats to run an enterprise application, but may use only a fraction of them at any given time.

Many analysts believe the days of asset-based IT, where organizations pour large amounts of capital into their own server farms and networks, are over. Instead, it's about services for the customer, where organizations can quickly scale infrastructure up or down for a particular initiative, with the risk and complexity of computing shifting to the utilities.

In addition to simplifying computing for the customer, organizations and businesses are looking for a new way to pay for IT, one with lower operating costs. "Customers want to change the IT cost structure depending on their requirements," said Tom Sadtler, vice president for solutions and portfolio management at HP. "They want to pay as they go."

Vendors Embrace the Grid

"HP invented utility computing," according to Sadtler, who pointed out that the former computing giant Digital Equipment Corp. first used the concept prior to its acquisition by Compaq, which was later acquired by HP. In fact, Sadtler partly credits utility computing as one reason HP and Compaq merged so smoothly and consolidated redundant systems.

HP's utility strategy -- Adaptive Enterprise -- is earning the company $1 billion in revenue annually through solutions such as utility-based data centers and on-demand messaging services.

IBM's strategy, called e-business on demand, centers around the ability of its huge Business Consulting Services practice to help organizations set up utility computing solutions.

Sun Microsystems calls its initiative N1, centering it around better utilization of servers, storage systems and networks.

Database software giant Oracle has also begun marketing itself as a utility solutions provider. The firm, which announced its database program will be grid-enabled by the end of the year, believes the utility concept is about to take off, in part due to the advent of low-cost computers known as blades. Grid computing is about capacity on demand, Ellison explained during the unveiling of Oracle 10g. "Plug another server into the grid and the application runs faster and more reliably, and the capacity is inexpensive," he said.

Vendors are selling simplicity and lower costs. Computing has become so complex in recent years that 75 percent of an organization's IT operating costs come from staffing, consulting and maintenance, according to BusinessWeek magazine. Yet as noted earlier, most servers are vastly underutilized. Vendors say a utility computing solution -- whether outsourced to a vendor like IBM or HP, or done in-house with the help of an IT firm, such as Sun Microsystems -- can pump up utilization from less than 20 percent to 50 percent or even 80 percent.

Powering Up the Public Sector

Former Georgia CIO Larry Singer once noted that government agencies prefer to have their own generators in their basements, meaning they want control over the servers that run their applications. Part of that philosophy is a turf issue, but it's also a funding issue. Public-sector programs have been, and remain, vertically funded.

But with unrelenting pressure on government budgets in recent years, especially at the state level, CIOs must look at every cost-cutting option without cutting service or value. That's why they need to look closely at the benefits of utility computing, said Todd Ramsey, general manager for global government industry at IBM.

For example, Ramsey pointed out that state revenue agencies are hit by computing demands for processing, networking and storage during the tax season, only to see demand plummet during the rest of the year. In recent years, many states have poured millions of dollars into building up computing assets to handle automated processing of tax returns, while that capacity sits idle for much of the year. That's not a good matching of investments with benefits, he explained.
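To make that mismatch concrete, a back-of-the-envelope comparison might look like the sketch below. Every figure in it is invented for illustration; none of the numbers come from IBM or any state revenue agency.

```python
# Owning peak capacity year-round versus paying a utility only for what is
# used each month. All figures are hypothetical, chosen only to illustrate
# the seasonal tax-processing example above.
OWNED_SERVERS = 200
COST_PER_SERVER_PER_MONTH = 1_000        # amortized hardware, power, staff

UTILITY_RATE_PER_SERVER_MONTH = 1_500    # the utility charges a premium per unit
PEAK_MONTHS = 3                          # tax season
OFF_PEAK_SERVERS = 20                    # baseline demand the rest of the year

owned_cost = OWNED_SERVERS * COST_PER_SERVER_PER_MONTH * 12

utility_cost = (OWNED_SERVERS * UTILITY_RATE_PER_SERVER_MONTH * PEAK_MONTHS
                + OFF_PEAK_SERVERS * UTILITY_RATE_PER_SERVER_MONTH * (12 - PEAK_MONTHS))

print(f"own peak capacity all year: ${owned_cost:,}")   # $2,400,000
print(f"pay-per-use utility model:  ${utility_cost:,}")  # $1,170,000
```

Even at a higher per-unit rate, paying only during the peak months can come out well ahead of owning idle capacity, which is the trade-off Ramsey describes.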

Because so much taxpayer money is tied up in procuring hard assets, public-sector CIOs lack additional resources to carry out a full business transformation of government services. The result is less value from tax dollars. "Our strategy at IBM is to tie business process transformation with IT infrastructure optimization," he explained.

The optimization part, according to Ramsey, involves outsourcing IT infrastructure to a utility, such as IBM. But Ramsey admits coupling operational transformation in government with infrastructure outsourcing isn't easy. "You need strong leadership to make it work," he said.

Besides tax processing, Ramsey believes utility computing will most benefit the public sector in the areas of social services, motor vehicle licensing and worker training. Already the U.S. Army has contracted with IBM to outsource a distance learning service for soldiers. E-ArmyU.com is a Web-enabled online university that provides soldiers with a range of higher education courses. IBM hosts the hardware infrastructure for E-ArmyU in a utility environment.

The Army's early use of utility computing doesn't surprise computing experts who work in the public market. "CIOs in particular federal agencies are working aggressively to consolidate their infrastructures," said Tim Hoechst, senior vice president of technology for Oracle. "As they do so, they are looking for an architecture that allows them to go with fewer larger systems that are scalable, secure, highly available and cost less." A utility computing architecture would meet all those goals, he said.

It's possible, however, utility computing will initially benefit smaller agencies and medium-sized city and county governments. "You have to remember traditional outsourcing has largely bypassed this sector of the market, due to its emphasis on customized -- and costly -- service plans," said Saugatuck Technology's Isaac. "Many small governments and agencies have not been able to afford outsourcing. Utility computing is less reliant on customization, so it will have a lower entry point in terms of cost, making it a good fit for small governments and agencies."

Standardizing the Utility

Isaac also believes early adopters will be agencies and governments looking for benefits with horizontal applications and processes, such as finance, human resources and procurement.

There are a number of uncertainties that still make utility computing risky from the buyer's perspective. "The variable pricing model of utility computing looks attractive on the one hand when compared to fixed pricing," he said. "But it's also unpredictable."

A number of experts say other uncertainties about utility computing leave nagging questions about its practicality in the short term. "Utility computing will require technology advances in many areas: grid computing, workload management and storage virtualization," the Financial Times recently reported.

The grid form of utility computing uses hundreds or thousands of computers to break down an application into small, manageable parts for quicker processing. So far, software programs that allow a distributed network of PCs and servers to act as one supercomputer are designed primarily to handle massive number-crunching operations. That's great for scientific research projects, but not for financial applications -- at least not yet. Right now, it's too hard for systems to parcel out and manage computing workloads in response to shifting demand.

Isaac said few applications for utility computing are written from scratch for sharing among multiple customers. For example, application service providers (ASPs) allow customers to share applications over the Web, but customers still have to pay for a software license to use the application, which is hosted on the ASP's computers. "That's not utility computing," said Isaac. The software must be shared on a metered basis to be a true utility service.
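A toy metering sketch makes the distinction concrete. The class, customer names and rate below are hypothetical, not any provider's billing system; the point is simply that charges accrue from recorded use rather than from an up-front, per-seat license.

```python
# A minimal usage meter illustrating "shared on a metered basis": each
# customer is billed only for the hours of the shared application it
# actually consumed. Names and rates are invented for this example.
from collections import defaultdict


class UsageMeter:
    """Records how much of a shared application each customer consumes."""

    def __init__(self, rate_per_hour: float):
        self.rate_per_hour = rate_per_hour
        self.hours = defaultdict(float)

    def record(self, customer: str, hours: float) -> None:
        self.hours[customer] += hours

    def bill(self, customer: str) -> float:
        # Charge only for recorded use; no up-front, per-seat license fee.
        return self.hours[customer] * self.rate_per_hour


meter = UsageMeter(rate_per_hour=0.50)
meter.record("state_dmv", 1_200)
meter.record("county_hr", 300)
print(f"state_dmv owes ${meter.bill('state_dmv'):,.2f}")  # $600.00
print(f"county_hr owes ${meter.bill('county_hr'):,.2f}")  # $150.00
```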

Perhaps the biggest concern among customers is the standards issue. Few public-sector entities are standardized on a single operating system, for example. Some agencies may use IBM's AS/400; others might have Sun's Solaris, another version of UNIX, Microsoft's Windows NT or even Linux. Trying to align utility computing with different operating systems could be a problem. The need for standards will also extend into other areas, such as Web services.

Others argue standards are far less of an issue than some believe. There are plenty of standards already in place for the Internet, Web services and the grid, according to Wladawsky-Berger. "Because everything today is based on standards, you don't have to do it all at once," he said, referring to the deployment of a utility infrastructure. "You don't have to do a massive architecture. Start with something small, and let it build incrementally, knowing with full confidence the pieces will fit with each other."

Tod Newcombe  |  Senior Editor

With more than 20 years of experience covering state and local government, Tod previously was the editor of Public CIO, e.Republic’s award-winning publication for information technology executives in the public sector. He is now a senior editor for Government Technology and a columnist at Governing magazine.