Witnessing an Evolution

The combination of grid computing and utility computing just might be the next big thing.

Utility computing, to some observers, is part of the Internet's coming evolutionary advance and the logical future of how we operate. Also called on-demand computing, it lets users tap more computational power while owning and maintaining less hardware of their own.

Numerous grid and utility computing endeavors have been established in academia and science, and the West Virginia High Technology Consortium (WVHTC) Foundation and the Minnesota Historical Society (MNHS) are already taking advantage of utility computing's potential.


Foundation
Most experts agree that grid computing will let utility computing truly flourish.

With inherent Orwellian overtones, "the grid" sounds shadowy and sinister. In reality, a grid is a colossal network of computers that allows individual machines to share computational capacity with others on the network -- in essence, creating one massive computer.

Grid users can access applications, tap the combined processing power of thousands of computers, and perform extremely high-level computations from their own machines. They need not invest in powerful mainframes because a grid uses idle desktops and servers to create a virtual mainframe of almost limitless potential power.

Carl Kesselman, director of the Center for Grid Technologies at the University of Southern California's Information Sciences Institute -- and one of the grid's pioneering researchers -- said he believes grid computing will change the way people work and interact with each other.

"The grid is an infrastructure that allows the controlled sharing of diverse resources to facilitate collaborative activities," he explained. "It's the idea of a virtual organization. On the Web, everything looks like a document, and you are basically just sharing documents. The grid is about understanding the process in back of that -- understanding that there is capability and learning to manipulate that. It's a very powerful idea."


Inception
Grid computing came to life in 1995 through the Globus project, which developed the basic mechanisms and infrastructure for grids, and created the Globus Toolkit -- the underlying infrastructure used by most major grid projects.

Initially the Defense Advanced Research Projects Agency funded the Globus project, and the U.S. Department of Energy and the National Science Foundation provided later funding. The project led to the creation of the Globus Alliance, a research and development organization now working to create open source, standards-based software to enable development of grids worldwide -- and worldwide grids.

Other organizations, such as the Global Grid Forum, seek to develop standards and best practices for grids. The Global Grid Forum also hosts workshops around the globe that bring researchers and industry professionals together to discuss the grid's future.

The grid comes in two versions. As is often the case with new technologies, it started as an academic resource and later begat a commercial offshoot. Presently academia and the private sector have their own grids, and emerging between the two is a government grid.

One famous grid, SETI@home, is an academic grid disguised as a screen saver. SETI, the Search for Extraterrestrial Intelligence, analyzes radio signals from deep space, hoping one will prove artificial and show there is life beyond Earth.

SETI generates huge amounts of data, and processing such a massive volume is no simple task. SETI@home acts as a data-analysis program that takes advantage of the processing power of idle desktops around the world. Users of SETI@home may not know it, but they are part of one of the first and largest grids.

"When a problem is too hard for one computer, you can slice it up, give it to lots of different computers and bring those answers back together to solve it," said Tim Hoechst, senior vice president of technology for OraclePublic Sector. "A great example of this is SETI@home. We call these 'academic grids' because in academia, they are building large arrays of computers to address computationally difficult problems."

On the flipside of academic grids are those used for commercial applications. The technology is the same, but applied differently.

"We use the term 'enterprise grid,'" said Hoechst. "For us, that means multiple computers sharing the same disk. To an application, these computers look, smell and act like one computer, but in reality, they're multiple computers cooperating."

With academia and the private sector creating, or hoping to create, grids of their own, where does that leave government agencies? The trend toward resource sharing and consolidation is evident, and the grid has the potential to create functional, efficient IT environments.

The problem is that most agencies have program-specific infrastructure. Some rely on legacy mainframe systems that only a few people know how to manage. Creating a grid would require administrators not only to share control of the resources, but also to manage a beast that can grow quickly and wildly.

That is, unless someone else could manage it.


Buying Time
Utility computing is very much like setting up a grid, except someone else sets it up and charges for its use.

"Sometimes the terms 'grid' and 'utility' get kind of muddied," said Sara Murphy, HP's marketing manager for grid computing. "Utility computing is a model of how you pay for computing resources. It's purchasing computer resources in a pay-per-use model. Grid and utility are complementary concepts. The grid is the infrastructure for sharing resources. Utility is the concept of paying for what you need."

In the real world, grids providing utilities such as gas, electricity or water were created to supply on-demand access to consumers who want to use those services and will pay for them.

Utility computing is no different. The concept should especially appeal to government agencies that experience seasonal spikes in demand, spikes that require more power but may not justify new hardware, or hardware the agency simply can't afford.

George Westerman, a research scientist at MIT's Center for Information Systems Research, said this notion of utility computing makes the concept valuable.

"If you have demand that varies greatly, like at the end of the month but not at the beginning of the month, you can buy processing power for when you need it, instead of having a lot of computers sitting around doing nothing," Westerman said. "That's the key value proposition for utility computing. In addition, somebody else is managing your computers so they're going to work right."

Another benefit of a grid-based utility model is that there is safety in numbers. If one, five or even 100 computers go down, the remaining computers work together to make up for the loss.

"When a big machine fails, it has failed," said Hoechst. "In the grid, you replace a node. The grid itself never goes down."

Utility computing allows users to figuratively flip a switch and access vast computing resources only when they need them. The model is similar to the way wireless phones access the Internet: some providers charge on a per-kilobyte basis, so users pay only for what they use.

If someone has an instantaneous need where a ton of processing is required, he or she can access the grid, grab everything he or she needs for a second, and then disconnect from the grid, Westerman explained. "If I just have an hour's worth of work but it takes a thousand computers to do it, I don't need to have those thousand computers." The user can connect to the grid for an hour, he said, and then disconnect when the work is complete.

Utility computing should be particularly appealing in times of lean budgets and high demand. Mark Forman, former administrator of the federal Office of Management and Budget's E-Government and Information Technology Office, said he believes the approach presents an extraordinary cost-savings platform for governments at all levels.

"The cost-savings from taking advantage of grid computing in an on-demand environment are huge," said Forman. "Most local governments are strapped for money and would love to take advantage of other people's assets on information and applications."

Dan Hushon, chief technologist for Sun's Strategic Development Business Unit, said government agencies can save substantial amounts of money with the utility computing model.

"The government's average cost of delivering IT is somewhere in the $9-$18 per-CPU-hour range," Hushon said. "Here, we are at $1 per CPU-hour. If you decide it's cheaper to rent or simply utilize computer space rather than buy it and operate it yourself, you have that option."


Practical Purposes
Two states have launched projects implementing grid-based utility computing. Each has its own goals, but both harness the power and potential of on-demand computing in a grid environment.

Robert Horton is a state archivist and head of the Collections Department at the MNHS, which is working with the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, to test whether the SDSC can host terabytes of spatial data from the MNHS while simultaneously allowing access to any requested data.

"The primary data is digitized material," said Horton. "Surveys and maps from the 19th century, for example. They are big maps, very high-quality with very high resolution."

The SDSC plays the role of utility provider by hosting the data and making it accessible to those who request it. Horton admits the MNHS staff is not technologically sophisticated enough to manage such large data sets. By creating an on-demand, grid-based environment, Horton hopes to someday manage all of the society's data within a virtual organization.

"We're archivists, not technology experts," said Horton, adding that government won't ever likely have expertise equivalent to the private sector or higher education with these types of networks. "Governments have primary business functions, and that's what they know best. They need some collaboration along these lines to manage the data they create and use."

At the WVHTC Foundation, CEO Jim Estep is creating the Global Grid Exchange to boost West Virginia's position in the academic world while simultaneously luring new business to the state.

"The idea is that we put our computing nodes on all the various computers that the state has in its inventory," Estep said. "In turn, the state can use the computational capability of our grid for their work. They can leverage the grid and save, we hope, millions of dollars."

But bringing new life to West Virginia's economy is the primary goal of the Global Grid Exchange. Staff members are applying the utility model themselves, essentially reversing how a standard government utility model looks.

"We hope the big bang for the buck is going to be the businesses that spend millions every year doing computations," said Estep. "We as a state want to offer a package to them to improve their margins and bottom lines with this resource we have built. We can use that as an enticement for businesses to locate in West Virginia, thus creating jobs for the people."

By making use of existing state resources, the Global Grid Exchange can create tremendous new business opportunities that benefit all of West Virginia.

"This is, for all intents and purposes, the movement of computation into the utility environment," said Estep. "We are basically building a utility. In the same way electrical grids are organized, you'll see our grid organized."

Chad Vander Veen is a former contributing editor for Emergency Management magazine, and previously served as the editor of FutureStructure and the associate editor of Government Technology and Public CIO magazines.