Utility computing, to some observers, is part of the Internet's coming evolutionary advance and the logical future of how we operate. Also called on-demand computing, it lets users tap greater computational power as needed while owning and maintaining fewer resources of their own.

Numerous projects are under way in academia and the sciences, and the West Virginia High Technology Consortium (WVHTC) Foundation and the Minnesota Historical Society (MNHS) are already taking advantage of utility computing's potential.

Foundation

Most experts agree that grid computing will let utility computing truly flourish.

With inherent Orwellian overtones, "the grid" sounds shadowy and sinister. In reality, a grid is a colossal network of computers that allows individual machines to share computational capacity with others on the network -- in essence, creating one massive computer.

Grid users can access applications, tap the combined processing power of thousands of computers and perform extremely demanding computations from their own machines. They need not invest in powerful mainframes, because a grid uses idle desktops and servers to create a virtual mainframe of almost limitless power.

Carl Kesselman, director of the Center for Grid Technologies at the University of Southern California's Information Sciences Institute -- and one of the grid's pioneering researchers -- said he believes grid computing will change the way people work and interact with each other.

"The grid is an infrastructure that allows the controlled sharing of diverse resources to facilitate collaborative activities," he explained. "It's the idea of a virtual organization. On the Web, everything looks like a document, and you are basically just sharing documents. The grid is about understanding the process in back of that -- understanding that there is capability and learning to manipulate that. It's a very powerful idea."

Inception

Grid computing came to life in 1995 through the Globus project, which developed the basic mechanisms and infrastructure for grids, and created the Globus Toolkit -- the underlying infrastructure used by most major grid projects.

Initially the Defense Advanced Research Projects Agency funded the Globus project; the U.S. Department of Energy and the National Science Foundation provided later funding. The project led to the creation of the Globus Alliance, a research and development organization now working to create open source, standards-based software to enable development of grids worldwide -- and worldwide grids.

Other organizations, such as the Global Grid Forum, seek to develop standards and best practices for grids. The Global Grid Forum also hosts workshops around the globe that bring researchers and industry professionals together to discuss the grid's future.

The grid comes in two forms. As is often the case with new technologies, it began as an academic resource and later spawned a commercial offshoot. Today academia and the private sector each have their own grids, and a government grid is emerging between the two.

One famous grid, SETI@home, is an academic grid disguised as a screen saver. SETI, the Search for Extraterrestrial Intelligence, analyzes radio signals from deep space, hoping one will prove artificial and show there is life beyond Earth.

SETI generates huge amounts of data, and processing it is no simple task. SETI@home is a data-analysis program that harnesses the processing power of idle desktops around the world. Users of SETI@home may not know it, but they are part of one of the first and largest grids.
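In miniature, SETI@home's approach is a scatter/gather computation: slice the data into independent chunks, farm the chunks out to whatever machines are idle, and merge the partial answers. The sketch below illustrates the pattern in Python on a single machine, with a local process pool standing in for the grid's thousands of desktops; the sample data and the analyze_chunk scoring function are hypothetical placeholders, not SETI@home's actual analysis code.

    # A minimal sketch of the slice-up-and-recombine pattern. A local
    # process pool stands in for the many machines a real grid would use,
    # and analyze_chunk is a hypothetical stand-in for real signal analysis.
    from multiprocessing import Pool
    import random

    def analyze_chunk(chunk):
        # Hypothetical analysis: return the strongest "signal" in this slice.
        return max(chunk)

    def grid_style_search(samples, workers=4, chunk_size=10_000):
        # Slice the problem into independent pieces ...
        chunks = [samples[i:i + chunk_size]
                  for i in range(0, len(samples), chunk_size)]
        # ... hand the pieces to many workers at once ...
        with Pool(workers) as pool:
            partial_answers = pool.map(analyze_chunk, chunks)
        # ... and bring the partial answers back together.
        return max(partial_answers)

    if __name__ == "__main__":
        data = [random.random() for _ in range(100_000)]
        print("strongest signal:", grid_style_search(data))

On a real grid, the local pool would be replaced by middleware -- such as the Globus Toolkit described above -- that schedules the chunks onto idle machines scattered across the network.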

"When a problem is too hard for one computer, you can slice it up, give it to lots of different computers and bring those answers back together to solve it," said Tim Hoechst, senior vice president of technology for OraclePublic Sector. "A great example of this is SETI@home. We call these 'academic grids' because in academia, they
