Imagine an employee who spends 90 percent of the workday wandering the halls and taking catnaps. In most organizations, that person would be sent packing before closing time on Friday. But not so in data centers. Governments all over the world keep them packed with servers that are kind of like this hypothetical slacker - they don't live up to their potential.

"Average CPU utilization is probably somewhere in the range of 5 percent to 7 percent, maybe 10 percent to 12 percent on a server that's really being taxed," said Tennessee CIO Mark Bengel.

Just as each person in an office - productive or not - commands a desk and chair, a paycheck and benefits, each server box in a data center takes up floor space, arrives with an initial price tag, and generates ongoing costs for software licensing, power, cooling and maintenance. These costs accrue whether the box performs at 5 percent of capacity or at 75 percent.

The desire to get a much bigger return on hardware investments is one reason some government organizations turn to virtualization technology. Virtualization uses software to simulate multiple entities inside one physical server box. Each entity has its own identity and configuration, and it performs as though it were a real, standalone machine.

Virtualization can mean, for example, creating 20 independent servers in one physical box, each with its own operating system and configuration, and the ability to run a separate application. It might mean delivering a dozen desktop PCs from one server to 12 thin clients, or dividing the resources of one network infrastructure into what looks like three entirely separate networks.
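To make the server-consolidation case concrete, the sketch below shows how an administrator might enumerate the virtual machines packed onto a single physical host. It is purely illustrative and not drawn from any system described in this story: it assumes a KVM/QEMU host with the libvirt Python bindings installed, and the connection URI is the library's conventional default rather than anything specific to a government data center.

```python
# Illustrative sketch only: assumes a KVM/QEMU hypervisor host with the
# libvirt Python bindings installed (pip install libvirt-python).
# Nothing here reflects the actual environments mentioned in this story.
import libvirt


def list_guests(uri="qemu:///system"):
    """Print each virtual machine defined on one physical host,
    along with its state, virtual CPU count and memory allocation."""
    conn = libvirt.open(uri)  # connect to the local hypervisor
    try:
        for dom in conn.listAllDomains():
            # info() returns [state, max memory, memory, vCPUs, CPU time]
            state, _max_mem_kib, mem_kib, vcpus, _cpu_time = dom.info()
            running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
            print(f"{dom.name():20s} {running:8s} "
                  f"{vcpus} vCPU(s)  {mem_kib // 1024} MiB RAM")
    finally:
        conn.close()


if __name__ == "__main__":
    list_guests()
```

Each guest listed by a script like this shares the one box's processors, memory and power draw - the same idle capacity that would otherwise sit unused in 20 separate machines.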

Merrill Douglas  |  Contributing Writer