Walk into any public-sector data center today, and you'll likely see the same thing: rows upon rows of racks that hold servers, servers and more servers (though these days, there are far fewer racks and servers thanks to virtualization).
And to help maximize energy efficiency and keep the room cool, governments often utilize the hot aisle/cold aisle layout, in which racks are lined up so that cold-air intakes all face one way while hot-air exhausts face the other. The rows of rack fronts are the cold aisles and typically face air conditioner output ducts; the rows the heated exhaust pours into are the hot aisles, which typically face air conditioner return ducts.
In September 2012, The New York Times reported that a yearlong examination revealed that "most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner."
While the public sector aims to control these costs with methods such as the hot aisle/cold aisle layout, cooling this equipment remains a concern -- and there's now a new solution: housing that equipment in vats of mineral oil.
By submerging system components in oil, heat can be dispersed far more efficiently than through air, says Andy Price, director of business development for Green Revolution Cooling, an Austin, Texas-based company that's dedicated to changing the way data centers are cooled. And oil cooling, he said, is particularly effective when it comes to high-density data centers, which is why the company’s technology is gaining interest from both private and public industry.
Environmental problems such as dust and extreme temperatures can be solved with oil cooling, while power consumption can be reduced by 40 to 45 percent, Price said. A server room on a military forward operating base, with its extreme conditions, illustrates what oil cooling has to offer. And though the military makes a good example of where the technology is useful, Price notes that everyone can benefit from oil cooling, particularly in an era of budget constraints.
Reduced energy consumption means lower operating costs, but the initial investment to build a data center can be reduced, too. “They have to manage their costs,” he said. “If they’re tasked with building a new data center, or even retrofitting, they have to look at the equipment. And our solution, from a capital standpoint, is less expensive than building out a traditional air-cooled data center.”
Cooling computer equipment with air requires air flow management systems, specialized rooms and raised floors, as well as additional generators and uninterruptible power supplies (UPS) to support the air cooling systems. But when using oil cooling, Price said, “those things can typically be cut in half, and generators and UPS’s are a significant expense.”
By making simple modifications to traditional computing equipment, old servers and equipment can be used in an oil-cooled system. Cooling fans are removed, and thermal paste is replaced with indium foil.
And there are several solutions for managing storage devices. Hard disk drives (HDDs), for instance, once had to be sealed before immersion, Price said, but some drives are now sold pre-sealed, like the helium-filled drives from Hitachi -- or solid-state drives (SSDs) can be used instead. Alternatively, HDDs can be mounted outside of the fluid, attached to heat sinks that are submerged in the oil.
As with most new technologies, oil cooling has its detractors: Some are understandably hesitant to believe that submerging computer parts in liquid is a good idea. But Price said the technology is now beyond the testing period. “The technology works,” he said. “We’re beyond the point where we have to demonstrate that servers can survive in a dielectric fluid and that they’re actually more reliable.”
The oil isn’t just safe for components; it’s safe for people too, Price said. “It’s not a harmful solution, it’s very, very safe for humans to be exposed to,” he said. “It’s baby oil without the fragrance. It’s safe for human exposure, even safe for human consumption.”
At 104 degrees Fahrenheit, it might even be good for the skin if someone were to, say, take a bath with the servers, he said.
At the end of 2012, Intel completed a yearlong test to measure the benefits of Green Revolution’s oil cooling system -- and the semiconductor giant endorsed the technology.
“We can reduce cooling energy use by 90 to 95 percent while also reducing server power by 10 to 20 percent," Intel reported upon completion of the pilot. (And the company is reportedly continuing to evaluate the long-term viability of the technology to see how data center costs might be reduced.)
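To see how those two figures translate into total facility savings, here is a rough back-of-the-envelope sketch. The baseline numbers (a 100 kW server load with 60 kW spent on air cooling) are illustrative assumptions, not data from the Intel pilot; the reductions use the midpoints of the ranges Intel quoted.

```python
# Back-of-the-envelope estimate of facility-level savings from oil cooling.
# Assumed baseline (illustrative, not from the Intel pilot):
IT_LOAD_KW = 100.0   # total server (IT) power draw
COOLING_KW = 60.0    # power spent on air cooling at that load

# Reductions reported in Intel's pilot (midpoints of the quoted ranges):
COOLING_CUT = 0.925  # "90 to 95 percent" less cooling energy
SERVER_CUT = 0.15    # "10 to 20 percent" less server power (fans removed, etc.)

baseline_total = IT_LOAD_KW + COOLING_KW
oil_it = IT_LOAD_KW * (1 - SERVER_CUT)        # servers draw less without fans
oil_cooling = COOLING_KW * (1 - COOLING_CUT)  # pumps replace CRAC units

oil_total = oil_it + oil_cooling
savings = 1 - oil_total / baseline_total

print(f"baseline: {baseline_total:.1f} kW, oil-cooled: {oil_total:.1f} kW")
print(f"overall facility power reduction: {savings:.0%}")
```

Under these assumptions the overall reduction works out to roughly 44 percent, which is consistent with the 40 to 45 percent figure Price cites; the exact number depends on how much of a facility's power goes to cooling in the first place.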
While the technology is best suited to such places as research facilities, national labs, military bases, weather-modeling centers and national weapons research labs, Price said scale is not the main factor driving cost savings. The savings, he said, come from power density -- the denser a data center, the more benefit oil cooling confers.
Though no public-sector entities are known to have deployed this system of cooling just yet, some are keeping an eye on it.
Officials at the city of Sacramento, for instance, said they recently learned about the technology, and though they're not eager to become early adopters, CIO Gary Cook said it looks promising ... though there are some perceived drawbacks. “I’d hate to work on the machine after you pull it out of the mineral oil," he said. "It’s going to be a mess.”
Despite that, he admitted that even a 5 percent savings on cooling overhead could make the technology an attractive investment. The city manages a server room of about 40 to 50 server racks, a footprint that has been shrinking thanks to server virtualization. “We’re about 65 percent virtualized right now,” said Darin Arcolino, IT manager of technical infrastructure.
Virtualization offers many of the same benefits as oil cooling, such as power savings and a smaller equipment footprint, but Cook estimated the newer technology could someday become standard. “Five or 10 years ago, people weren’t virtualizing servers and now it’s the norm,” he said. “So this could be the next generation of the norm for cooling systems.”
Photo via Green Revolution Cooling