Back in 2011, the U.S. Office of Management and Budget (OMB) announced that consolidation of federal data centers could reduce their number by 40 percent and save the government billions of dollars -- and while that goal remains, the path to it has taken a turn.

On March 27, U.S. CIO Steven VanRoekel announced in a blog post a revision to PortfolioStat, the OMB's approach to analyzing and overhauling federal IT.

These revisions, he said, are the result of lessons learned over the past 18 months. Agencies were following the letter of the initiative, but they were taking shortcuts rather than embracing its spirit.

“It wasn’t creating the right incentive structure to get to the right end product I wanted, which wasn’t just closures,” he said in an interview with Government Technology. “What ends up happening is you get people that will just grab little data centers and drop them into a bigger room. Or you get two data centers that are literally next door to each other where someone just takes down the wall between them and two become one and you’ve ‘closed one.’”

To make matters worse, agencies involved in the consolidation efforts were not cooperating fully. A U.S. Government Accountability Office report released last year showed that most agencies were not reporting complete inventory and server information. Identifying what assets each agency had was a crucial step in the consolidation process outlined by PortfolioStat. And PortfolioStat 2.0, VanRoekel said, is intended to fix these problems and keep the government on track to meet its 2015 goal of closing 820 data centers.

Rather than focus on data center “consolidation,” VanRoekel said, the priority needed to be shifted. “IT and datacenters are not the end you’re trying to achieve -- it’s a means to a broader end," he said. "The end you’re trying to achieve is around the mission of the agency. How am I driving better service to Americans? How am I saving money doing that? And how am I increasing the productivity of the employees?”

So VanRoekel and his team identified nine metrics for data centers, including energy footprint, utilization and cost of ownership. Data centers that could show they met six of the nine metrics would become "core" data centers, toward which other data centers would gravitate. The goal, VanRoekel said, is no longer just to close data centers, but to optimize what remains and improve service.
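The six-of-nine threshold described above amounts to a simple scoring rule. As a minimal sketch (the metric names beyond the three mentioned in the article are placeholders, not OMB's actual rubric):

```python
# Hypothetical sketch of the "core" data center test: a facility that
# meets at least six of the nine optimization metrics qualifies as core.
# Only energy footprint, utilization and cost of ownership are named in
# the article; the remaining metric names here are illustrative placeholders.

def is_core(metrics_met: dict, required: int = 6, total: int = 9) -> bool:
    """Return True if the center meets at least `required` of `total` metrics."""
    if len(metrics_met) != total:
        raise ValueError(f"expected {total} metrics, got {len(metrics_met)}")
    return sum(metrics_met.values()) >= required

# Example: a center meeting seven of the nine metrics qualifies as core.
example = {
    "energy_footprint": True, "utilization": True, "cost_of_ownership": True,
    "metric_4": True, "metric_5": True, "metric_6": True, "metric_7": True,
    "metric_8": False, "metric_9": False,
}
print(is_core(example))  # True: 7 of 9 metrics met
```

The point of the rule is that a data center need not excel on every dimension; clearing a majority of the optimization targets is enough to anchor consolidation.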

VanRoekel's position, he said, is grounded in statute, meaning his authority comes from what the law allows him to do. Two of his powers, he said, are setting policy and budgeting funds across government, both of which are ways of getting other agencies to cooperate. Perhaps even more importantly, he said, he tried to make senior leadership at other agencies understand that data center optimization is central to U.S. government technology.

And folding cloud computing into data center upgrades was intended to be a seamless part of the process, one that would support other federal programs such as the Federal Risk and Authorization Management Program (FedRAMP), the government's effort to institute a standardized approach to cloud services across agencies.

PortfolioStat originally identified $2.4 billion in cost savings, $300 million of which has been realized so far, VanRoekel said. The revisions to the approach should accelerate progress toward the rest.

PortfolioStat will also continue to be updated as more lessons are learned. "We've closed 420 data centers to date; we're on track to close over 800 by the end of this fiscal year, which is this fall," he said.

While federal data centers will continue to close, VanRoekel said the larger change will be a shift in the federal government toward service orientation and cloud computing. "And part of this motion just won't be about closures and consolidation," he said, "it's going to be about when we build the next generation of applications, either citizen-facing or internal to government, we do that in a modular way."

The government will focus on open data, machine-readable data and new uses of APIs, he said. "That's what will come as products of this work, and I think the mission delivery side will be better, faster and cheaper because of it."


Colin Wood  |  Staff Writer

Colin has been writing for Government Technology since 2010. He lives in Seattle with his wife and their dog. He can be reached at cwood@govtech.com.