
Has the Time Come for Software-Defined Data Centers?

Completely virtualized data centers may be the future for large, sophisticated jurisdictions. But they’re still a few years away.

Government agencies have been steadily consolidating and in some cases outsourcing their IT infrastructure over the last several years, seeking improved efficiency and better performance. And while that activity continues, some organizations have their sights set on the next phase of the technology evolution — the virtualization of entire data centers.

While many would imagine the topography of data centers to consist of rows of server racks with cables snaking among hundreds, if not thousands, of machines, there’s a new vision of how those centers should be run and organized. Instead of having people physically plug in wires based on client needs, there’s a push to revolutionize data centers so those changes can be made remotely.

Welcome to the concept of software-defined data centers (SDDCs). The clinical definition sounds simple: a data center where all infrastructure, across the four major layers of compute, storage, network and security, is virtualized and delivered as a service.

Thanks to software-defined networking, most data center tasks can now be done from a keyboard. Once servers are plugged in and installed, all the configuration that previously was done manually can be completed from a single desktop.
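The core idea behind that keyboard-driven model is declarative configuration: operators describe the infrastructure they want, and software reconciles the live environment against that description. The sketch below is a minimal, hypothetical illustration of such a reconcile step; the resource names and specs are invented for the example and do not reflect any particular SDDC product's API.

```python
# Hypothetical sketch of declarative, software-defined provisioning:
# infrastructure is described as data, and software (not a person with
# a screwdriver) works out the actions needed to match that description.

def plan_changes(desired, current):
    """Compare a desired spec against the current state and return
    the provisioning actions needed to reconcile them."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name, spec))
        elif current[name] != spec:
            actions.append(("update", name, spec))
    for name in current:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

# Illustrative resources only: a virtual network and a virtual machine.
desired_state = {
    "web-vlan": {"type": "network", "vlan": 210},
    "app-server": {"type": "vm", "cpus": 4, "ram_gb": 16},
}
current_state = {
    "web-vlan": {"type": "network", "vlan": 200},  # drifted from the spec
}

for action, name, spec in plan_changes(desired_state, current_state):
    print(action, name, spec)
```

In a real deployment, each planned action would be sent to the data center's management API rather than printed; the point is that the change is computed and applied remotely, in software.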

Chris McClendon, technology services officer for the Georgia Technology Authority, got a glimpse of the future during a visit to one of Microsoft’s Azure data centers in Virginia. The data center needed people to “run around with crash carts” replacing hard drives, but the real technical work was done at the company’s headquarters in Washington state, he said. “All the administration, all the logical stuff is done out in Redmond [Wash.].”

Although growing use of virtualization is pushing data centers in this direction, experts say fully functioning SDDCs are still a couple of years away for government agencies.

Adoption Challenges

Utah and Georgia have started the transition toward running SDDCs, but neither state is close to full deployment. Utah completed the virtualization of its compute layer, but the state is still actively testing technologies in the other three layers, leaving it only about a quarter of the way to completion.

What’s In an SDDC?

In a software-defined data center (SDDC), all infrastructure is virtualized and delivered as a service. Generally, it’s made up of a business logic layer and three or four core components:

  • Compute
  • Storage
  • Network
  • Security

Hardware configuration of a software-defined data center is done remotely, using software. This is different from traditional data centers, where infrastructure is made up of physical hardware and devices are set up manually.
Georgia’s data center is split in half. Steve Nichols, the state’s chief technology officer, explained that when you walk in the door, the left side is dedicated to legacy equipment and the right side is “all the new stuff” that’s controlled remotely. Like Utah, however, Georgia has completed only a fraction of the changes needed to launch a true SDDC, although the state has been running a software-defined network for years.

Nichols said the new data center technology is a tradeoff. While the state doesn’t need personnel to physically stand up new servers because it can be done virtually, there’s been an increase in lead time and planning to determine just how to make it all happen remotely. “It hasn’t necessarily made things go faster for us,” Nichols said. “It’s just sort of shifted the work around.”

Bob Woolley, Utah’s chief technical architect, added that the subject is complex for government agencies, because there are policy and skill issues to address, along with the four separate layers IT personnel need to consider. It could take up to a decade for some governments to reach a full SDDC implementation.

“You just can’t flip the switch,” Woolley said. “The compute layer is already happening, as virtualization is pretty common now. The storage layer will probably start happening this year, and the network layer isn’t going to be too far behind. But states have huge investments in network infrastructure. I think I have $16 million worth of switches; you just don’t throw that away. It’s a practical matter.”

Moving to an SDDC can provide a variety of benefits, but not necessarily immediate cost savings. The primary gain is the same as virtualizing any component of an agency’s IT environment — flexibility, according to Miguel Gamino, CIO of San Francisco.

Many experts use the word “scalability” to describe SDDCs, but Gamino thinks that’s too limiting a term. “I refer to it as ‘elasticity,’ because it also allows you to shrink your footprint as dynamically as you can grow it,” he said. “It’s a bidirectional opportunity that lets you be more efficient with how you’re using your infrastructure and resources.”

Financially, the jury’s still out on whether governments will save a significant amount of money by moving to an SDDC. Utah has saved more than $4 million by virtualizing the compute layer and expects similar savings as it progresses to the other layers.

But Georgia isn’t pursuing an SDDC for the return on investment, Nichols said. Instead, it’s about getting the state’s data center to a place where everything can be managed remotely and homogeneously.

Both Nichols and Woolley said the transition to an SDDC will have an impact on the IT workforce.

“We need smarter people who really understand routing and … we needed those people to begin with,” Nichols said. “But it sort of changes the work so that instead of them designing it and having some guys with screwdrivers in their back pockets running around on the floor, then they also become the guys who are typing the commands in the console to make it so.”

Woolley agreed that moving to an SDDC requires an upfront investment and an “upskilling” of IT staff. But he also said most public-sector technologists already have many of the skills necessary to make the move.

From an architectural standpoint, there’s opportunity to think very differently about what the IT agency does. That shift may already be happening, Woolley said, noting that most governments are going to wind up with hybrid computing environments that include both on-premises and cloud-based resources.

That combination begins to look “an awful lot like a software-defined data center if provisioned correctly,” he said. “I think the migration is more natural than unnatural, but it’s going to take time.”

It’s a mistake to consider SDDC and cloud strategy as separate issues, Woolley added. “CIOs are focusing on how they get their customers and themselves to use the cloud and how to make it make financial sense. They’re not thinking about rebuilding infrastructure on the SDDC level. They look at that as something to do later; I think that lends itself to a lot of potential errors.”

Gamino likened the transition to software-defined data centers to the changeover to voice over Internet protocol (VoIP) phone service. He recalled that when VoIP went mainstream with government agencies, “classically trained” telecom engineers had to choose whether to retrain or retire. Those working for Gamino when that happened were able to not only make the change, but also to thrive in the new environment. “It’s a matter of the attitude and appetite of the workforce itself, rather than a question of whether the people are competent enough to learn new skills.”

Good Candidates

So which governments should look to make the move to an SDDC? It depends on size and maturity. These deployments may make the most sense for very large, sophisticated organizations.

“We’re formal, but there are people a lot more formal than us,” Nichols said. “We’re large, but there are others hundreds of times larger than us. So we’re kind of on that border of getting some value, but we’re not in a situation of a Google, where they might have to do something like this because it’s the only way they can scale to being able to manage millions of servers.”

Woolley cautioned that there are other challenges to consider too. First and foremost are the impacts on an agency’s budgetary environment. Once services are automated, chargebacks, metering and billing will change, because the various IT components are no longer controlled in the same way.

Georgia’s McClendon said individual organizations either need to work toward moving their IT operations into a managed service provider environment or they need to keep it fairly simple. He added that pursuing an SDDC will depend on just how much an organization believes it can handle work-wise.

Despite those questions, Gamino appears to be all-in. He plans to get there in the next couple of years, saying San Francisco’s IT maturity puts it at the front edge of the curve. “We’ll let the private sector and other people nick their fingers on the bleeding edge, but [we’re] right behind that,” Gamino said.

Utah and Georgia are moving closer to having operational SDDCs. But neither Woolley nor Nichols would reveal just when they thought their respective shops would get there. It’ll likely be a few years. 

Brian Heaton was a writer for Government Technology and Emergency Management magazines from 2011 to mid-2015.