Governments at every level are looking into moving information, tools and processes to the cloud, but public-sector officials warn against writing off data centers anytime soon.
The public sector is still finding new ways to use the cloud more than a decade after its arrival, a process that may roll on for years. But even agencies with cloud-forward policies have found reasons why brick-and-mortar data centers still make sense — a parallel narrative that is also likely to continue.
In a series of recent interviews with Government Technology, the most consistent theme from state, county and local public officials was that, while it’s worth considering what other agencies have done, a one-size-fits-all approach is probably not the best when weighing a move to the cloud.
Or, as Steve Reneker, RIVCOconnect Chief Broadband Officer at Riverside County, said recently: “Every organization is a little bit different on where they are and what their needs are.”
Riverside County has invested around $5 million in an effort to consolidate “a little over” 50 data centers agencywide into its central data center, said Reneker, the county's former CIO who is spearheading efforts to bring broadband to more than 2.3 million residents. As that initiative takes shape over roughly the next five years, Reneker estimated the agency will save around $12 million in operational and related costs.
The cloud itself “really is a room full of servers,” said Jeff Gomes, a traffic engineer at the Massachusetts Department of Transportation (MassDOT), which has begun using solutions from private-sector provider Miovision to gauge traffic flow in affected areas. Having Miovision host that data means MassDOT has less to worry about in the information technology realm, Gomes said, and the agency is exploring migrating more of its data to the cloud.
“Right now, that’s where all of our systems are heading, toward the cloud,” Gomes said. “Ideally we would like to host our own data servers because you know, then, at least, it’s secure.”
But other public officials cautioned against automatically assigning a higher security value to cloud-based storage. Ann Dunkin, chief information officer of Santa Clara County, said security in the cloud “really isn’t any different” than its on-prem cousin, with some providers and agencies doing it well and others not.
“Going to the cloud relieves a certain level of administration, but it doesn’t mean that we get to walk away and forget about system administration and security,” Dunkin said in an email.
Some data at some organizations — in Santa Clara County, information pertinent to its hospital and 911 system — is mission critical, the CIO explained, and must be handled accordingly. But, she noted, cost can be a factor as provider agreements often make it “extremely expensive” to pull data back out of the cloud, a scenario that has given her pause.
Other solutions simply won’t be cloud-ready “in the foreseeable future,” necessitating ongoing on-prem support, Dunkin said. She believes large organizations providing essential services will “always” have some capabilities onsite and that, over time, “the majority” of workloads will move to cloud.
Boston’s former CIO Jascha Franklin-Hodge, who stepped down in January, said in an interview prior to his departure that he believes the city may be “in a transitional period” as it invests in cloud infrastructure, uses more software-as-a-service (SaaS) applications, and migrates legacy systems to the cloud.
The cloud has the potential, Franklin-Hodge said, to “be the place where all of our IT infrastructure lives.” But like Dunkin, he acknowledged there may be instances — as with the sheer volume of security and traffic camera footage generated in Boston — when storage in the cloud just isn’t economical.
“To do that in the cloud right now is not at all economic compared to streaming over a fiber network that we own and operate,” Franklin-Hodge said.
Former New York City Department of Information Technology and Telecommunications (DoITT) Commissioner Anne Roest said there are too many variables to determine which data center model is cheaper. Roest characterized data center solutions as “ever evolving” in an email interview shortly before her retirement and pronounced it difficult to speak in “absolute[s], because who knows where we’ll be in five to 10 years?”
She said DoITT partnered with its client agencies in 2008 on a five-year data center consolidation effort. That led to the construction of a “state-of-the-art” 10,000-square-foot data center, which established DoITT “as a centralized, cloud-like provider” to city agencies.
“There may be an argument for the city of New York continually maintaining its own data centers for specific systems or data,” the former commissioner said, pointing out that public safety systems built in “fortified, resilient locations” ensure continued operation during events that compromise power or connectivity.
And Oklahoma CIO Bo Reese said in a recent telephone interview that the state stood up a “very, very significant” data center in 2010 that he and others indicated has facilitated IT unification. Long-term, Reese said, “we see more and more adoption of cloud services.”
But, he said: “I still think this is a significant investment in the state of Oklahoma.”
Oklahoma Chief Information Security Officer Mark Gower said it’s likely that cost models will change as more agencies use cloud differently than providers intend, and noted that costs have already shifted with the rapid adoption of cloud during the past couple of years.
“You have to identify what information and what systems and what software, and what it makes sense for the state to maintain direct control of,” Gower said, describing the general thought process involved in migrating data to cloud or keeping it on-premises.
“What we’re doing through consolidation, what states are doing, [is] building their own government private cloud in their own way,” Gower added.