Virtualization is not easy for government, where the demands for seamless data access are typically layered upon numerous legacy systems – but it's a necessity.
A 2012 study of federal, state and local government IT decision-makers predicted virtualization could save government $30 billion by 2015. Whether that level of savings will be realized by the end of this year remains to be seen, but more government agencies are moving to virtualization, primarily to save money and increase efficiency.
“For governments, virtualization is becoming more of an imperative – it’s no longer a new cool capability to aspire to,” said John Lucker, a principal at Deloitte Consulting LLP and its Global Advanced Analytics Market Leader. “One reason is that government entities are being required to make information more readily available to various constituencies and stakeholders while increasing transparency and timeliness.”
Rick Wall, director of Information Services for the city of North Myrtle Beach, S.C., agrees.
“In government we are constantly being challenged by our user community to solve problems with technology,” he said. “As soon as we solve one problem for one group, there will be another group that has another problem that needs to be solved. We’ve been in an environment for the last six years where our workforce has not increased. So we’ve had the challenge of increasing service levels to the community through the use of technology.”
But virtualization is not easy for government, where the demands for seamless data access are typically layered upon numerous legacy systems – often across numerous agencies or functions with varied data technologies, data structures and schemas, security protocols, and maintenance cycles.
“The inherent diversity of government data and the loud cry for increased access to it means that public servants charged with making ‘the people’s data’ more available must use tools and methods that allow for data to be harmonized and shared without massive manipulation, normalization, and physical movement or consolidation,” said Lucker. “This is why data virtualization is so important – it allows all of these objectives to be met in a feasible, more cost effective way via virtual views from the primary data stores without massive data movement to new repositories.”
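The “virtual views from the primary data stores” that Lucker describes can be sketched in miniature with Python’s built-in sqlite3 module. The department names, tables and columns below are invented for illustration – a real deployment would use a dedicated data virtualization platform – but the principle is the same: two independent stores stay where they are, and a view queries them in place without copying data into a new repository.

```python
# Minimal sketch of a "virtual view" over two separate primary data
# stores, here stand-ins for two departments' systems. All names are
# hypothetical; the point is the pattern, not the platform.
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
permits_db = os.path.join(tmp, "permits.db")
tax_db = os.path.join(tmp, "tax.db")

# Primary store 1: a permitting department's system.
with sqlite3.connect(permits_db) as db:
    db.execute("CREATE TABLE permits (parcel_id TEXT, permit_type TEXT)")
    db.execute("INSERT INTO permits VALUES ('P-100', 'building')")

# Primary store 2: a tax assessor's system, maintained separately.
with sqlite3.connect(tax_db) as db:
    db.execute("CREATE TABLE assessments (parcel_id TEXT, assessed_value INTEGER)")
    db.execute("INSERT INTO assessments VALUES ('P-100', 250000)")

# The virtualization layer: attach both stores to one session and
# expose a combined view. No data moves to a new repository.
conn = sqlite3.connect(permits_db)
conn.execute(f"ATTACH DATABASE '{tax_db}' AS tax")
conn.execute("""
    CREATE TEMP VIEW parcel_overview AS
    SELECT p.parcel_id, p.permit_type, t.assessed_value
    FROM permits p
    JOIN tax.assessments t ON p.parcel_id = t.parcel_id
""")
rows = conn.execute("SELECT * FROM parcel_overview").fetchall()
print(rows)  # [('P-100', 'building', 250000)]
```

Constituents querying `parcel_overview` see one harmonized dataset, while each department continues to maintain its own system on its own cycle.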
North Myrtle Beach, a community of about 14,000, has been working on virtualizing the city’s servers for about seven years.
“It all started because we needed to provide mobile city workers with access to specialized applications and large databases,” Wall said. “One of the driving factors for us has been to empower employees – particularly mobile employees – with tools to do their jobs better and more efficiently.”
North Myrtle Beach chose VMware for the initial project, and has since been gradually consolidating a large number of individual physical servers onto a small number of physical hosts running virtual machines.
“Once we got started down the virtualization path, we quickly began making plans to virtualize the low hanging fruit,” he said. “We started quickly moving the servers over that we felt were going to virtualize easily.”
Over the years, North Myrtle Beach has virtualized desktops for Public Works employees and city building inspectors using VMware View. Those employees securely access databases and enter job-related updates throughout the day, letting them perform their job functions more efficiently. Beach Services auditors are also using iPads to conduct field audits of concession workers.
“It’s been an ongoing thing and I’m sure there will be more problems we’ll need to solve soon,” said Wall. “But it’s made us realize virtualization in government has really become a necessity.”
For those entities looking to move to virtualized environments, here are six recommendations for implementing data virtualization in government.
Agencies should first determine how the data is to be used by constructing a plan of short-, medium- and long-term objectives, logically sequenced so that services become available incrementally rather than years down the road.
For North Myrtle Beach, it didn’t happen quite that way, though officials were flexible and evolved their plans along the way.
“In a perfect world you definitely want to know where you’re going before you start moving,” said Patrick Sanders, systems administrator/virtual environment analyst for North Myrtle Beach. “We dove in head first. We knew we wanted to virtualize every server we could and we knew we wanted to utilize VMware, and that was pretty much our first use case. Then once it took off, we found other uses for it and it’s evolved year to year. We did have a plan in mind at first, but it was very flexible.”
Before proceeding down the virtualization path, government agencies should inventory the necessary data and understand where primary data resides, how it is maintained, and its accuracy, currency, history and retention requirements.
Next, load the specifics of that data into a data virtualization platform to compose its metadata.
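The inventory and metadata steps can be partly automated. Below is a hedged sketch of cataloging what tables and columns a primary store exposes – raw material for a virtualization platform’s metadata layer. The table and column names are invented, and sqlite3 again stands in for whatever database a given agency actually runs.

```python
# Sketch of the inventory step: build a simple schema catalog for one
# data store. Names are hypothetical; sqlite3 is used for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE inspections (permit_id TEXT, inspected_on TEXT, passed INTEGER)"
)

def catalog_schema(conn):
    """Return {table: [(column, declared_type), ...]} for one store."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        catalog[table] = [(c[1], c[2]) for c in cols]
    return catalog

print(catalog_schema(conn))
# {'inspections': [('permit_id', 'TEXT'), ('inspected_on', 'TEXT'), ('passed', 'INTEGER')]}
```

Running a catalog like this against each source gives the virtualization team a consistent picture of what exists before any views are composed.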
Verify that logical relationships exist among the necessary data, so that combinations of disparate sources will make business sense once logically virtualized.
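One concrete way to test those logical relationships is to check, before virtualizing, whether the field meant to link two sources actually matches up. The datasets below are hypothetical; the same check applies to any pair of systems sharing a key.

```python
# Sketch of a join-key sanity check between two sources that are
# candidates for a combined virtual view. Data is invented.
permits = {"P-100", "P-101", "P-102"}       # parcel IDs in system A
assessments = {"P-100", "P-102", "P-999"}   # parcel IDs in system B

shared = permits & assessments              # records that will join
only_a = permits - assessments              # permits with no assessment
only_b = assessments - permits              # assessments with no permit

coverage = len(shared) / len(permits | assessments)
print(f"shared={len(shared)} only_a={len(only_a)} "
      f"only_b={len(only_b)} coverage={coverage:.0%}")
```

Low coverage is a warning that the combined view would not make business sense yet, and that the keys need reconciling first.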
As North Myrtle Beach marched forward with its virtualization plans, officials quickly realized that although all servers could be virtualized, not all of them should be.
“Some servers virtualize better than others depending on the workload,” said Sanders. “We found that out the hard way. There are a couple servers we have actually pulled out of the virtual environment and put back in the physical environment.”
Agencies should understand how internal government data may need to be combined with external vendor or constituent data for insight or foresight purposes, said Lucker.
Finally, Lucker said that data privacy and security requirements should be rationalized across the disparate data ecosystems being virtualized.