How well do you know the cloud? What are the roles and responsibilities of the companies that provide cloud services? What part does the state or local government IT organization play when it comes to cloud technology?
These questions and others were at the center of a panel discussion at the Florida Digital Government Summit held May 12 in Tallahassee. During the 90-minute session, Florida’s Chief Technology Officer Eric Larson and two private-sector representatives weighed in on the four most important considerations and steps that need to be made before diving headlong into cloud migration.
1. Know Your Costs and Applications
Among the top considerations for any organization looking at a move to cloud services, Larson said IT needs to be cognizant of the role applications play in the organization, the operational costs associated with their use in the new environment and the type of usage expected.
While capacity is no longer limited by the space available for physical servers in the basement of a government data center, the CTO said costs become the new limiting factor.
“There’s an expectation of infinite bandwidth, or at least no limit to bandwidth, but obviously in the cloud context there are costs associated with bandwidth, so you have to be cognizant of that.”
In addition, the utility model of cloud services may hold substantial cost savings – as long as you don’t overextend yourself. Systems that are accessed as part of daily operations and run constantly will incur higher costs.
For this reason, Larson warns that cost savings – the type generally thought to be a major benefit of cloud services – may not always be what you expect; costs will not always decrease.
“A lot of the cost savings in the cloud model comes from the utility model, from the pay-as-you-go. If you have a server that you load and run for ten years until it breaks and it runs at 100 percent all day every day, that workload in a cloud environment may be more expensive.”
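Larson's point can be sketched with back-of-the-envelope arithmetic. The figures below are hypothetical placeholders, not real vendor rates: a server bought outright and run at full load for a decade is compared against a pay-as-you-go instance at full and at partial utilization.

```python
# Rough comparison of a constantly running workload on purchased
# hardware versus pay-as-you-go cloud pricing.
# All prices are hypothetical placeholders, not real vendor rates.

HOURS_PER_YEAR = 24 * 365  # 8,760

def on_prem_cost(server_price, annual_upkeep, years):
    """Total cost of owning a server: purchase price plus
    yearly power/maintenance over its lifetime."""
    return server_price + annual_upkeep * years

def cloud_cost(hourly_rate, utilization, years):
    """Pay-as-you-go cost: billed only for the hours actually used."""
    return hourly_rate * HOURS_PER_YEAR * utilization * years

# A server that runs "at 100 percent all day every day" for ten years.
always_on = cloud_cost(hourly_rate=0.25, utilization=1.0, years=10)
owned = on_prem_cost(server_price=8_000, annual_upkeep=1_200, years=10)
print(f"cloud (always on): ${always_on:,.0f}")
print(f"owned hardware:    ${owned:,.0f}")

# The same instance used only 10 percent of the time.
bursty = cloud_cost(hourly_rate=0.25, utilization=0.1, years=10)
print(f"cloud (10% duty):  ${bursty:,.0f}")
```

With these placeholder numbers, the always-on workload costs more in the cloud than on owned hardware, while the lightly used one costs a fraction of either, which is exactly the utility-model trade-off Larson describes.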
The CTO also urges organizations to consider the workload requirements of their applications. While the cloud may require fewer resources on the front end, Larson said the opportunity to leverage the system’s bandwidth without immediately visible costs can quickly get out of hand as employees embrace the “copy, copy, copy” mentality that can come with the new, “free” space.
Examining which applications and systems are most used is also an important step in any cloud migration, according to Larson. Some applications may not see heavy, regular use, and could be a good fit for full cloud implementation, while other, more heavily used applications could be a better fit for a hybrid model where they won’t incur large utility fees.
“It’s the same thing as if I only need a truck once a year, I’d rent that and keep my car. But if it’s periodic, then it’s a straight pure cloud; move it out and keep it there.”
2. Planning and Execution
Once you have confirmed that the applications in question will actually work in a cloud environment, the move itself becomes its own challenge. From system downtime to physically moving system components, Larson said there will likely be peripheral challenges and a need to audit system compliance.
“Those activities take time, they take resources and it’s work that has to be done in the context of migrating an application out of a data center, an on-premise infrastructure, to a rented infrastructure,” said Larson.
From the CTO’s perspective, it is important to take these additional costs into consideration.
“Even if your application is perfectly able to work in the cloud, there is still the friction in getting from one place to the other,” he said. “It’s similar to moving from one house to the other, it may be straightforward, but there are still costs associated with actually executing that.”
After the move has been successfully executed, it may be necessary to re-engineer the processes that have changed, even ones as simple as accessing the new user interface. Likewise, IT leaders should consider the licensing of each application and the limitations it could pose when shifted to a new environment.
“You have to have some sort of awareness that you may be out of compliance if you just forklifted it [the application] as it was built from your local infrastructure to a different context.”
3. Know and Protect Your Data and Network
A new cloud environment will also require organizations to think about security in a different way. Though many cloud service providers boast of their own protection, the systems and applications may not have the kind of protection IT organizations rely on within a data center.
“A lot of people, when they design an application, inherited the security of the infrastructure. I’ve got locked doors, I’ve got prox cards, I’ve got mechanisms to keep bad actors away from the data,” Larson said. “But that falls apart when you go into a different environment, where you don’t inherit those protections. Now you have to bring those protections with you.”
The encryption or tokenization of data can be an effective tool to protect the data at the source, rather than relying on outside security measures. Larson reminded attendees not to forget about protecting the networks as well.
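The idea of protecting data at the source can be sketched in a few lines. This is a minimal illustration using Python's standard library, assuming a keyed-HMAC tokenization scheme where the key never leaves the local environment; a production system would use a vetted vault or tokenization service rather than this hand-rolled sketch.

```python
import hashlib
import hmac

# Minimal tokenization sketch: sensitive values are replaced with
# keyed HMAC digests before leaving the local environment, so the
# cloud copy never contains the raw data. The key stays on-premises.
# (Placeholder key for illustration only; a real deployment would
# use a managed key store or dedicated tokenization service.)

SECRET_KEY = b"kept-on-premises-never-uploaded"

def tokenize(value: str) -> str:
    """Deterministic token: equal inputs map to equal tokens,
    so lookups and joins still work on the cloud-side copy."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "ssn": "123-45-6789"}
protected = {k: tokenize(v) for k, v in record.items()}
print(protected)  # only opaque tokens appear in what gets uploaded
```

Because the tokens are deterministic, applications can still match records in the cloud environment, but reversing a token requires the key, which, per Larson's advice, travels with you rather than being inherited from the provider.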
“A lot of people forget that the Internet is a best effort medium and there are bad actors out there. You’re one bitcoin away from a distributed malware service attack,” he said.
4. Have an Exit Strategy
While most in the public sector are looking to enter into the more flexible cloud space, Larson warned, “beware the walled garden.”
For whatever reason, organizations might need to exit a cloud services agreement, which could trigger hefty penalties.
Though cloud operating costs may be lower than those of maintaining a data center facility and the equipment within it, retrieving data from a provider can cost several times more than hosting it.
“The reality is that they want to keep the data inside the system because they know services get built around data,” Larson said. “So, keep an eye on the data transfer costs.”
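The exit-cost asymmetry is easy to see with rough arithmetic. The per-gigabyte rates below are hypothetical placeholders, not real vendor pricing, but the shape of the comparison, cheap monthly storage versus a large one-time egress bill, is the "walled garden" Larson warns about.

```python
# Rough data-transfer (egress) arithmetic for an exit scenario.
# Rates are hypothetical placeholders, not real vendor pricing.

GB_PER_TB = 1024

def monthly_storage_cost(tb, rate_per_gb=0.02):
    """Ongoing cost of keeping the data hosted, billed per GB-month."""
    return tb * GB_PER_TB * rate_per_gb

def one_time_egress_cost(tb, rate_per_gb=0.09):
    """Moving everything out at once, billed per GB transferred."""
    return tb * GB_PER_TB * rate_per_gb

data_tb = 50
print(f"hosting 50 TB:  ${monthly_storage_cost(data_tb):,.0f}/month")
print(f"full exit:      ${one_time_egress_cost(data_tb):,.0f} one-time")
```

Under these placeholder rates, pulling the data out costs the equivalent of several months of hosting, which is why Larson advises keeping an eye on data transfer costs before signing.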