"What happened to the generation of power a century ago is now happening to the processing of information. Private computer systems, built and operated by individual companies, are being supplanted by services provided over a common grid -- the Internet -- by centralized data-processing plants. Computing is turning into a utility, and once again the economic equations that determine the way we work and live are being rewritten."
Many IT executives, vendors and consultants have embraced author Nicholas Carr's historical analogy. They point out that in 1907, 70 percent of industrial electrical generation in the United States was produced in-house, but by the 1920s that same percentage was generated by utility companies. Early on, owning a generating plant was a competitive necessity; later it became a liability.
They believe the same thing is happening now with computing, and that many activities that take place in the data center will soon move "into the cloud," where they can be done more economically.
White papers and trade shows are touting cloud computing as the next big thing. But it's not at all obvious what it will mean to public-sector CIOs.
A recent Forrester Research report defines cloud computing as "a pool of highly scalable and managed compute infrastructure capable of hosting end-customer applications and billed by consumption." In other words, users subscribe to computing services hosted by very large service providers, like Amazon or Google.
A May 2008 white paper by Kishore Swaminathan, Accenture's chief scientist, described some of the technology's early applications.
What distinguishes cloud technology, according to Hewlett-Packard (HP) executive Russ Daniels, is that it's highly automated, self-service and paid for incrementally. "That variable cost component is an essential driver," he said. "The cloud delivers scaling at low cost. We've been doing high-performance computing services for a long time, but the cloud allows the workload to vary from large to small as you need capacity."
Daniels, HP's chief technology officer for cloud services strategy, noted that many Web hosting companies offer virtualized Linux containers for $15 a month. "But what's new is the ability to easily add another one incrementally for a shorter period of time."
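The economics Daniels describes can be sketched in a few lines of arithmetic. In this illustrative Python sketch, all prices and workload figures are hypothetical, chosen only to show why pay-per-use billing can beat provisioning fixed capacity when demand is bursty:

```python
HOURS_PER_MONTH = 730

def fixed_cost(peak_servers, monthly_rate=15.0):
    """Flat-rate model: pay for peak capacity all month, used or not."""
    return peak_servers * monthly_rate

def on_demand_cost(hourly_demand, hourly_rate=0.10):
    """Cloud model: pay only for the server-hours actually consumed."""
    return sum(hourly_demand) * hourly_rate

# A bursty workload: 2 servers most of the month, 20 at peak for 50 hours.
demand = [2] * (HOURS_PER_MONTH - 50) + [20] * 50

print(fixed_cost(peak_servers=20))   # flat rate, provisioned for the peak: 300.0
print(on_demand_cost(demand))        # consumption-based billing: 236.0
```

Under these made-up numbers the flat-rate buyer pays for idle peak capacity, while the on-demand buyer's bill tracks actual use; with a steadier workload the comparison could easily flip, which is the cost-and-risk weighing Daniels points to.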
Big Changes Coming?
Are we rapidly approaching an era when most data center storage services will be hosted by hardware vendors? Will most software customers rent access to applications hosted externally? IT research firm Gartner recently predicted that by 2012, one in every three U.S. dollars spent on enterprise software would be a service subscription rather than a product license. Gartner also estimated that by 2011, early technology adopters would purchase 40 percent of their IT infrastructure as a service. If Gartner is right, what will such a sea change mean for government IT organizations? Will they be early adopters of this new approach or will they be reluctant to allow data to flow outside their firewalls? Another issue will be integrating that data with applications run locally.
Public-sector CIOs who see cloud computing only as a way to reduce costs will likely wait for the market to shake out to see whether the theories about its value are proven. But if they see it as creating value and as a way to improve performance, then they may choose to be early adopters, HP's Daniels said.
He suggests thinking about the cloud where there are opportunities to solve new problems and where requirements call for scaling and flexibility. "Over time, it may be the right answer to replace what you are doing now," he said, "but you have to weigh the costs and risks associated with reimplementing it."
Cloudy in Arizona
Despite reluctance to be among the first to plug into the Internet computing cloud, some government CIOs are either early adopters or have begun thinking through policy concerns related to cloud computing.
Arizona State University's (ASU) strategic plan for IT simplification involves what it calls the Concept of One: Do it once. Do it right. Use it everywhere. But the Phoenix-based university is embracing an even more basic approach that University Technology Officer Adrian Sannier has labeled the Concept of Zero: Don't do it at all. Let someone bigger do it.
In October 2006, ASU became an early adopter of the Google Apps Education suite, which provides e-mail and other applications for ASU's students through Google's Web services. Sannier is an early advocate of cloud computing.
His strategic plan notes that "when a university technology service can be replaced by one provided commercially, by a firm operating at a scale hundreds to thousands of times greater than the university can ever attain, efficiency and progress result."
"The university is just a microcosm of what's happening," Sannier explained. "We are going from departments running their own servers to consolidating that at the university level. If we can achieve efficiencies that way, then it's a clear jump to the next layer of scale, and it's a direction we will absolutely move into."
ASU, which has ambitious goals for growth, had a technology strategic plan in place when Sannier arrived three years ago. The university's IT leaders observed forces in the marketplace and replicated what worked. But even though the university was getting incrementally more efficient at IT, Sannier saw an "exponential explosion" going on in Internet-based services. It led him to believe ASU should start getting out of every technology service provision that wasn't part of its core mission and turn it over to these state-of-the-art service providers.
ASU started with e-mail. Like most universities, it provided e-mail service and a maximum of 50 MB of storage per person to 65,000 students, faculty and staff. But because of those storage limitations, ASU found it was losing students to commercial services. "The value of our service was eroding," Sannier recalled. "We decided to be an early adopter of Google Apps. It saved us $500,000 a year, and now those students get an order of magnitude better service." Students went from 50 MB of storage to 6 GB each. "Google is adding features such as Google Talk and a calendar that we couldn't have dreamed of producing, and as the service evolves, it just happens," Sannier said. "They don't have to send us software updates to install. Working on a cloud level, the pace of improvement is staggering."
A month after launching the e-mail program, Google helped ASU offer a student-customizable portal called MyASU. "A month later we had collaborative documents and spreadsheets, with enormous capabilities, and we get it all by just agreeing to use it," Sannier added.
The potential flexibility and cost savings of cloud computing have state and local governments developing teams to study how it might be applied. Carolyn Lawson, CIO of the California Public Utilities Commission (PUC), is a leader of an informal grass-roots group of California state IT executives looking at both the benefits and policy implications of Web 2.0 and cloud computing.
Before taking the CIO post in February 2008, Lawson worked in California's eServices Office, and in October 2006 she started working with the state's webmasters on ways to integrate wikis, blogs and technologies such as YouTube, MySpace and Facebook. "Talking about cloud computing was a natural evolution of these other discussions," she said. Another consideration is that California is facing difficult financial times, so anything that might save money is worth looking at.
But concerns about privacy and security may limit the state's willingness to experiment. "We started with a list of what we cannot do," Lawson said. "Anything that has your name, address, Social Security number or driver's license, we can't put that in a cloud for privacy concerns."
Although vendors tout the potential of putting data centers in the cloud, Lawson doubts the state will ever be able to do that because it needs to maintain responsibility for that data.
"I hate to be the bureaucratic stick in the mud, but the vendors don't seem to take into account that we have a responsibility to the public trust that's on a different level than the private sector," she said. "As CIO, I am personally responsible. I have to be able to sit in front of the Legislature and describe how that data is being protected."
ASU's Sannier and several vendors countered that argument by saying public-sector IT executives shouldn't automatically assume no one can protect their data as well as they can. "The analogy I use is the virtue of banks over mattresses," Sannier said. "It's absurd to believe the 25 IT people on your staff can do a better job of keeping your data safe than Google can." The real key, he added, is legal and contractual requirements as the cloud providers try to figure out what kind of indemnity they can provide. "The most powerful player on your team is going to be your lawyer to negotiate contracts and figure out liability issues," Sannier said.
Despite her security concerns, Lawson believes there are opportunities for California. "We could look at what we have across all departments that could be put in the cloud. Any data that is not personal is feasible," she said. For instance, at the PUC it could be used for public comment on proposed regulations.
Lawson's sense is that there's a battle for territory going on between traditional software vendors and Web 2.0 vendors. "We want that to settle out and to see what this is really going to help us accomplish beyond the bells and whistles."
Looking for More Flexibility
Although his organization hasn't begun any pilot projects yet, Andy Blumenthal, director of enterprise architecture and IT governance for the U.S. Coast Guard, believes cloud computing holds enormous potential. Agency IT departments have long struggled with individual lines of business that buy and maintain their own stovepiped systems. They often find that these systems don't work well with the organization's other systems, are not cost-efficient and become obsolete within a relatively short period, Blumenthal said. "With cloud computing, there is the possibility of obtaining systems more flexibly, on an as-needed basis, and then modernizing once better technology is available," he said.
He admitted that cloud computing for government agencies, especially those in defense and law enforcement, is fraught with security and privacy risks. "[But] rather than dismissing cloud computing because of the inherent risks involved," he said, "we should work to overcome them so that the government can more readily adopt it."
The Next Generation
Perhaps the earliest cloud adopters will be younger CIOs who grew up with the Internet and are most comfortable exploring its transformational capabilities.
Washington, D.C.'s chief technology officer (CTO), 33-year-old Vivek Kundra, has implemented Google Apps and IT portfolio management software Planview in a hosted environment and also plans to do cloud pilot projects with Amazon's storage business.
Kundra said these developments are modeled to match employee and consumer behavior. "We want workers to have access to information wherever they are, so they don't have to be chained to a desk to be productive," he said. Kundra is also working to replace the district's landline phones with mobile phones. "The next generation of workers will be mobile," he said. "By investing in that mobile work force, we can attract some super-smart people."
He also said he wants to drive down costs while delivering services with increasing velocity.
"My role as CTO is not to own as much software and hardware as I can," he said. "It's to make a difference in residents' lives. So the question is: Can we do it cheaper, faster, better in the cloud or with the equipment and services we are using right now? I have a feeling I know what the answer is, but we will see."