The data center — traditionally at the core of the job for most public CIOs — has been the target of serious rethinking in recent years. Aggressive consolidation across all levels of government has cut the number of public-sector data centers and made the remaining facilities more efficient. Technologies like virtualization have reduced the need for floor space, and cloud-based services are removing applications from data center racks altogether. Public CIO asked Peter Doolan, chief technologist of Oracle Public Sector, how government data centers will evolve.
Government will never go to an all-outsourced cloud model the way a commercial company can. There are two main reasons for that. First, we as an industry underestimate the power and complexity of compliance. Second, even though we often say all governments do the same thing, I don’t believe they do. Every politician has a different perspective, and IT reflects the mission of government in that constituency.
You can boil a commercial business down to probably fewer than a dozen business processes. In government you can’t. Whoever won the last election — that is the business process. Change is constant, and that is the issue for us. Cloud will be there for the common components that we can automate and place into a third party’s environment to drive costs down. However, the core of government will remain close to government. Those things will remain an instrument of government and its mission.
Correct. I believe they have been sleeping giants who have not had an active seat at the table. They’ve been much more engaged at the federal level. However, in the state and local area, where I think many of the vendors in our industry expect the growth to come from over the next five to 10 years, [they] will be surprised to find a highly agitated compliance component. The complexities of discovery are so significant and the potential risks so huge that governments will start to take a much more active view of what the CIO is doing. IT folks will need to engage far more actively from a compliance perspective than we have in the past.
If you look at a data center today, in many cases you’ll see raised floors and tons of power and cooling. Think about how inefficient IT has become over the last 35 years. There are a huge number of moving parts inside software and hardware environments because IT is consumed horizontally. Every horizontal layer in the architecture had a competitive market — storage, networking, security, databases, applications and so on. Customers saw value in driving a competitive market at each layer of horizontal architecture. It allowed them to drive point-in-time cost efficiencies. However, when you put all of that together on your data center floor, what you tend to find is an overly complex and energy-inefficient platform.
In the future, you’ll see a far more appliance-like environment, where you have units of computing and units of storage, etc., that are unique to function and to mission. If you look at Oracle Exadata, for example, its job is to do one thing really well, which is to be a database machine. It has all the superfluous components removed. If you look at the Linux operating system distribution that we have on that platform and look at the processes that fire up when you boot that thing up, all that’s present is the minimal footprint required for the purpose of serving data.
We firmly believe that if we vertically integrate the pieces and have the IP for data, operating systems, networking and applications, we can drive efficiencies. The building blocks in the data center will go from horizontal components to a much more vertically integrated world.
No, but the competition will come in a different form. I believe our industry is tilting away from a horizontal approach back to a vertical approach we had in the ’70s and early ’80s. We have kind of squeezed that horizontal business model as far as it can go.
Instead of looking at 15 vendors horizontally, you now may look at five vendors vertically. So you will say, “For this mission over here I am going to use Oracle from disk to screen.” But for another mission, it makes more sense to use another vertically integrated platform.
The impact is absolutely enormous. Our infrastructure used to be a set thing — we could count the number of desktops, ERP systems, HR systems, supply chain systems, etc., so data flow was well understood. Today’s mobile systems empower billions of people to generate vast quantities of information, much of it geo-located, which becomes very intelligent and useful.
For instance, Chicago is exploring the use of geo-tagged tweets to spot public safety and transportation issues. They have a very dense population, and if you look at the geo-location of the tweets, they are all stacked on top of each other. That means the data center is required to manage millions of tweets per day. It becomes very demanding.
The data center of the future needs to be responsive to the needs of government and citizens in an age where there is a huge amount of consumer-generated data from mobile devices. That may be a phone, your car, the thermostat in your house or the water meter. Now we can track usage or leakage; we can plan how to build smarter. We are going to have that intelligence, and we are going to have gobs of it.
Big data is not a discrete function. It is a component or a new tool within the data center. We acquired a company called DataRaker, which is an in-cloud big data provider for utilities. So we already believe that’s the way forward — discrete capabilities that provide optimization within a business process, either on-premises or in the cloud. We also released a Java platform for the sensor community, so instead of having custom machine-based languages that are difficult to use on these various sensors, we’ve created a very simple Java API [application programming interface] that works across ubiquitous sets of sensors from various vendors. That gives us the ability to take enterprise applications that may not have had a sensor input before and enable them to source and send information to and from the sensor grid seamlessly.
Think about it for a moment: The sensor world up to this point has been a very discrete and specific world. You write in low-level languages like assembly and C directly onboard these sensor platforms. For sensor vendors to make cost-effective, simple throwaway sensor platforms, we have to move to a more common infrastructure. Things like Twitter have simple APIs. The power of the Twitter development community is that you can go online, look at a Twitter API and figure out how to subscribe to it, how to search it and how to push information back into it.
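To make the idea concrete, here is a minimal sketch of what a common sensor interface in Java might look like. All names (`Sensor`, `ThermostatSensor`, `WaterMeterSensor`) are hypothetical illustrations of the pattern Doolan describes, not Oracle's actual sensor API.

```java
import java.util.List;

public class SensorSketch {
    // Hypothetical common interface: one contract instead of
    // per-vendor low-level code written in assembly or C.
    interface Sensor {
        String id();
        double read(); // current reading in the sensor's native unit
    }

    // Simulated vendor-specific sensors hidden behind the interface.
    static class ThermostatSensor implements Sensor {
        public String id() { return "thermostat-42"; }
        public double read() { return 21.5; } // degrees Celsius (simulated)
    }

    static class WaterMeterSensor implements Sensor {
        public String id() { return "meter-7"; }
        public double read() { return 103.2; } // liters (simulated)
    }

    public static void main(String[] args) {
        // An enterprise application can poll any vendor's sensor
        // through the same call, with no machine-specific code.
        List<Sensor> grid = List.of(new ThermostatSensor(), new WaterMeterSensor());
        for (Sensor s : grid) {
            System.out.println(s.id() + " -> " + s.read());
        }
    }
}
```

The design choice mirrors the Twitter-API point above: once every sensor exposes the same small surface, an application can subscribe to or query the whole grid uniformly, regardless of vendor.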
In our Oracle labs, where we do pure research and development, we are looking at photonic computing, which uses light — photons — for communication between memory and CPUs, and between CPUs themselves, so those systems can scale up to massive numbers without an equivalent requirement in power. That is something we have been working on very aggressively. And, of course, nirvana for us is quantum computing. If we get there, then the game changes fundamentally. That’s a long way out still, although “long” in our industry shouldn’t be considered very long. So we are looking at significant leaps forward.
Steve Towns is the former editor of Government Technology, and former executive editor for e.Republic Inc., publisher of GOVERNING, Government Technology, Public CIO and Emergency Management magazines. He has more than 20 years of writing and editing experience at newspapers and magazines, including more than 15 years of covering technology in the state and local government market. Steve now serves as the Deputy Chief Content Officer for e.Republic.