A new survey of more than 1,200 public-sector IT officials nationwide reveals a loss of confidence as agencies grapple with issues around IT operations, but may also document an inflection point in systems modernization.
The research report by the Ponemon Institute, Challenges & Trends in Public Sector IT Operations: United States, released Wednesday, July 19, and commissioned by operational intelligence software provider Splunk, drew answers from 1,227 respondents, 18 percent of whom worked for state and local governments and 73 percent for federal agencies.
A majority of respondents, 62 percent, said their confidence in their ability to manage data center upgrades had stayed the same or worsened during the past 12 months, with 36 percent of all respondents reporting “somewhat” or “significantly” worsened confidence. (The remaining 38 percent of respondents reported confidence levels that were “somewhat” or “significantly” better.)
The Ponemon Institute chairman said in a statement that the confidence gap “maps to other industry and government technology trends” including growing public scrutiny, resource limitations and increasing expectations of technology from end users.
“It’s a challenging time to work in government IT, but there are plenty of reasons to be hopeful for the future,” said Larry Ponemon, the chairman and founder, pointing out that public-sector IT leaders are looking to analytics, cloud and DevOps to boost IT performance and management.
Ted Ross, the CIO for the city of Los Angeles and one of Government Technology's Top 25 Doers, Dreamers & Drivers of 2017, told Government Technology that the city's tech officials keep their confidence levels up by approaching modernization “with both eyes open.”
“I think it’s very important to level-set expectations, and so we come into this with the expectation that there will be challenges, that it’s difficult to move from platform to platform, and to stay abreast and to stay modern,” Ross said.
Samir Saini, CIO of Atlanta and another 2017 Top 25 winner, agreed that change is hard for any organization, but said that from an IT standpoint, he and officials in the office of the CIO see their role as “change agents overall” because they sit at the center of the change.
“Government, I think, is particularly challenging in some aspects because generally speaking, there’s a degree of risk aversion and change aversion,” Saini told Government Technology. “Nothing’s perfect, but certainly if you have a leadership team that’s helping support change or introducing through technology projects or any projects, it goes a whole lot better.”
Kevin Davis, Splunk’s vice president of public sector, told GovTech that the survey’s larger point was “about what is coming to really shift, to help state and local governments modernize.”
Automating, leveraging machine data and learning, and breaking down silos can help agencies achieve end-to-end visibility — the lack of which was an issue identified by nearly three-quarters of respondents — Davis said.
“I do think we’re at a pivot point. I do think that confidence will continue to improve. I also think how we used to do IT, we will see a shift there and we’ll really see a convergence of IT operations and security, all of this being more blended together,” Davis said.
Respondents weren’t so certain; 56 percent said IT operations and security would not converge, and 16 percent weren’t sure. More than one-quarter of those surveyed, however, at 28 percent, said yes.
Of this 28 percent, the largest subgroup, at 29 percent, said the convergence would happen in one to three years; the second largest, at 27 percent, said it would happen in less than one year.
Ann Dunkin, the CIO of Santa Clara County, Calif., who arrived in February from her previous post as CIO of the U.S. Environmental Protection Agency, told Government Technology that she’s a “big believer” in IT security reporting to the CIO if that official is to have overall accountability for security.
The county CIO also said departments can innovate and work more quickly if they move beyond earlier eras in which IT was siloed by design, to teaming development, operations and security personnel.
“In the mainframe data [era], it kind of worked, but it doesn’t work anymore when you go fast,” Dunkin said. “We have to be working together so that … we release the code as soon as developers are done with it, because security’s been working on it the entire time.”
Elsewhere, 73 percent of respondents identified their biggest risk in managing IT operations or app development as a lack of real-time, end-to-end visibility across the enterprise. In fact, 29 percent, the largest group, said better organizationwide network visibility would be the most effective way to strengthen IT operations.
Respondents were narrowly divided on whether their agencies use the same data across the organization to solve multiple problems: 27 percent disagreed and 12 percent strongly disagreed, while 23 percent agreed and 13 percent strongly agreed. The remaining one-quarter, 25 percent, were unsure.
Asked how long staffers needed to restore systems to operation after their last three outages or interruptions, 32 percent, the largest group, said they had needed one to two days, while another 20 percent, the next largest group, said they had needed two to five days.
Additionally, 71 percent said system outages or interruptions in the past year had left employees unable to perform critical tasks; 49 percent said citizen-facing or mission-critical services had been unable to operate; and 41 percent said their organization had suffered “reputational damage” from bad press or social media.
Davis said that on average when a system was down, it required 44 hours and the work of 12.5 staff members to become operational again.
“I was alarmed or stunned by that number. And in addition to that, so, this makes these IT operators really be firefighters. They don’t sign up to be these firefighters,” he said.
Ross said needing one to two days to become operational again is not something that Los Angeles “would consider an acceptable baseline,” noting that the city strives to minimize the impact of systems outages with 40 percent less IT staff than it had in 2009.
Los Angeles, a Splunk client, uses a security information and event management (SIEM) solution to aggregate system logs from four different security operations centers into one central platform, according to Chief Information Security Officer Tim Lee; it currently handles about 1 billion security records a day.
The city also deploys Splunk for Critical Access Protection to ascertain how many attempts or attacks may target critical infrastructure or assets.
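The core idea Lee describes — funneling logs from several security operations centers into one time-ordered central feed — can be sketched in a few lines of Python. This is a minimal illustration only: the SOC names, record shapes, and one-minute intervals below are invented for the example, and nothing here reflects Los Angeles' or Splunk's actual pipeline.

```python
import heapq
from datetime import datetime, timedelta

# Hypothetical sample data: each security operations center (SOC) emits
# time-stamped log records. Names and intervals are illustrative.
def make_soc_stream(soc_name, start, count):
    """Yield (timestamp, soc_name, message) tuples at one-minute intervals."""
    for i in range(count):
        yield (start + timedelta(minutes=i), soc_name, f"event-{i}")

# Four SOC streams, each individually sorted by timestamp.
streams = [
    make_soc_stream(f"soc-{n}", datetime(2017, 7, 19, 0, n), 3)
    for n in range(4)
]

# Merge the per-SOC streams into one time-ordered feed -- the essence of
# aggregating logs into a single central platform. Tuples compare by their
# first element, so heapq.merge orders records by timestamp.
merged = list(heapq.merge(*streams))

# A trivial "analytics" pass over the central feed: count records per SOC.
counts = {}
for ts, soc, msg in merged:
    counts[soc] = counts.get(soc, 0) + 1

print(len(merged))  # 12 records total across the four SOCs
print(counts)
```

At real scale the merge-and-index step is what a SIEM platform does continuously, but the shape of the problem — many sorted sources, one queryable timeline — is the same.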
Firefighting in crises, Ross said, is not a value-add activity for Los Angeles, where officials find far more return on investment by maintaining and ruggedizing systems and trying to prevent issues before they happen.
Increased use of cloud and shared services ranked fifth of six areas that respondents felt would be most influential at driving greater IT efficiency in their agency, but nearly half (47 percent) of respondents indicated “some” or a “significant” increase in their operating budget for cloud during the next 12 months.
Atlanta adopted cloud to improve how IT provides core services and to add capability and scale, Saini said; on the business side, the city also found agility and improved supportability in adopting cloud solutions for permitting.
In Los Angeles, Ross said he considers cloud “an entirely different business model” with much efficiency to be gained because it’s not merely a “data center somewhere else,” but rather a solution that’s as scalable as an agency needs.
That said, he echoed what officials in Seattle and elsewhere have told Government Technology: that a need still exists for physically managed, on-premises data centers.
Ponemon’s report also projected DevOps spending will grow significantly during the next year, with 45 percent of respondents indicating “some” or a “significant” increase.
A vast majority (78 percent) indicated their agency has adopted or plans to adopt DevOps practices, with 74 percent and 71 percent, respectively, indicating the adoption focus will be for mission-critical and back-office systems.
Davis said he thinks agencies’ ongoing efforts to leave behind legacy systems and migrate to more modern architectures such as cloud are leading to this inflection point, particularly with respect to DevOps.
“I do feel that it really is pointing to a shift of confidence in the future as we look at the adoption of DevOps,” he said.
But Ross said DevOps can be “a pretty loaded statement,” applied to everything from a better connection to infrastructure staffers to running a private cloud.
“As a large government, I think most large governments aren’t as effective as they’d like to be on the DevOps side,” he said, noting the city has convened a DevOps working group to try to refine its strategy.
Theo Douglas is a staff writer for Government Technology. His reporting experience includes covering municipal, county and state governments, business and breaking news. He has a Bachelor's degree in Newspaper Journalism and a Master's in History, both from California State University, Long Beach.