State and Local Govs Need to Improve Data Sharing, Big Data Use

Like the feds, state and local agencies have improved how they share and use big data, but a new report shows areas that still need improvement.

Like their federal counterparts, state and local agencies have made great strides in acquiring and using big data — but they still have a long way to go, according to an industry expert.

A new report released Tuesday, Jan. 30, called Acing the Big Data Test: How Feds Can Support New Missions With New Insights, gave federal agencies a mixed report card on big data.

Alan Ford, director of presales consulting for Teradata Government Systems, a division of Teradata Corp., which co-sponsored the report with MeriTalk, told Government Technology that federal agencies — spurred onward in part by Congress and the White House — are increasingly embracing big data.

MeriTalk, which surveyed 100 federal IT managers in person and online about the feds’ ability to leverage big data and foster data sharing, found that 72 percent of respondents said they were in fact improving mission outcomes with big data.

That compares favorably to the early days of big data five or more years ago, Ford said.

“A lot of federal agencies were absolutely flummoxed by big data, what they should be collecting, why they should be collecting it, what they should be doing with it,” Ford said, noting that at the state and local level, agencies have begun to warehouse, analyze and open public data streams too — but they still haven’t reached the levels of the feds.

“I think they’re going to be lagging a little bit. There’s a lot of cascade-type learning, and actually that goes from the federal to the state agencies. They may take a year or two to adopt some of those standards and practices,” Ford added.

He referred to the so-called “cascade” effect that happens when, for example, a federal agency learns a technique or adopts a tool and that knowledge or skill later spills down to lower-level shops.

An example he gave was how, since passage of the Affordable Care Act, state and federal agencies have worked together to report fraud and illegal activity to the Centers for Medicare and Medicaid Services (CMS). Using an integrated data repository and working with the federal Department of Justice, he said, CMS was able to save $900 million by uncovering Medicaid fraud.

“Again, the cascade effect was there,” Ford said. “They’re sort of forcing the states to get good at that as well.”

He singled out the state of Michigan, which he praised for its data warehousing capability: pulling data from multiple agencies onto a single platform, then making it accessible across those agencies. “I would put Michigan up there on par with any federal agency.”

According to the report, 45 percent of feds using big data improved their operational efficiency, 44 percent executed cybersecurity analytics and 41 percent established performance tracking/goal-setting.

Whether at the federal, state or local agency levels, operational efficiency and cybersecurity analytics can go hand in hand, Ford said, as agencies automate and otherwise figure out how to run their networks more efficiently. Moving humans from mundane to strategic tasks can help.

Federal agencies surveyed agreed they had room for improvement. Just over half, or 51 percent, said they could do a better job of improving mission outcomes by enhancing data governance.

Nearly one in five said they were not supporting data collaboration across teams at all, and elsewhere in the report only 35 percent of feds described themselves as very successful at sharing data across platforms.

Ford said he’d like to see that 35 percent rise and noted the numbers reflect the feds’ siloing and traditional reluctance to share data — traits some states share.

“It’s hit or miss. Some states don’t get it yet and at the other end of the continuum like Michigan, they do get it and they do share information very well,” he said.

Again with respect to mission outcomes, 39 percent of federal agencies said they could do better at enhancing analytics for their teams.

That percentage, Ford said, sounds “about right” because federal agencies have so many new tools to use in data analysis.

“It’s almost like analysis paralysis when they go to figure out what they want to use,” he said. The rise of big data, however, has made analysis tools, including open source ones, easier both to come by and to use.

He offered a key bit of advice for state and local governments eager to move beyond collecting and warehousing data to analysis: experiment.

“With all the open source tools that are available, they’re effectively free. It’s OK to go out and try new things because it’s not very expensive to do that,” Ford said, cautioning that, like bringing home a free puppy, some open source data analytics tools come with a learning curve.
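To make that advice concrete, here is a minimal sketch of the kind of inexpensive experiment Ford describes, using the open source pandas library. The agency data sets, file names, columns and fraud-style signal are hypothetical illustrations, not examples drawn from the report.

```python
# A quick, low-cost cross-agency experiment: join two hypothetical agency
# extracts and look for a signal that neither data set shows on its own.
import pandas as pd

# Hypothetical CSV extracts from two agencies (illustrative names/columns).
health = pd.read_csv("health_claims.csv")         # provider_id, claim_total
licensing = pd.read_csv("provider_licenses.csv")  # provider_id, license_status

# Cross-pollinate the data: claims billed by providers whose licenses are
# not active. A left join keeps providers missing from the licensing file,
# whose status comes through as NaN and is flagged as well.
merged = health.merge(licensing, on="provider_id", how="left")
flagged = merged[merged["license_status"] != "active"]

print(f"{len(flagged)} claims billed by providers without an active license")
print(flagged.groupby("provider_id")["claim_total"]
      .sum().sort_values(ascending=False).head())
```

The point of the sketch is less the specific fraud signal than the cost profile: the tooling is free, the script is a few lines, and a failed experiment costs an afternoon rather than a procurement cycle.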

Asked to predict their most impactful uses of big data this year, federal agencies picked cybersecurity analytics, predictive analytics for forecasting and pattern recognition, and use cases around operational efficiency.

States could make the biggest gains by simply getting all their data in a single analytical ecosystem, Ford said.

“If they would share it and allow individual agencies to cross-pollinate their data with peer agencies,” he said, “they would be able to get so much more information out of their data.”

Theo Douglas is assistant managing editor for Industry Insider — California, and before that was a staff writer for Government Technology. His reporting experience includes covering municipal, county and state governments, business and breaking news. He has a Bachelor's degree in Newspaper Journalism and a Master's in History, both from California State University, Long Beach.