Analytics: Turning a Flood of Data into Valuable Information

At a recent event, state and local officials discussed insights to be gained from the exponential increase of data and where teams should focus their energy.

by Eyragon Eidam / August 16, 2016
Government Technology

The benefits that come from data analytics are many — it has helped reduce inmate populations, improve the reliability of emergency medical services and reduce traffic fatalities, to name just a few. Though some government agencies are slow to embrace it due to limited capital or sheer intimidation in the face of disparate systems and fragmented technologies, others have taken the proverbial bull by the horns and started improving their daily operations by way of their data.

And during the California Technology Forum held Aug. 11 in Sacramento, state and local officials delved into the insights gained from the exponential increase of data — and where teams need to focus their energy to turn this flood of data into valuable information.

Robert Schmidt, chief of the California Office of Technology Services, said that through his work with the California Department of Food & Agriculture, where he served as CIO from 2011 to 2015, he was able to contextualize the factors — volume, velocity, variety and veracity — that make data “big data.” When software paired the wealth of agricultural data with things like water sensors, weather information and water cost information, big data became a usable tool.

“From that, I understood that big data wasn’t just the amount of data we were talking about," he said. "There are many other things we need to consider.”

Schmidt said the effort to put data in the hands of internal and external application developers is a major focus at the state’s innovation lab. And as he sees it, there is more growth potential for open data initiatives and sensor technology across government.

Looking to the local level, Los Angeles Mayor Eric Garcetti’s Operations Innovation Team has been working to reform three key areas of local government: workers’ compensation, procurement and the city’s real estate asset management.

Shmel Graham, manager of the Mayor's Operations Innovation Team, heads up the real estate asset management reform efforts and told attendees that data has played an integral role in shaping the conversation around citywide change. 

“The reform that my team is driving is anchored by data," she said, "which is very critical when you’re talking about the three big issues within the city."

But the effort has not been as simple as pointing to a single problem and working to address it — disparate systems accumulated over decades, Graham said, have posed a problem for staff throughout the larger city organization.

“That leaves room for a lot of error, lack of information or inaccurate information,” she said, adding that across the more than 40 different city and county data sources, there was “no one unique identifier for every real estate asset.” In fact, staff identified as many as 55 different ways to talk about the same parcel. 
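The identifier problem Graham describes — the same parcel referenced 55 different ways across more than 40 data sources — is a classic entity-resolution task. A minimal sketch of one common approach (not the city’s actual system; the parcel labels and ID scheme below are invented for illustration) is to normalize each raw label and map every variant to one canonical identifier:

```python
# Illustrative sketch only: assign one canonical ID to a parcel that
# appears under many variant labels across different data sources.

import re

def normalize(label: str) -> str:
    """Collapse case, punctuation and whitespace so variant spellings
    of the same parcel label compare equal."""
    return re.sub(r"[^a-z0-9]", "", label.lower())

def build_canonical_index(sources: dict) -> dict:
    """Map every raw label from every source to one canonical ID.

    The first-seen normalized form gets a new canonical ID; later
    variants of the same form are linked to it."""
    canonical = {}  # normalized form -> canonical ID
    index = {}      # raw label -> canonical ID
    for source, labels in sources.items():
        for label in labels:
            key = normalize(label)
            if key not in canonical:
                canonical[key] = f"PARCEL-{len(canonical) + 1:05d}"
            index[label] = canonical[key]
    return index

# Three hypothetical sources referring to the same lot three ways:
sources = {
    "assessor":     ["123 Main St., Lot 4"],
    "public_works": ["123 MAIN ST LOT 4"],
    "planning":     ["123-main-st-lot-4"],
}
index = build_canonical_index(sources)
```

All three variant labels resolve to the same canonical ID, which is the precondition for joining records about one asset across agencies. Real deployments typically need fuzzier matching (abbreviation expansion, geocoding) than this exact-normalization sketch.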

To begin looking at the data, the team had to engage with a wide range of stakeholders to understand not only the data sets they were bringing to the table, but also the interests they sought to protect within the reform initiative framework.

Graham called the process employed in Los Angeles the collaborative leadership model. “Our team is modeling what collaborative leadership looks like,” she said. “Because we are in the mayor’s office, we have the flexibility to kind of crash everybody’s meeting, and show up and have a discussion, and champion the policies and the initiatives the mayor wants to see completed.”

This process of inclusive collaboration has also improved the collective understanding of how the various organizations within local government view the real estate assets through their own “lenses.”

Where one agency might be geared to see the inherent liability or revenue potential built into a vacant city-owned lot, public safety officials would likely bring a different perspective to the conversation. The involvement of external stakeholders, like the business community, also lends insights and helps to better map the overall course.

These varied viewpoints also helped to outline the goals and needs of the neighborhoods within which the city assets fell. Differences in income levels, demographics, businesses and community goals helped to better visualize the data in a wider context. 

Perhaps equally important is how internal stakeholder agencies receive, process and digest incoming information. And all stakeholders from all levels are important to the data analytics and policy reform process, Graham asserted.

While the executive levels are more than capable of providing “big vision” direction, they are seldom in tune with the daily processes and procedures at the base of their organizations.

“You need support from all levels, because everybody here is not about the change that I am going to recommend," she said. "But if I’ve built my political capital and I’ve shown that I have skin in the game, it gives me more time and ability to sell them on why I think it’s important because now I can speak to their need as well with the reform that I’m doing."

Despite the complexity of the overall process, Graham boils it down to three simplified steps: centralize, analyze and visualize.

“What I have found is that there is an endless amount of information," she said, "but sometimes people just need practical tips so they can take and say, ‘I can utilize that one tip to spearhead change in my organization.’ Because you can’t knock down the whole building with one swing."
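Graham’s three steps can be sketched in miniature. The records, fields and neighborhoods below are invented for illustration — the point is only the shape of the pipeline: pool records from disparate sources, derive a summary, and render it so people can act on it.

```python
# Toy walkthrough of centralize -> analyze -> visualize,
# using made-up parcel records from hypothetical sources.

from collections import Counter

# Centralize: pull records from disparate sources into one dataset.
sources = {
    "assessor": [{"parcel": "A1", "neighborhood": "Downtown", "vacant": True}],
    "planning": [{"parcel": "B7", "neighborhood": "Harbor",   "vacant": True},
                 {"parcel": "C3", "neighborhood": "Downtown", "vacant": False}],
}
centralized = [rec for records in sources.values() for rec in records]

# Analyze: count vacant parcels per neighborhood.
vacant_counts = Counter(r["neighborhood"] for r in centralized if r["vacant"])

# Visualize: render the counts as a simple text bar chart.
for neighborhood, n in vacant_counts.most_common():
    print(f"{neighborhood:10} {'#' * n}")
```

In practice the “visualize” step would feed a dashboard or GIS layer rather than a text chart, but each step stays the same practical unit of work — the kind of “one tip” Graham says people can take back to spearhead change.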

Eyragon Eidam, Assistant News Editor

Eyragon Eidam is the assistant news editor for Government Technology magazine, and covers legislation, social media and public safety. He can be reached at eeidam@erepublic.com.