As government agencies tackle the management of “Big Data,” industry executives suggest developing teams of specialists.
SACRAMENTO, Calif. — Big Data is a big idea making its way into more state and local governments. But what exactly is Big Data, and how can it lead to more efficient and proactive management?
IT industry executives shared their thoughts, advice and guidelines on the topic during a GTC West Conference panel discussion on Tuesday, May 29.
Big Data is closely related to data analytics. Generosa Litton, director of Big Data marketing at EMC, said Big Data are data sets “that are so large, they break traditional IT infrastructures.” This can be anything from gigabytes to terabytes of data. If it’s more data than an organization can structurally handle, it constitutes Big Data.
But it’s not just about the amount of information stored. It’s also about the medium. Big Data can be anything from social media to video files, according to Litton.
These giant and varied stores of information can be used for a wide range of purposes, the conference panelists said. David Steier, director of information management for Deloitte Consulting, said Big Data can be used by the public sector for analytics. Take, for example, enforcement of child support payments. Data can help health and human services agencies predict when or if a parent will pay child support, or assess risk of non-payment. Agencies also can leverage Big Data to proactively intervene in order to avoid delinquent payments. Deloitte has helped five states build child support enforcement systems, Steier said.
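The kind of risk assessment Steier describes can be pictured, in greatly simplified form, as a scoring model that flags high-risk cases for proactive outreach. The features, weights and threshold below are hypothetical illustrations, not any agency’s or Deloitte’s actual method.

```python
# Hypothetical sketch of a non-payment risk score for child support cases.
# All features, weights and the threshold are illustrative assumptions.

def risk_score(case):
    """Return a 0-1 estimate of the risk that a parent will miss a payment."""
    score = 0.0
    if case["months_delinquent"] > 0:
        score += 0.4
    if case["employment_status"] == "unemployed":
        score += 0.3
    if case["prior_missed_payments"] >= 3:
        score += 0.2
    if case["address_changes_last_year"] > 1:
        score += 0.1
    return min(score, 1.0)

def flag_for_intervention(cases, threshold=0.5):
    """Cases above the threshold get proactive outreach before payments lapse."""
    return [c["case_id"] for c in cases if risk_score(c) >= threshold]

cases = [
    {"case_id": "A-100", "months_delinquent": 2, "employment_status": "unemployed",
     "prior_missed_payments": 4, "address_changes_last_year": 0},
    {"case_id": "A-101", "months_delinquent": 0, "employment_status": "employed",
     "prior_missed_payments": 0, "address_changes_last_year": 0},
]
print(flag_for_intervention(cases))  # only A-100 is flagged for outreach
```

A production system would learn such weights from historical payment data rather than hand-code them, but the workflow is the same: score every case, then intervene on the riskiest before payments become delinquent.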
Outside of health and human services, Big Data also can be a useful tool for reducing fraud, waste and abuse. For instance, Big Data can assist with tax claim processing by ensuring that people are who they claim to be.
Litton said government agencies that are ready to move into Big Data should ask five questions before getting started:
1. At what point can Big Data provide what you really need?
2. Do you have data availability?
3. Do you have the right architecture?
4. Do you have analytics tools and a platform?
5. Do you have people who can manage the data and get value from it?
On this last point, the panelists said that handling Big Data requires a special skill set, one that many government organizations may be ill-equipped to staff. A “data scientist” is best suited for the role, Litton said. A data scientist is an individual with programming, technical, communication and mathematical skills, along with knowledge of how websites function.
Preferably, data scientists should not work alone. Steier said that since no one person has all of the skills required to manage Big Data, agencies should assemble a team.
Noelle Sio, a data scientist for Greenplum, a division of EMC, said a data scientist does more than create and analyze pie charts and spreadsheets. There’s simply more data involved than what can be represented through those forms. Data scientists help make information useful, consumable and analyzable.
Sio said agencies must determine “organizational alignment” by deciding where their data science teams would reside within the agency. Big Data conceivably pertains to both the IT and business sides of an organization.
Then there’s the matter of IT architecture. In order to handle a large volume of data, organizations should move to an expandable architecture that utilizes nodes, providing the ability to ingest mass quantities of data all at once.
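The scale-out idea can be illustrated with a toy sketch: incoming records are partitioned by key across a fixed set of nodes, and the nodes ingest their shards concurrently. The node count and hash-based partitioning here are assumptions for illustration, not a specific product’s design.

```python
# Toy illustration of scale-out ingestion: records are hashed across "nodes"
# so each node handles only its slice of the incoming stream, in parallel.
from concurrent.futures import ThreadPoolExecutor
from collections import defaultdict

NUM_NODES = 4  # illustrative; a real cluster grows by adding nodes

def partition(records):
    """Assign each record to a node by hashing its key."""
    shards = defaultdict(list)
    for rec in records:
        shards[hash(rec["id"]) % NUM_NODES].append(rec)
    return shards

def ingest_on_node(node_id, shard):
    """Stand-in for a node storing its shard; returns how much it ingested."""
    return node_id, len(shard)

records = [{"id": f"rec-{i}", "payload": i} for i in range(10_000)]
shards = partition(records)
with ThreadPoolExecutor(max_workers=NUM_NODES) as pool:
    results = dict(pool.map(lambda kv: ingest_on_node(*kv), shards.items()))

assert sum(results.values()) == len(records)  # every record landed on a node
```

Because no node sees the whole stream, adding nodes raises total ingest capacity, which is the property the panelists are pointing to.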
To tackle Big Data, Steier said government agencies must build out a road map by doing the following:
1. identify opportunity;
2. assess current capabilities;
3. identify and define use cases;
4. implement pilots and prototypes; and
5. adopt the strategic pilots in production.