Experts from the public and private sectors provide tips on issues of data privacy, education, culture, hiring and the many obstacles impeding development.
There is typically much talk -- but few guidelines -- when it comes to data usage. Industry experts, however, are working to change that -- they gathered virtually for the 2015 Data Innovation Day, hosted by the Washington, D.C.-based Center for Data Innovation, to discuss where things stand and forecast upcoming trends.
The event questioned data insiders about savvy methodologies in analytics. The panelists, drawn from both the private and public sectors, were tasked with elaborating on a full list of issues, ranging from data privacy and education to culture, hiring and the many obstacles impeding development.
Now in its third year, the event was moderated by Center Executive Director Daniel Castro via a Google Hangout, with discussion tracked through the Twitter hashtag #datainnovation.
“The purpose of Data Innovation Day is to engage in a worldwide celebration about the benefits of data,” Castro said. “And also to participate in conversations that can lead to a better understanding of the opportunities to leverage data positively for social and economic progress.”
From beginning to end, experts provided an ample set of data analytics success stories. Michael Flowers, New York’s former chief analytics officer, stood as a testament to the city’s pioneering projects — such as work in fire risk assessment and 911 response. Stefan Heeke, executive director of the data science nonprofit SumAll.org, spoke of a group project that can predict homelessness in New York City four months in advance by assessing eviction filings. And Michael Wilde, representing Splunk, spoke of New York Air Brake, a company that harnessed data to save the rail industry millions in fuel costs simply by calculating the optimal timing for train braking.
Behind these successes are, of course, methodologies, which is where the discussion turned.
Chris Surdak, author of the book Data Crush, said many pitfalls in analytics stem from the fact that companies and organizations are constantly using the same kind of data, asking the same questions, yet expecting to find new insights.
“The people who are starting to succeed in big data are taking data sets they’ve never looked at before, combining them, and then asking questions that weren’t askable before,” Surdak said.
In government, where finances don’t always permit in-house analytics teams, Heeke said agencies shouldn’t assume they must take on the whole task of finding insights themselves; instead, government can act as a curator, opening data up and letting third parties do the rest. Officials, he said, can use tools such as open data portals to publish public data and allow civic hackers, entrepreneurs and volunteer data science groups to decipher potential takeaways.
“I think there is a lot of energy out there in terms of good talent,” Heeke said.
Another option he suggested for government might be to invest in affordable tech — such as from startups — for smaller projects.
Tackling the issue of data compatibility across jurisdictions and departments, the OpenGov Foundation’s Executive Director Seamus Kraft said partnerships are critical, since each entity is unique in the way it defines and produces data.
“Each city does it their own way, each codifier does it in their own way,” Kraft said. “Bridging that cultural gap is another massive thing that we need to do. If there is one major thing we can say about this information, it is [for government] to focus on one data set and [standardize] it as nationally as possible.”
To hear more from the group, view the Center for Data Innovation’s videos of each panel.