Over the last decade, state and local governments have invested in open data initiatives. Have those efforts done what we thought they should?
Since about 2010, state and local governments have poured time, money and resources into publishing data sets and building portals to facilitate access to this wealth of information.
Are these open data initiatives succeeding?
Well, that depends a lot on your definition of success. If transparency alone is the goal, success might lie in the sheer number of available data sets. But some say open data should deliver more: economic benefits, enhanced public trust, better government performance.
These are issues worth considering as the first decade of open data enters the home stretch. Answers begin to emerge when we consider the goals of open data: What is it actually supposed to be doing?
The goalposts for open data have shifted. Early efforts simply looked to publish the information and make it reasonably accessible. More recently, chief data officers (CDOs) and others have looked to raise the bar.
“At first the thinking was: We’ll find a use for this later on, but being transparent is inherently the right thing to do,” said Kansas City, Mo., CDO Eric Roche.
In fact, some early implementers found that transparency, while laudable, was not a sufficient end in itself.
“Going back five or six years, everybody made a mad rush and they made all these wonderful data sets available, but what I saw was that nobody was coming to them,” said Cincinnati’s City Manager Harry Black. He compared notes with other civic leaders, many of whom had reached the same conclusion. “We counted the clicks and we saw that these portals just weren’t being used.”
Today, data stewards increasingly are looking beyond data availability, making data usage a key policy goal.
Among external users, this means that data should spark civic engagement, a free flow of new ideas, said Julia Richman, chief innovation and analytics officer (and interim CIO) for Boulder, Colo. Internally she’s looking to open data to drive new efficiencies. “Cross-departmental collaboration in government is tricky, tracking down the finance person or the information from public works. Those things take time, so there is real value in opening up information across departments,” she said.
Some say this pragmatic mindset is helping to drive government adoption of open data.
“We proposed that a robust open data platform could be used by agencies to proactively publish often requested information and mandated publications,” said Liz Rowe, New Jersey’s chief data officer. When she highlighted these practical outcomes, agencies “recognized the benefits — ultimately reducing open records requests and freeing up resources for mission execution — and were more interested in participating in the effort.”
Texas has had a data portal since 2014 — so the transparency is there — but Statewide Data Coordinator Ed Kelly is pushing to get more agencies involved and more people tapping that data. “For us, the goal now is to make it more flexible and easier for the constituent to get information,” he said.
As cities’ open data ambitions evolve, advocates admit that it can be difficult to align disparate and sometimes-vague open data policies and goals with anything resembling ROI.
The Cost of Open Data
Open data advocates are not enthusiastic about notions of “return on investment.”
“I’m not against the term ‘ROI,’ but it is not dissimilar to saying: What’s the ROI on a mile of paved highway? It serves the public good, but how do you measure the positive impact of that?” said Jon Walton, CIO of San Mateo County, Calif.
Walton actually knows the cost of his open data effort: With a $200,000 Socrata subscription plus staff time, it’s between $400,000 and $500,000 a year. Others find it difficult to be so precise.
Many in government say that without its own line item in the budget, open data typically has to piggyback onto other expenses. That makes it hard to do the math.
“We spent nothing on the platform, because we utilized tools we are already using,” said Gilbert, Ariz., Data and Technology Analyst Derek Konofalski. “For the open data portal, we run the back end through our existing Esri ArcGIS hub, which already hosts a lot of town data. In terms of manpower, we looked at data sets that people were already using, that we already had access to. We don’t have the money to do data in a way that adds to people’s workloads, so we try to manage it around things we are already doing.”
Texas Statewide Data Coordinator Ed Kelly said he’s tried to quantify the cost of data, but it isn’t a straightforward calculation. “We’ve tried to identify the cost of posting a data set, for example. If we could identify what the agencies’ time is, we might get a better feel for the overall cost of the portal,” he said. “I know some of the costs at the administrative level — what my time is, what we pay a contractor to manage the portal. But we don’t know the overall cost.” So how should governments manage the finances around open data when hard numbers are so elusive?
New Jersey’s State Chief Data Officer Liz Rowe said the answer will become more apparent as data becomes more institutionalized as a government function. “Unless you have clearly defined business outcomes and the data strategy to achieve them, it will be very difficult to quantify the costs of your data efforts,” she said. One strategy would be to create specific “cost and profit centers” in support of data.
Last year New Jersey codified the role of the CDO, a move Rowe said should help the state to get a firmer grip on the financials around open data. “The creation of an enterprise organization for enterprise data management and governance provides an opportunity for greater visibility to the ‘data spend’ across the executive branch,” she said. “Once that is better understood, we can work on setting budget expectations.”
“The most common goals are lofty and conceptual: ‘We are seeking to increase transparency, innovation, collaboration, accountability,’ all sorts of democratic words. But it often leaves a lot of open questions about what success means,” said Stephen Larrick, open cities director at the Sunlight Foundation.
How, then, to measure success?
Government data leaders generally pursue a couple of tracks in their efforts to gauge the effectiveness of open data. There’s the objective approach, counting data sets and tallying clicks on open data portals. Then there’s the subjective angle, the use of customer satisfaction surveys and other soft metrics to determine whether people are engaging with the data on offer.
San Mateo County, Calif., uses open data to build public trust, to open a conversation between the county and its citizens. With that in mind, CIO Jon Walton said he leans heavily on user feedback forms and satisfaction surveys to track success.
With customer satisfaction charting steadily at over 90 percent, “we know people like it and we know people are using it. We know they are satisfied with how it is working technically,” he said.
It’s possible to take a quantitative approach to satisfaction. Cincinnati’s system tracks data requests from initiation to completion, following up with a short survey. “Based on that survey we have been able to see our customer satisfaction rate increase 7 percent in two years, which is significant,” Black said.
In Kansas City, Roche charts success by measuring how long it takes to fill “sunshine requests” under the state’s open records law. With open data, “we can close out a request almost immediately, and they are not waiting days or weeks to get that data,” he said. That’s a measurable outcome.
Some dig deeper still. In Boulder, Richman’s open data site includes a page tracking open data engagement. The site charts progress toward specific goals, such as having 75 percent of city departments represented in the catalog. (At present, Boulder is nearing 65 percent.) With a goal to publish 100 data sets, the city had about 70 released at the close of 2017.
It makes sense that data people would be eager to apply data methodologies to their own efforts.
In Gilbert, Ariz., a city of 237,000 people, Data and Technology Analyst Derek Konofalski tracks a wide range of metrics. “We know how many times people have downloaded the raw sets. We know how many times people have viewed the data sets. We provide APIs and we know how many times people used them in apps to connect to the data sets. And we also know how many times people have looked at Alex, which is our data avatar, our tour guide to the portal,” he said.
Even in its early days, Gilbert’s data effort is racking up big numbers. An open data portal drew more than 3,500 hits in the first two weeks after its December 2017 launch, and the Alex avatar got 2,500 views in that same time.
In Texas, Kelly measures three key statistics in tracking the progress of the state’s open data effort. “We look at the total number of data sets that are out there, what we are offering up. We count visit clicks, and lastly, we look at how many downloads are actually being done off the open data portal,” he said.
How do those numbers look? At the close of 2017, a dozen entities were publishing through the Texas portal, including 10 state agencies plus two transportation authorities, with 370 data sets available. The site had drawn 119,923 clicks and 91,299 downloads since launching in 2014.
Counting data sets and tallying clicks helps to give Kelly and others a sense not just of how much information agencies are opening up, but also whether that data is being put into play.
How best to put that information to work? Increasingly, open data has evolved into a joint enterprise. No longer purely the province of technologists, open government is a shared responsibility, with data gurus and IT chiefs working hand in glove to develop the policies, procedures and protocols that ensure information is available, useful and usable.
With the emergence of the chief data officer, chief data scientist and similar titles at the state and local level, open data has enjoyed a kind of professionalization in recent years. Once it was the purview of IT leaders who had the tech chops to cobble together siloed data sets and make them visible to citizens. Increasingly, data specialists take the lead in driving strategy, with the CIO continuing to fulfill the vital back-end tasks of aggregation and publication.
In Cincinnati, City Manager Black has instituted an entire management layer around data since 2014, when he first established the Office of Performance and Data Analytics. He has created positions for a chief data officer, a chief performance officer, and a data and performance analyst.
All work closely with the chief technology officer, who puts their ideas into action. “The CTO is another teammate. The CTO and his office, they provide technical assistance as it relates to some elements of software and hardware, but they don’t drive the process,” Black said.
This division of labor is not uncommon today. In Gilbert, for example, Konofalski drives data under the umbrella of a chief digital officer, while IT operates on a parallel track. “My job is to go to the departments and do the culture change,” he said. “Then IT manages the databases. They do the technical work to collect that data, to make sure it has been pulled properly from the source systems.”
In other municipalities, the work of opening up government data is at once more collaborative and more complex. Rather than a binary system of data gurus who chart the course and IT leaders who execute, these cities see a richer interplay.
In Kansas City, for instance, the chief data officer has the support not just of a chief information officer but also a chief innovation officer. The latter is able to coordinate with both agencies and vendors to track down needed data sets, something CDO Roche couldn’t easily do on his own. “Smart cities is a full-time job, so I am really grateful to have those people working on that. They can run the smart cities projects and I can go to them when I need to get the data out,” he said.
In Boulder, the mix is even more eclectic.
“Everything we do from an innovation platform standpoint, we do with resources in multiple departments collaborating on these processes. My open data team has a couple of IT people, a couple of GIS people, others who know what questions to ask about data,” Richman said. “Then we have data stewards in the various agencies — financial analysts or business analysts or administrative support people — folks who are domain experts who interact often and deeply with the data.”
Richman needs open data proponents who can go beyond the traditional bounds of either IT or data expertise. “They also need to be really effective storytellers,” she said. “It’s not just about accessing data, but about the ability to link that to strategy, to share that with our communities. It’s about putting a narrative around charts and graphs.”
Cities tell that story best, she said, when their data teams reach across agencies and embrace subject matter experts along with data visionaries and IT leaders.
Having looked at the policies driving open data, having considered the metrics of success and explored the talent mix, we’ll conclude by asking: Is it working? Is the open data movement meeting the expectations of the founding generation, those who have toiled for the better part of a decade to make government information more visible and easier to use?
What one may safely say, in a global sense, is that in the places where it is working, it’s working very well indeed.
The Sunlight Foundation reports that more than 100 cities, counties and states now have open data policies. Its U.S. City Open Data Census charts availability of data in almost 20 key areas — crime, zoning, budget and so on. The top performers are stars: some three dozen cities with solid wins in a large number of categories.
Then it trickles off fast. Culver City, Calif., which ranks No. 43 among the most open cities, reports that it has data available in just eight categories. Ferndale, Mich., makes the list at No. 65 and yet has data fully available in just three categories, with partial availability in four more.
“Overall, a lot of cities are meeting the threshold of availability. There is a lot of information that you can find and use. But the larger, better-resourced cities score higher. The cities that have been doing open data longer score higher,” Larrick said.
“Cities at the top have made a lot of progress. For the cities that are newer at this, many are not even meeting the minimum threshold: Is basic information about government operations available online? For the vast majority of mid-sized and smaller cities in this country, the answer is still no,” he said. “We can get distracted by the cities that make the most headlines, but there are a lot of cities that are still just starting out.”
Larrick is only talking about availability, and while that is a key metric, it’s not the only measure of success. Data managers say that in their fondest dreams, they’ll do more than count data sets and track clicks. They are looking for metrics that connect open data to social outcomes.
Are babies healthier because of open data? Are streets safer? That’s the holy grail of open data metrics, and data chiefs from cities large and small agree that we’re not there yet. In these still-early days of open data, there’s no algorithm that will cleanly and clearly describe the impact of open data on society at large.
“There needs to be more conversation at the national level about how to measure success, especially on the public side,” Roche said. “Maybe there needs to be a standard set of metrics across cities to let us benchmark the use of open data. It’s something we all need to be exploring.”