Stronger Together: Could Data Standards Help Build Better Transportation Systems?

The future will have a lot of data, and a number of transportation experts in both private industry and the public sector are pushing for a vast group of stakeholders to collaborate on setting up standards that make that data as useful as possible.

Visions of the post-Jetsons transportation future — where cars drive themselves, buses talk to trains and people get from A to B using several means of travel — keep coming back to one thing: data. In so many of those visions, data is the keystone connecting machine to machine, available resources to the people who need them, and information to decision-making.

And increasingly, the data is coming from disparate sources. Transportation officials track vehicle speeds with sensors on highways while transit agencies pool real-time location information; mobile apps pour location and routing data into private servers while smart parking schemes deliver information about which spots are available at the destination. And as sensors and connectivity proliferate throughout the landscape, there will only be more data available from even more sources.

That means the data is often collected and formatted in different ways, making it hard for anyone to pull it all together and get the maximum value out of it.

“Every car company is really speaking a very, very different language when it comes to sensors and data in the car,” said Dietmar Rabel, head of autonomous driving product management at mapping company HERE.

And this is why a number of transportation experts in both private industry and the public sector are now pushing for a vast group of stakeholders — from the tech startups of Silicon Valley to the municipal transit agencies that dot the country to the federal centers in Washington, D.C. — to work together on setting up standards that make the data as useful as possible.

In the technology industry, where proprietary information can often form the foundation of a company’s business model, it can be tough to convince people that it’s in their best interest to work with other people to make data standardized, according to Rabel.

And yet, he said, it is in everybody’s best interest.

“In one word,” he said, “it’s scale.”

That is, all the value that comes from ubiquitous transportation data only comes when that data is truly ubiquitous. Applications and services are more valuable if they’re bigger, because that means they have more information to improve accuracy and applicability.

Take HERE, for example. Among other services, the company is working on 3-D maps that are updated constantly to support automated driving.

“In a highly automated driving world, you want to use an up-to-date map because the roads, in reality, change frequently,” Rabel said. “It’s not just the aesthetics of the roads that are changing, but you also have incidents.”

But those updates rely on data coming in from a lot of vehicles, and realistically, those are going to come from many different companies that are all pitted against each other for sales.

“All those vehicles would send that data back to the cloud, to us,” he said. “We would aggregate all that data, make updates in the background and then send that data back to the vehicles.”

What’s more, the data collected from vehicles can be used to catch problems on the road early. Vehicles with sensors can monitor the quality of painted lines and track how they deteriorate over time. Or, if one car marks the location of a speed sign and another car doesn’t see the sign in the same spot half an hour later, the system can tell officials both that the sign has gone missing and the timeframe in which it happened.
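To make that comparison concrete, here is a minimal sketch of the kind of cross-vehicle check described above. The names used (SignObservation, detect_missing_sign, "sign-42") are hypothetical and for illustration only; they are not part of HERE’s systems or the SENSORIS specification.

```python
# A hypothetical illustration of inferring a missing sign from two vehicles'
# observations of the same mapped location at different times.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple


@dataclass
class SignObservation:
    """One vehicle's report about a known sign location."""
    location_id: str      # identifier for the mapped sign position
    seen: bool            # did the vehicle's sensors detect the sign?
    timestamp: datetime   # when the vehicle passed the location


def detect_missing_sign(earlier: SignObservation,
                        later: SignObservation) -> Optional[Tuple[datetime, datetime]]:
    """If an earlier vehicle saw the sign and a later one did not, return the
    window in which the sign apparently disappeared; otherwise return None."""
    if (earlier.location_id == later.location_id
            and earlier.seen and not later.seen
            and earlier.timestamp < later.timestamp):
        return earlier.timestamp, later.timestamp
    return None


# Example: a sign seen at 9:00 a.m. is gone by 9:30 a.m., so officials get
# both the fact that it is missing and the half-hour window in question.
first = SignObservation("sign-42", True, datetime(2016, 5, 1, 9, 0))
second = SignObservation("sign-42", False, datetime(2016, 5, 1, 9, 30))
print(detect_missing_sign(first, second))
```

The comparison only works, of course, if both vehicles describe the sign location and their observations in the same format, which is exactly the problem a sensor-data standard aims to solve.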

To make all that data easy to pool together, HERE has submitted a proposed standardization scheme called SENSORIS to a public-private collaborative organization called ERTICO. The standards cover vehicle sensors and have support from some major companies including Daimler, LG Electronics and TomTom.

Greg Slater is trying to get government to the table too. As chair of the Subcommittee on Data for the American Association of State Highway and Transportation Officials, he has a pretty good idea of just how large a task that is. States have passed laws that treat certain kinds of data differently, so information that might be easily accessible in one state might be veiled behind a banner of privacy in another. Or one state might simply collect the same data differently than another. And getting people to massage that data to the point where it can be put side-by-side with another state’s data takes time, resources and funding.

So any time a state wants to make a change to the way it collects data, it takes a lot more than a snap of the fingers.

“If you put a process out, it would take several years to adjust,” Slater said.

But it is undeniably valuable, he said. To illustrate, he pointed to a transportation agency developing a highway plan. The agency needs information about congestion, safety and infrastructure. Congestion alone means gathering data on where people live, where they work, what their travel needs are and more. And that can, in turn, guide decisions on where the agency needs to be spending time and money and what kinds of projects will truly be effective at meeting its needs.

But in order to effectively plan for the long term, the agency needs even more data than that.

“That gives us a good picture on where congestion is today, but also where we’re going to see growing conditions based on land use challenges,” he said.

There are some types of government data that are already widely standardized — namely, public transit data. Michal Migurski, vice president of product for Mapzen and a former chief technology officer at Code for America, said that transit agencies have widely adopted a standard called General Transit Feed Specification (GTFS).

“In terms of transit data, that particular world is substantially further ahead than other data [trends],” Migurski said.
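Part of what makes GTFS so useful is its simplicity: a feed is a zip archive of plain CSV text files with standardized column names, so the same few lines of code can read any agency’s data. The sketch below assumes a downloaded feed saved as "gtfs_feed.zip"; that filename is just a placeholder, not part of the specification.

```python
# A minimal sketch of reading stops from a GTFS feed. stops.txt is a required
# file in the GTFS specification, and stop_id, stop_name, stop_lat and
# stop_lon are required columns, so this works regardless of which transit
# agency published the feed.
import csv
import io
import zipfile

with zipfile.ZipFile("gtfs_feed.zip") as feed:
    with feed.open("stops.txt") as f:
        reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8-sig"))
        for row in reader:
            print(row["stop_id"], row["stop_name"], row["stop_lat"], row["stop_lon"])
```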

Mapzen has taken advantage of that to launch an open-source, turn-by-turn routing service that puts transit alongside other modes of travel. But that means much more than just transit data. It means gathering city boundary lines, parcel information, place names, addresses and more. And that is not guaranteed to be available in easily digested, standardized formats.

As daunting as it might seem to get so many people to agree to do things the same way, Migurski said that it would likely make things easier for everybody in the long run. GTFS, he said, is a good illustration of why.

“I think there’s a very clear benefit to essentially commoditizing this kind of data, which is what standardization gets you,” he said. “In the case of GTFS, for example, if every one of the transit providers who provided that information was kind of internally building their own processes and then working [with] companies like Google and Apple on how to share that stuff, it would be a huge amount of wasted effort.”

So how can government, industry, coders, business people and all stakeholders in transportation get started?

According to Slater, the ball is already rolling. “To start talking about it is really where it starts,” he said.

Ben Miller is the associate editor of data and business for Government Technology. His reporting experience includes breaking news, business, community features and technical subjects. He holds a bachelor’s degree in journalism from the Reynolds School of Journalism at the University of Nevada, Reno, and lives in Sacramento, Calif.