(TNS) -- This summer, autonomous cars collided with reality.
All those sensors and cameras and spinning cylinders of laser beams don't know as much as we thought they did. At least not yet.
Sure, Tesla Motors is taking the brunt of the public relations and political damage. To be fair, we don't yet know whether and to what extent the upstart electric vehicle maker's Autopilot feature contributed to a series of recent crashes, including one fatality.
But Karl Benz and Henry Ford had setbacks too. Billy Durant, who assembled what would become the empire of General Motors, was ousted from the company twice before winding up running a bowling alley.
Don't expect development of driver-assist technologies to stop, or the search for the holy grail of full autonomy to end.
But still, it's time to ask what we can learn from recent events. Here are a few lessons to draw from Tesla's recent stumbles:
“Everyone in the autonomous vehicle industry understands consumer over-trust is a significant problem," said John Maddox, CEO of the American Center for Mobility, the proposed proving ground for connected and automated vehicles under development on 335 acres at Willow Run, where the U.S. produced B-24 bombers during World War II. "No one has an answer to that problem yet, to be perfectly blunt."
At the heart of the recent Tesla crashes is the question of how engaged or disengaged the driver was when things went wrong. Josh Brown, the 40-year-old who died in Williston, Fla., on May 7 when his Tesla Model S ran into a tractor-trailer turning left in front of him, was reportedly playing a Harry Potter DVD when the crash occurred.
"It’s a human nature that when we’re bored with a given task we find something else to occupy our minds. There are ways the industry can look at to maintain driver interest," Maddox said.
Later this summer, the National Highway Traffic Safety Administration is expected to release its federal guidelines for autonomous vehicles.
Kirk Steudle, director of the Michigan Department of Transportation, will also play an important role. Michigan wants to be a leader in the research and development of self-driving vehicles. That goal must be balanced against MDOT's mission to maximize traffic safety.
"These technologies will be significantly better when the vehicles can 'talk' to each other and to signals in traffic lights and elsewhere," Steudle said.
MDOT is working with General Motors, Ford and the University of Michigan to deploy vehicle-to-infrastructure (V2I) communication technology on more than 120 miles of metro Detroit roadway along I-96 and I-696, stretching from U.S.-23 near Brighton to I-94 in St. Clair Shores. The project will eventually cover portions of U.S.-23 and I-75.
The technology uses a Wi-Fi derivative called dedicated short-range communication, or DSRC.
In simple terms, DSRC can retime stoplights or alert drivers to adjust their speed to smooth traffic flow during rush hour.
To be effective in the long term, projects like this depend on a much higher penetration of technology within the vehicle population. In the shorter term, automakers, suppliers and technology companies are making important strides on their own.
Yet with every step forward there is a growing awareness of autonomous driving's limits.
Steudle said the owner of a Tesla with Autopilot offered him a test drive recently.
"We came up to a busy intersection with a traffic light and there were one or two cars ahead of me, and the car stopped on its own," Steudle said. "But if there had been no other car ahead of us I would have had to pump the brakes myself at the traffic light."
Michigan has 122,000 miles of public roads, Steudle said. Half of those will never have a paint line on them because they are either dirt roads or subdivision streets. Paint lines are part of the language that sensors, cameras and LiDAR (short for Light Detection and Ranging) interpret to guide the vehicle.
Self-driving advocates contend that when this technology is perfected we can save 35,200 lives a year — the number of auto-related traffic deaths last year in the U.S. — because many of us are bad drivers.
"The argument goes that human error causes 90% of all accidents, so let’s get rid of the human driver," said Maddox of the American Center for Mobility. "But that overlooks that we drive 100 million miles between fatal accidents. We overestimate the shortcomings of human drivers."
Early next year General Motors will launch its Super Cruise feature on the Cadillac CT6. As described by GM executives, it is similar to Tesla's Autopilot, but the company declined to comment on what it has learned from Tesla's recent incidents.
Google also declined to comment. Most of the pod-like self-driving cars Google tests on public roads in northern California operate at speeds of under 35 miles per hour. Often a Google employee is on board and knows how to respond if something out of the ordinary happens.
Consumers also have much to learn. Tesla's owners manuals make clear that drivers must be prepared to take the wheel if Autopilot acts unpredictably or can't process a traffic situation. But Tesla owners, like the customers of every other automaker, don't all read their manuals.
"An informed consumer is the best consumer," MDOT's Steudle said. "They really need to understand the limitations of what the technology can do currently. There’s a lot of information out there about where it can take us. But it’s not there yet."
©2016 the Detroit Free Press. Distributed by Tribune Content Agency, LLC.