
Tesla’s Autopilot Is Raising Safety Questions Among Car Owners

The Tesla Autopilot system involved in a California fatality has a history of unexpected swerving that has prompted concern among owners.

(TNS) — Alex Holmes was heading home in his Tesla Model X one June night last year, using Autopilot to let the vehicle steer itself on Interstate 5 in the Los Angeles area, when it abruptly veered to the right.

He snapped the Model X back into its lane, and his startled wife asked whether he or the SUV was responsible for the sudden shift. He told her it wasn’t him.

“I’ve got to send that to Tesla,” Holmes said.

Since Tesla first introduced Autopilot in 2015, drivers have sometimes noted unexplained swerves and lane changes, discussing them with each other online or, in several cases, complaining about them to the federal government. Holmes posted a dashcam video of his June 2017 incident in a forum for Tesla owners.

Autopilot’s performance has come under increased scrutiny since a Tesla Model X, steering itself, slammed into a freeway median on Highway 101 in Mountain View in March, killing its owner.

Relatives of the man, Apple engineer Walter Huang, said he had complained of the vehicle veering toward that same median before. The family is now preparing to sue Tesla, arguing that Autopilot is defective. Tesla has argued that the crash could not have happened if Huang had been paying attention, as Tesla advises all drivers using Autopilot to do. The National Transportation Safety Board is investigating.

When Tesla CEO Elon Musk unveiled Autopilot in October 2015, he emphasized that the system would constantly learn from all the cars running it, pooling their data and gaining experience, much like a human driver would.

“The whole Tesla fleet operates as a network — when one car learns something, they all learn it,” he told reporters at the time. “People should see the car actually improve with each passing week, even without a software upgrade, because the data is continually improving.”

Autopilot has now gone through two major iterations, with an expanded hardware suite introduced in October 2016, and numerous software upgrades. The most recent upgrade, issued in mid-March, has received positive reviews for improved performance.

And yet, the issue of unexpected veering, drifting and lane-shifting has popped up repeatedly. The most recent driver complaint to the National Highway Traffic Safety Administration concerning Autopilot, filed April 22 by an unidentified Los Angeles driver, said the driver’s Tesla had abruptly veered toward a bridge abutment in August, and added that similar problems had happened multiple times.

“The way Autopilot suddenly swerves out of the lane should not be considered normal,” the driver wrote.

After the recent Mountain View crash, another driver posted video online showing his Tesla, on Autopilot, veering toward the same median where Huang died. Teslas are a common sight on that stretch of Highway 101, at the junction of Highway 85, so cars running on Autopilot should have plenty of experience with it.

“I see the same things: The car makes the same mistake in the same place, and it’s not learning,” said Dave Sullivan, manager of product analysis at automotive market research firm AutoPacific. “The car doesn’t necessarily know what’s right from what’s wrong yet.”

The company’s Autopilot program has also been hit by executive turnover. Last month, Tesla’s top Autopilot executive, Jim Keller, left the automaker for a position at Intel, becoming the third senior Autopilot leader to depart in the last two years.

Tesla insists that Autopilot makes its cars substantially safer, although the company does warn buyers at length about the system’s limitations.

During a conference call last week to discuss the Palo Alto company’s latest quarterly earnings, Musk bristled at recent news coverage questioning Autopilot’s safety, saying such reports mislead drivers and could even put lives at risk. Tesla, he said, would start publishing quarterly reports on Autopilot safety, although he did not discuss what data would be included.

“It’s really incredibly irresponsible of any journalist with integrity to write an article that would lead people to believe that Tesla autonomy is less safe, because then people might actually turn it off and then die,” he said.

He also emphasized a point that researchers and entrepreneurs in the fast-growing field of self-driving cars have often made: Even fully autonomous vehicles will not completely eliminate accidents. Autopilot is not full autonomy, although it has been viewed as an important step in that direction.

“The thing that’s tricky with autonomous vehicles is that autonomy doesn’t reduce the accident rate or fatality rate to zero,” Musk said. “It improves it substantially.”

While Autopilot can handle many of the tasks of freeway driving — staying in its lane, changing lanes, adapting speed to surrounding traffic — Tesla insists that drivers pay attention while using it.

The system monitors whether drivers have their hands on the wheel and warns them when they don’t. That hasn’t stopped some drivers from abusing Autopilot, including a British man who lost his license after he engaged Autopilot on the M1 highway and climbed into the passenger seat to relax.

Tesla owner manuals list, in detail, situations in which Autopilot may not work properly.

Faded lane markings can throw it off, as can direct sunlight interfering with the car’s cameras. Sharp curves in the road, hills, construction zones, toll booths, seams in the pavement and weather conditions that cut visibility all can hamper Autopilot’s performance.

While a sudden swerve or lane shift may seem unexplained to the driver, it may represent the car reacting to one or more of those conditions.

In the dashcam video posted by Holmes, the car starts to shift lanes as it approaches an overpass that appears to be undergoing renovation or construction. Broad lines of what may be dust cut from his lane to the right, and Autopilot, which relies heavily on cameras to interpret its surroundings, may have been following them.

“It was, in fact, in a construction zone, and the reason for the car swerving was valid,” said Holmes, a former tight end for the Miami Dolphins and now co-CEO of Wild Hair Media.

While most companies developing self-driving technology use lidar — a laser version of radar — to scan a car’s surroundings, Tesla doesn’t. Musk argues that expensive lidar systems won’t be needed once the systems that interpret visual information from cameras become sufficiently refined. Autopilot relies on a suite of cameras, radar and ultrasonic sensors.

On another drive, Holmes’ Model X, on Autopilot, nearly clipped another vehicle. And yet Holmes, who describes himself as “the biggest Tesla fan ever,” uses Autopilot almost every day. Despite those two incidents, he’s come to trust it, though he still pays attention.

“Driving in L.A. is tough in traffic,” he said. “It’s a lot easier to not worry about it.”

©2018 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.