Case Closed: Tesla Autopilot in the Clear After Investigation of Fatal Crash

The six-month investigation into the nation’s first self-driving vehicle fatality found no problems with the design or performance of Autopilot.

TNS — Federal regulators said Thursday they have closed an investigation of a fatal Tesla Autopilot crash, finding no defects and declining to issue a recall notice.

The six-month investigation by the National Highway Traffic Safety Administration into the nation’s first self-driving vehicle fatality found no problems with the design or performance of Autopilot. The system, based on radar, camera and machine learning technology, allows Tesla vehicles to sense potential crashes, stay within lanes and adjust speeds automatically.

The probe did not uncover a “safety-related defect trend,” the report said, adding that “further examination of this issue does not appear to be warranted.” But the report said that drivers need to pay better attention when using self-driving technology.

Tesla, based in Palo Alto, said in a brief statement that it appreciated the thoroughness of the investigation.

“The safety of our customers comes first,” the company said.

Tesla CEO Elon Musk highlighted one finding of the report — crash rates dropped about 40 percent in Teslas after the system was installed.

The federal agency specifically looked at the design and performance of Tesla’s automatic emergency braking system, the interface between the driver and the vehicle, data from other Tesla crashes and changes the company has made to Autopilot.

NHTSA spokesman Bryan Thomas said the agency favored the changes Tesla made to Autopilot after the crash, including more aggressive warnings when drivers take their hands off the wheel and a feature that disengages the system when drivers repeatedly ignore the warnings.

“It certainly addressed the issues we were evaluating,” Thomas said.

On May 7, 2016, a Tesla owner in a Model S was driving with Autopilot engaged on a divided highway near Williston, Fla. A tractor-trailer made a legal left-hand turn across the highway in front of the Tesla. Neither the driver nor Autopilot braked, and the car crashed broadside into the truck. Josh Brown, a 40-year-old Navy veteran and entrepreneur from Ohio, was killed.

The safety administration examined data from multiple Tesla crashes in which the airbags deployed. It also evaluated the performance of the automatic emergency braking system, which is designed to stop or slow a vehicle before impact.

The agency found that Tesla’s emergency braking worked as promised — although like all emergency braking systems on the market, its primary function is to limit rear-end collisions. It was not designed to brake for vehicles, like the tractor-trailer in Florida, crossing in front of a Tesla, the report said.

The report noted Tesla instructed drivers to “always keep your hands on the wheel” while using Autopilot, but suggested that the company could be more specific about the system’s limitations in its owner’s manual.

The crash led Tesla to re-examine and overhaul several parts of the driver assistance package. In September, Musk announced a new version of Autopilot, adjusting software to make the system more dependent on radar sensors. New Teslas rolling out of the factory also have greater computer processing power to handle data from their suite of sensors.

The company has been sending out software upgrades this month.

Karl Brauer, publisher of Kelley Blue Book and Autotrader, said the crash is an example of a driver putting too much faith into Tesla’s sensors and computing power.

“It almost certainly won’t be the last incident on this journey, but people need to remember one thing — there are currently no fully autonomous cars available for public purchase or use,” Brauer said.

The findings did not free Tesla from criticism. Consumer Watchdog, an advocacy group critical of Tesla’s marketing of Autopilot, said in a statement Musk should be held responsible.

“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the technology and Tesla’s aggressive marketing,” the group said. “The very name ‘Autopilot’ creates the impression that a Tesla can drive itself. It can’t.”

The agency acknowledged criticisms about the system’s name, but said the marketing issue was outside the scope of its investigation.

©2017 The Mercury News (San Jose, Calif.) Distributed by Tribune Content Agency, LLC.