How Tesla Handled a Braking Bug in Public Self-Driving Test

(TNS) — Tesla pushed out a new version of the experimental software suite it calls Full Self-Driving to approved drivers Oct. 23 through an "over the air" update.

The next morning, Tesla learned the update had altered cars' behavior in a way the company's engineers hadn't intended.

In a recall report to federal safety regulators Friday, Tesla described the problem this way: The company discovered a software glitch that "can produce negative object velocity detections when other vehicles are present."

In everyday English, Tesla's automatic braking system was engaging for no apparent reason, causing cars to rapidly decelerate as they traveled down the highway, putting them at risk of being rear-ended. Forward collision warning chimes were ringing too, even though there was no impending collision to warn about.
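For illustration only (this is not Tesla's code), here is a minimal sketch of how a spurious negative relative-velocity reading could push a simple time-to-collision check into phantom braking; the Track structure, threshold and numbers below are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Hypothetical perception output for one vehicle ahead."""
    distance_m: float        # gap to the detected vehicle, in meters
    rel_velocity_mps: float  # negative = closing, positive = pulling away

BRAKE_TTC_S = 2.0  # assumed time-to-collision threshold for emergency braking

def should_emergency_brake(track: Track) -> bool:
    """Fire automatic emergency braking when time-to-collision is short.

    A glitch that reports a spurious negative velocity for a vehicle
    that is actually pulling away makes the car "see" a fast-closing
    object and brake hard for no reason: the phantom-braking failure
    described in the recall report.
    """
    if track.rel_velocity_mps >= 0:
        return False  # gap is steady or growing; nothing to brake for
    time_to_collision = track.distance_m / -track.rel_velocity_mps
    return time_to_collision < BRAKE_TTC_S

# A vehicle 8 m ahead, pulling away at 5 m/s: no braking warranted.
assert not should_emergency_brake(Track(8.0, +5.0))
# Same scene after a sign-flip glitch: TTC looks like 1.6 s, so AEB fires.
assert should_emergency_brake(Track(8.0, -5.0))
```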

The company said no crashes or injuries were reported due to the glitch. Still, the incident demonstrates how complicated these systems are: Even a small change to one part of the system can affect how something as vital but seemingly simple as automatic braking will function. The incident raises the question of whether there is a safe way to test self-driving vehicles at mass scale on public roads, as Tesla has been doing.

Tesla's response to the glitch raises its own concerns. While its engineers worked to fix the software, they turned off automatic braking and forward collision warning for the software testers over the weekend, the company said. According to numerous messages posted on Twitter, owners were not informed that those safety systems had been temporarily deactivated, finding out only by scrolling through the menu on their cars' dashboard screens.

By Oct. 25, Tesla had knocked out a software fix and zapped it to 11,704 drivers enrolled in the Full Self-Driving program.

Tesla, which has disbanded its media relations department, could not be reached for comment.

Tesla's Full Self-Driving program is the company's attempt to develop a driverless car. It is markedly different from the company's driver-assist system, Autopilot, which was introduced in 2015 and automates cruise control, steering and lane changing.

Autopilot is the subject of a federal safety investigation into why a dozen Teslas have crashed into police cars and other emergency vehicles parked by the roadside. Those crashes resulted in 17 injuries and one death.

Investigators are trying to learn why Tesla's automatic emergency braking systems apparently did not engage to prevent or mitigate such crashes. The National Highway Traffic Safety Administration is looking into which software elements are in charge of automatic braking when a car is on Autopilot and a crash is imminent. Experts have raised the possibility that Tesla is suppressing automatic braking when Autopilot is on, possibly to avoid phantom braking of the sort drivers experienced after the Full Self-Driving update.
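To make that hypothesis concrete, here is a deliberately naive sketch, purely illustrative and not Tesla's code, of an arbitration rule that would let the active driving mode veto the independent braking system:

```python
def brake_command(aeb_wants_brake: bool, autopilot_engaged: bool,
                  suppress_aeb_under_autopilot: bool) -> bool:
    """Hypothetical arbitration between AEB and the active driving mode.

    If the suppression flag is set (the behavior experts speculate
    about), emergency braking never fires while Autopilot is driving,
    even when the AEB stack has detected an imminent collision.
    """
    if autopilot_engaged and suppress_aeb_under_autopilot:
        return False
    return aeb_wants_brake
```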

Tesla's automatic emergency braking issues are beginning to creep into the courts. A trial due to begin in 2022 seeks damages from Tesla for an Autopilot-related crash that killed Apple engineer Walter Huang in 2018 on a Mountain View, Calif., freeway. According to claims made in the lawsuit, Huang's Tesla Model X "was designed, built and introduced into the stream of commerce without having been equipped with an effective automatic emergency braking system."

According to a crash report from the National Transportation Safety Board, not only did the automatic brake system fail to engage, but the car also sped up into a concrete abutment.

Tesla has billed Full Self-Driving as the culmination of its push to create a car that can navigate itself to any destination with no input from a human driver. Tesla Chief Executive Elon Musk has promised for years that driverless Teslas are imminent.

The regulations on deploying such technology on public roadways are spotty around the country. There is no federal law — legislation on driverless technology has been gummed up in Congress for years, with no action expected soon.

And though California requires companies testing driverless technology on public roads to report even minor crashes and system failures to the state Department of Motor Vehicles, Tesla doesn't do so, DMV records show. Companies such as Argo AI, Waymo, Cruise, Zoox and Motional comply with the DMV's regulations; Tesla does not. The Times has asked repeatedly over several months to speak with department Director Steve Gordon about why Tesla gets a pass, but each time he has been deemed unavailable.

In May, the DMV announced a review of Tesla's marketing practices around Full Self-Driving. The department has declined to discuss the matter beyond saying, as it did Tuesday, that the review continues.

Like Tesla, other companies developing autonomous driving systems use human drivers to supervise public road testing. But where they employ trained drivers, Tesla uses its customers instead.

Tesla charges customers $10,000 for access to periodic iterations of its Full Self-Driving Capability software. The company says it qualifies beta-test drivers by monitoring their driving and assigning each a safety score, but it has not explained how that scoring system was developed.
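Tesla has not published the formula, but a telemetry-driven scoring system of this kind could, in principle, look like the following sketch; every event name, weight and cutoff here is an assumption made up for illustration:

```python
# Illustrative assumptions only; Tesla has not disclosed its actual
# scoring formula. Penalties are applied per event per 1,000 miles.
EVENT_PENALTIES = {
    "hard_braking": 2.0,
    "aggressive_turning": 1.5,
    "unsafe_following": 3.0,
    "forward_collision_warning": 4.0,
}

def safety_score(event_rates: dict) -> float:
    """Start at 100 and subtract weighted event rates (per 1,000 miles)."""
    score = 100.0
    for event, rate_per_1000_miles in event_rates.items():
        score -= EVENT_PENALTIES.get(event, 0.0) * rate_per_1000_miles
    return max(score, 0.0)

def qualifies_for_beta(event_rates: dict, cutoff: float = 90.0) -> bool:
    """Admit a driver to the test program if the score clears the cutoff."""
    return safety_score(event_rates) >= cutoff

# A driver with occasional hard braking and tailgating still qualifies:
print(qualifies_for_beta({"hard_braking": 1.2, "unsafe_following": 0.5}))  # True
```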

YouTube is loaded with dozens of videos showing Tesla beta-test software piloting cars into oncoming traffic or other dangerous situations. In one video, as a beta-test car tried to cross the road into the path of another vehicle, a passenger said, "it almost killed us," and the driver exclaimed, "FSD, it tried to murder us."

As such videos appeared, Tesla began requiring beta testers to sign nondisclosure agreements. But NHTSA sent Tesla a stern letter calling the agreements "unacceptable," and the video posting resumed. The agency said it relies on customer feedback to monitor automobile safety.

The recall that resulted from the automatic braking bug, delivered as over-the-air software with no visit to the dealer necessary, marks the beginning of a major change in how recalls are handled. Tesla has taken the lead in automotive software delivery, and other carmakers are trying to catch up.

Voluntary recalls are a rare event at Tesla. In September, NHTSA castigated the company for skipping the recall process when, in the wake of the agency's emergency vehicle crash investigation, it delivered software intended to help Autopilot recognize flashing emergency lights. NHTSA told the company that safety fixes count as a recall whether they are delivered over the air or require a dealer visit. (Over-the-air software is delivered directly to a car through cell tower connections or Wi-Fi.) As other manufacturers adopt over-the-air updates, such recalls will become more common.
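As background on the mechanism, over-the-air updates generally follow a fetch-verify-apply pattern; the endpoint, file path and function names in this sketch are hypothetical and not Tesla's delivery system:

```python
import hashlib
import urllib.request

# Hypothetical update server; real deployments add authentication,
# staged rollouts and cryptographic signing on top of this pattern.
UPDATE_URL = "https://updates.example.com/firmware/latest"

def fetch_update() -> bytes:
    """Download the latest firmware package over the car's data link."""
    with urllib.request.urlopen(UPDATE_URL) as response:
        return response.read()

def verify(payload: bytes, expected_sha256: str) -> bool:
    """Refuse any package whose hash doesn't match the published manifest."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

def stage_update(payload: bytes) -> None:
    """Write the package to a staging area. Real systems flash a spare
    partition and keep the old image so a bad update can be rolled back."""
    with open("/tmp/staged_update.bin", "wb") as f:
        f.write(payload)
```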

Federal safety regulators are only beginning to grasp the profound changes that robot technology and its animating software are bringing about. NHTSA recently named Duke University human-machine interaction expert Missy Cummings as senior advisor for safety.

"The problem is that NHTSA has historically focused on mechanical systems," said Junko Yoshida, editor in chief of the Ojo-Yoshida Report, a new publication that "examines the intended and unintended consequences of innovation."

In an article titled "When Teslas Crash, 'Uh-Oh' Is Not Enough," she writes, "Tesla's behavior to date makes a clear case for consequential changes in NHTSA's regulatory oversight of advanced driver-assistance systems."

Asked for comment, NHTSA said the agency "will continue its conversations with Tesla to ensure that any safety defect is promptly acknowledged and addressed according to the National Traffic and Motor Vehicle Safety Act."

© 2021 Los Angeles Times. Distributed by Tribune Content Agency, LLC.