Tesla Autopilot Failed in Fatal Crash, Lawsuit Says

A preliminary report from the National Transportation Safety Board, one of the federal agencies investigating the crash, says the victim engaged the Autopilot system about 10 seconds before the crash.

by Brooke Baitinger, Sun Sentinel / August 2, 2019

(TNS) — The family of a Tesla driver killed in a crash with a semi-truck is suing the electric carmaker over its Autopilot feature.

Four months ago, 50-year-old Jeremy Banner was killed in west Delray Beach, Fla., when a tractor-trailer pulled out in front of his bright red Tesla Model 3.

A preliminary report from the National Transportation Safety Board, one of the federal agencies investigating the crash, said Banner had engaged the Autopilot system about 10 seconds before the crash. From less than eight seconds before the collision until impact, his hands were not detected on the steering wheel — a condition that is supposed to trigger warnings from the car’s automated system.

The lawsuit alleges the system failed.

Banner’s family announced Thursday that they’re filing a wrongful death lawsuit against Tesla, the semi driver and the trucking company.

“Technology like self-driving cars is great to have, but we can’t put that before safety,” said Trey Lytal, the family’s attorney.

Lytal planned a news conference at his office in West Palm Beach, Fla., on Thursday afternoon, where he would read a statement from Banner’s family, including his three kids and his wife, Kim.

The crash occurred on State Road 7, near Pero Family Farms just north of Atlantic Avenue on March 1.

According to the preliminary report, Banner was driving southbound on State Road 7 when the tractor-trailer pulled out in front of the Tesla, attempting to cross the southbound lanes and turn left to go north. Surveillance videos and forward-facing video from the Tesla show the truck slowed and blocked the Tesla’s path, the report said.

The Tesla drove beneath the trailer at 68 mph, and the roof was sheared off, killing Banner.

The crash is eerily similar to another one involving a Tesla in 2016 near Gainesville, Fla. Joshua Brown, 40, of Canton, Ohio, was traveling in a Tesla Model S on a divided highway and using the Autopilot system when he was killed.

Neither Brown nor the car braked for a tractor-trailer, which had turned left in front of the Tesla and was crossing its path. Brown’s Tesla also went beneath the trailer and its roof was torn off.

The NTSB, in a 2017 report, wrote that design limitations of the Autopilot system played a major role in the fatality — the first known death involving a vehicle operating on a highway under semi-autonomous control.

The agency said that Tesla told Model S owners that Autopilot should be used only on limited-access highways, primarily interstates. The report said that despite upgrades to the system, Tesla did not incorporate protections against use of the system on other types of roads.

The NTSB found that the Model S cameras and radar weren’t capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions.

Tesla has said that Autopilot and automatic emergency braking are driver-assist systems, and that drivers are told in the owner’s manual that they must monitor the road and be ready to take control.

In January 2017, the National Highway Traffic Safety Administration, which is the second federal agency investigating the crash that killed Banner, ended an investigation into the Brown crash, finding that Tesla’s Autopilot system had no safety defects.

But the agency warned automakers and drivers not to treat the semi-autonomous driving systems as if they could drive themselves. Semi-autonomous systems vary in capabilities, and Tesla’s system can keep a car centered in its lane, brake to stop from hitting things and change lanes when activated by the driver.

This is a developing story. Check back for updates.

©2019 the Sun Sentinel (Fort Lauderdale, Fla.). Distributed by Tribune Content Agency, LLC.
