Transportation

Who's at Fault in Self-Driving Car Crashes?

Intel and Mobileye believe a mathematical formula could help determine who caused an accident between human- and machine-driven vehicles.

by Carolyn Said, San Francisco Chronicle / October 18, 2017

(TNS) -- When a self-driving car crashes into a human-driven one, whose fault is it?

Answering that question will be a challenge for an emerging industry that desperately needs to win the public's trust and to clarify liability for automakers and insurers.

Intel and Mobileye on Tuesday proposed a mathematical formula that provides specific parameters for that assessment — and seeks to ensure that an accident is never the robot car's fault. Intel wants to engage with the nascent self-driving car industry and standards bodies to create an open system for determining fault in autonomous-car accidents that can be used in all circumstances.

“As machines start causing collisions, there’s a lot of risk that consumers could turn against them and all the benefits and investments could really be damaged,” said Dan Galves, senior vice president at Mobileye, an Israeli company making sensors for robot cars. Intel bought Mobileye for $15.3 billion in March. “It would help a lot to know that there are predetermined rules for clarifying fault.”

Mobileye CEO Amnon Shashua, who is now an Intel senior vice president, presented his concept, also published in an academic paper, at the World Knowledge Forum in Seoul on Tuesday.

“The ability to assign fault is the key,” Shashua said, according to prepared text of his speech. “Just like the best human drivers in the world, self-driving cars cannot avoid accidents due to actions beyond their control. But the most responsible, aware and cautious driver is very unlikely to cause an accident of his or her own fault, particularly if they had 360-degree vision and lightning-fast reaction times like autonomous vehicles will.”

His system involves programming autonomous cars to always observe certain safety parameters, such as the exact safe distance at which to follow other cars on the highway, calculated from sensor readings of their speed, information on their braking power and other factors.
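To make that concrete, here is a minimal sketch of a worst-case safe-following-distance calculation in the spirit of Shashua's published model: the rear car is assumed to accelerate at its maximum rate for one reaction time, then brake at its minimum guaranteed rate, while the car ahead brakes as hard as physically possible. The function name and the numeric defaults below are illustrative assumptions, not values from the paper or from Mobileye's software.

```python
def safe_following_distance(v_rear, v_front, rho=0.5,
                            a_max_accel=3.0, a_min_brake=4.0,
                            a_max_brake=8.0):
    """Illustrative minimum safe gap in meters between a rear car
    (speed v_rear, m/s) and a front car (speed v_front, m/s).

    Worst-case reasoning: during reaction time rho the rear car may
    accelerate at a_max_accel; afterward it brakes at only a_min_brake
    while the front car brakes at its maximum rate a_max_brake.
    All parameter defaults are hypothetical placeholders.
    """
    v_rear_after = v_rear + rho * a_max_accel  # speed after reaction time
    gap = (v_rear * rho                          # distance covered reacting
           + 0.5 * a_max_accel * rho ** 2        # extra distance from accelerating
           + v_rear_after ** 2 / (2 * a_min_brake)   # rear car's stopping distance
           - v_front ** 2 / (2 * a_max_brake))       # front car's stopping distance
    return max(gap, 0.0)  # a negative result means any gap is safe
```

For example, two cars both traveling at 30 m/s (about 67 mph) would, under these placeholder parameters, require a gap of roughly 80 meters; a stationary rear car needs no gap at all.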

“A major positive of agreeing on these definitions of responsibility and fault would be to never put an autonomous vehicle at risk of violating any of these rules,” Galves said. “It creates a safety layer where, no matter what, the vehicle will never issue a command to the brakes or steering that would put it at risk of causing a collision that would be its fault.”
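The "safety layer" Galves describes can be pictured as a final gate between the planner and the actuators: any command that would shrink the gap below the safe threshold is vetoed before it reaches the brakes or steering. The sketch below is a hypothetical illustration of that idea, not Mobileye's implementation; the function and parameter names are invented for the example.

```python
def filter_command(proposed_accel, gap, required_gap):
    """Hypothetical last-line safety gate (illustrative only).

    If the current gap to the lead vehicle is below the precomputed
    safe-following threshold, veto any command that would add speed,
    allowing only braking or coasting; otherwise pass the planner's
    command through unchanged. Accelerations are in m/s^2.
    """
    if gap < required_gap and proposed_accel > 0.0:
        return 0.0  # block acceleration; braking commands still pass
    return proposed_accel
```

A planner upstream might propose `filter_command(2.0, gap=10.0, required_gap=20.0)`; the gate would return `0.0`, so the vehicle never issues a command that could make a resulting collision its fault.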

While Intel hasn't yet reached out to the industry at large, it has gotten a positive response from its customers, Galves said, naming core customers as BMW, Nissan and Audi.

“This is a discussion that needs to happen now,” he said. “We’re not that far away from beginning to deploy these vehicles.”

©2017 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.