Placing Blame: Liability Questions Loom Over Autonomous Vehicles

How states choose to regulate insurance and liability for self-driving cars may impact how quickly consumers adopt them, but many questions remain around how and when to set these new policies.

Even the most advanced self-driving cars may get into crashes, and how state regulators assign liability for these accidents could have a significant impact on how consumers engage with the technology and how it develops, according to some thinkers in the space.

“Your state regulates you as the driver, but in [autonomous vehicles], who’s the driver? Is it the robot? Is it the automaker?” said Devin Gladden, AAA National manager for federal affairs and technology and public purpose fellow at the Harvard Kennedy School’s Belfer Center for Science and International Affairs, during a recent virtual event hosted by the Center.

“There are still lingering questions in the regulatory space that will need to be answered [and] that will support whether or not we see greater adoption and use of these vehicles,” Gladden said. “The insurance industry has been tiptoeing around this. State regulators…have been tiptoeing around this.”

With a U.S. senator duo recently urging the federal government to do more to develop, build and deploy autonomous vehicles (AV), questions around how and when some of these unresolved regulatory decisions will be made could get renewed attention.

The self-driving automobile industry is a “global market opportunity worth an estimated $8 trillion,” wrote Sens. Gary Peters, D-Mich., and John Thune, R-S.D., in a draft amendment they presented in April. That amendment takes its own stab at regulatory questions and would change how the safety of such vehicles is evaluated.


States are charged with regulating car insurance and liability, and 40 of them — plus Washington, D.C. — had enacted some form of legislation or executive order regarding AVs as of 2020, according to the National Conference of State Legislatures.

Some of this legislation seems intended to ensure rules are in place before use of the vehicles becomes widespread. A 2020 New Jersey bill would make owners liable for their self-driving cars, for example, and a 2021 Wyoming bill would assign liability to “the dispatching entity, manufacturer, vehicle owner, or any combination thereof... if any or a combination of those persons or entities are at fault.”

Gary Biller, president of the nonprofit membership group National Motorists Association, told Government Technology that assignment of liability is a key concern for drivers, as is a related issue: uncertainty over motorists’ responsibilities in partially automated cars that may ask drivers to retake control at times. The question of how liability and responsibility are attributed among drivers and auto manufacturers or software providers in these situations is complicated by the latter’s greater financial resources, he noted.

“Auto manufacturers and system suppliers for self-driving cars have a lot more resources than an individual motorist does — so a lot more resources to defend themselves and protect their rights,” Biller said.


Fully self-driving car use by the general public may still be a ways off, and many states have focused insurance legislation on ensuring manufacturers currently testing their technologies are covered.

State regulators should consider making manufacturers exclusively responsible for AV liabilities during initial tests on public roads, said Gladden. His research found that concerns over liability are one of four major obstacles to public acceptance of AVs. Putting corporate funds on the line could demonstrate companies’ faith in their technologies, fostering greater resident trust, he said.

“I believe that if industry is willing to put all of these investment dollars into these vehicles to bring them to life, I think that means they should take on the liability, at least for a short amount of time,” Gladden said.

States that require manufacturers to obtain insurance before testing often still leave questions unanswered, however. Many do stipulate a minimum coverage amount, with Connecticut law pegging it at $5 million, for example. But laws generally don’t outline other specifics, seemingly leaving testers with a lot of leeway over what kind of insurance plans they get, said Asaf Lubin, associate professor of law at the Indiana University Maurer School of Law and fellow at the university’s Center for Applied Cybersecurity Research, in a recent Government Technology interview.

“We know nothing in the academic insurance policy world about what these policies say,” Lubin said. “There are about 80 testers across the U.S. … getting policies tailored for them.”


Stakeholders have traditionally debated the appropriate time for enacting detailed regulations on emerging technology.

Some argue for moving quickly to issue clear policies about liability and other matters, while others suggest holding off until the inventions are in greater use. Proponents of the latter say this allows policymakers to see how users end up interacting with the technologies, giving the regulators more insights and helping them avoid what may prove to be overly restrictive approaches, Lubin explained.

“It’s been commonly debated between those who say, ‘We need to regulate now and introduce a new liability framework,’ and those who say, ‘Wait, let’s see how this evolves [and] how the public reacts to it and allow for opportunities for regulatory sandboxing and experimentation prior to creating some kind of pre-emptive framework,’” Lubin said.

Such decisions could impact both driver trust and developer strategies, with Lubin noting it is generally accepted that “any kind of regulation we introduce today could scare off developers, shift development outside the United States or deny us the benefits that could come from a robust AI industry.”

With risks to both approaches, state regulators will need to weigh pros and cons carefully.


In Lubin’s view, policymakers should start public discussion over the kinds of insurance policies used to cover vehicle testing but can generally postpone much of the liability debate for now. Instead, he said, they should seize the moment to regulate other aspects of AV use that could be more pressing — such as data privacy and cybersecurity considerations. Stepping in now lets policymakers head off worrisome practices before they become ingrained into corporate revenue models and consumer habits, he said.

“Think about being able to regulate social media platforms in 2005 before they fully burst into the scene,” Lubin said. “Everything we criticize today of the surveillance capitalism that has been created by these companies based on an ecosystem of targeted ads and so on could have been regulated ahead of time to create an ecosystem that is perhaps consumer-protective.”

AVs collect data about their surroundings to navigate their environments, and they may communicate back to smart city infrastructure, giving city planners insights for better traffic management. With all this data in motion, regulators need to determine how information collected and transmitted by the cars and the supporting systems should be handled.

“Who owns the data generated by the vehicle?” Lubin said. “Is that data owned by the vehicle owner? Is that data owned by the manufacturer? Can they share it with third parties? Will that data be encrypted [and] anonymized?”
Jule Pattison-Gordon is a senior staff writer for Government Technology. She previously wrote for PYMNTS and The Bay State Banner, and holds a B.A. in creative writing from Carnegie Mellon. She’s based outside Boston.