As more states pass laws authorizing testing of autonomous vehicles, key legal questions need to be answered.
A growing number of states are taking up legislation that addresses self-driving vehicles in an effort to make it easier for researchers to explore the technology.
But most of the legislation deals with how states can facilitate testing -- as opposed to consumer use of the vehicles -- largely because truly automated vehicles aren't yet available on the market. When that day comes, states will face a host of thorny questions. Historically, states have regulated drivers, and the feds have regulated vehicles. But what happens when the vehicle is the driver?
The National Highway Traffic Safety Administration has said issues such as licensing, driver training, and how the vehicles will be operated are best handled by the states. "Some states are going to act much faster than the feds can or will," says Bryant Walker Smith, a fellow at Stanford Law School's Center for Internet and Society, who has written extensively about the legal implications of automated vehicles.
Nevada became the first state to authorize the operation of autonomous vehicles on its roadways in 2011; since then California, Florida and the District of Columbia have all passed laws that open the door for self-driving vehicle testing on public roads as well. Those places -- and other states considering bills -- have largely deferred many of the questions of how to regulate autonomous vehicles to their motor vehicle departments.
"Legislatures usually break things when they create rules themselves," says New Jersey state Sen. Tom Kean, Jr., who introduced a bill this summer that calls on the Motor Vehicle Commission to develop regulations for self-driving vehicles. "Five years from now, or 25 years from now, we won't know what technology exists."
But not even those departments have all the answers yet. Florida's new automated vehicle law, for example, requires that the state's highway patrol submit a report to the legislature next year, but it's unclear whether they'll have learned much since the law was enacted in 2012; nobody has actually registered an autonomous vehicle in the Sunshine State yet. "At this point, we're not certain what we're going to report," says Julie Baker, chief of the Bureau of Issuance Oversight within the Department of Highway Safety & Motor Vehicles.
NHTSA expects its first stage of self-driving vehicle research to be completed within the next four years. Meanwhile, some industry officials say by 2025, fully autonomous vehicles could be mainstream, according to the Wall Street Journal. That gives states some time -- but not tons -- to start sorting out the legal questions that come with the technology. Here are six they should consider.
1. Will drivers need any sort of training?
Driver's education classes have been a staple of American teenage life, as have the nerve-wracking driver's tests that follow. Thankfully, drivers aren't required to get a new license and go through the rigmarole of driver's ed every time they buy a new vehicle. But would drivers need new training before getting behind the wheel of a self-driving vehicle? If training became a requirement, would it be the responsibility of the government, the manufacturer, the dealer, or some other entity?
The answer to this question probably won't be clear until the technology is further along and officials have an understanding of just how intuitive fully automated vehicles will or won't be. "If you asked five years ago, should there be special training required for people who use adaptive cruise control... you really could have gone either way on that," says Smith, of Stanford. Today, training of that sort would seem preposterous.
NHTSA, for its part, recommends states develop a special license or endorsement based on some sort of prerequisite like a test or certification from a manufacturer of autonomous vehicle systems. Smith says states will also have to decide whether people with disabilities that preclude them from driving traditional vehicles would be eligible for an autonomous vehicle instead.
2. Is it possible to speed?
Just about every driver has, at some point, inched at least a few notches past the speed limit either to save time on a trip or to overtake a slower vehicle. But speeding is illegal. One key question that both policymakers and manufacturers will need to answer is whether it should be technically possible for a self-driven car to go above the speed limit.
The big promise of autonomous vehicles is the assumption that they'd be much safer than typical cars. On one hand, if they have the capability of speeding, that would seem at odds with the goal of safety. On the other hand, if breaking the speed limit isn't possible in autonomous vehicles, they could be unpopular with some drivers.
Smith says that so far, automated vehicles in the testing phase allow the human tester to set the speed. Assuming that continues to be the case with consumer vehicles, violations of the speed limit, from a legal perspective, would clearly fall on the driver's shoulders. "You could make creative arguments that if there's no driver, the law doesn't apply," Smith says. "But that won't fly."
3. Is distracted driving allowed?
Self-driving cars have been touted as enormously safe -- even safer than a human driver -- but during the testing phase, they are still required to have an alert driver at the helm just in case of malfunctions.
If self-driving cars become available to consumers, lawmakers will need to decide just how alert drivers need to be. One big selling point would likely be the freedom to spend a commute on other activities instead of paying attention to the road. Or, hypothetically, someone who's had a few too many drinks might find an autonomous vehicle to be a safe way to get home. But is the technology reliable enough to handle drivers who are completely disengaged from the experience, or even impaired?
Without clear legal language saying otherwise, the person using the autonomous vehicle is still considered the driver and would have the same legal obligations as any other driver in the state: no texting (if it's prohibited) and certainly no drinking.
But Nevada's law has a twist and specifically says autonomous vehicle drivers can text (though drunk driving is still prohibited). That would seemingly suggest states have the ability to create narrowly tailored laws addressing specific types of distractions.
NHTSA recommends that states require properly licensed drivers to be in the driver's seat and ready to take control, even when the vehicle is in self-driving mode. But in order to make the vehicles appealing, states may need to relieve motorists of some of the duties they have when driving today.
4. Who's liable for accidents?
Perhaps the biggest question facing self-driving cars is who's ultimately responsible when things go wrong. If a self-driving car causes a collision, who's really to blame: the driver or the manufacturer? And can someone get a ticket while behind the wheel of a self-driving car? It wouldn't be hard to imagine a motorist arguing, "I wasn't driving; the car was."
Indeed, some skeptics have suggested the liability question is the biggest threat to the future of the technology, though Smith disagrees. "I think when the technology is ready and society considers it to be safe, then the law will find a way," he argues.
Human drivers will likely remain legally liable for parking and speeding violations, he says. But "the tragic stuff," like vehicular manslaughter, could be a different story. In that case, "to be guilty legally, a human has to have acted with some level of culpability," Smith says. "If you're sitting in an autonomous vehicle, and you're not behaving recklessly, and the vehicle swerves over and hits somebody, you are not guilty of manslaughter. You didn't even act."
But in the same situation, depending on state law, both the operator of the vehicle as well as the manufacturer -- and perhaps others involved in its design -- could be held liable if a defect caused or even contributed to the collision. And if computer programmers eventually play a bigger role in the way vehicles move than drivers do, it's likely manufacturers will build the cost of litigation and insurance into their vehicles.
5. What kind of registration would the vehicle have?
Generally, consumer vehicles have the same type of registration, but should a self-driving car be subject to a special category? And should autonomous vehicles have easily identifiable license plates so that other motorists as well as police are aware they're dealing with something different? States are still sorting out those questions. Nevada, for example, has a special tag for autonomous vehicles, but California has yet to make a decision.
Smith raises another question. Very soon, low-speed, low-mass vehicles -- think glorified golf carts -- will have the ability to operate on a closed loop to help shuttle people through places like airports and college campuses. Those vehicles might not need any sort of human driver at all, which could further complicate questions about registration.
6. How to transition from manual to auto-pilot?
Self-driving cars can be driven in a traditional fashion as well as in an automated mode, but lawmakers and manufacturers will need to figure out how drivers should switch between those two functions. Would a motorist need to choose one or the other at the moment he starts his car? Or could he switch freely back and forth while in motion?
If the latter is possible, how would police know whether the car or the driver was responsible during an accident? At one time, Nevada considered an external light that would indicate when a car was in autonomous mode, but that idea was killed. The question about transitioning modes is an early one, and the answer will largely depend on how the technology evolves.
This story originally appeared on GOVERNING.com.