Researchers argue that if society is going to switch to automated driving, people must first trust the technology.
But what if they’re annoying?
It’s a real concern, and part of the conversation shaping the development of autonomous vehicles (AVs) today. After all, if car companies are going to ask people to hand over driving to the machines, they must first get those people to like the machines.
“When we get into the point of large-scale deployment of autonomous vehicles, we want to inspire trust and also promote collaboration between the occupants and the people who are around the vehicle,” said Ali Mortazavi, a senior researcher with Nissan’s self-driving car project, at the Redefining Mobility Summit held in Concord, Calif., on April 21.
In other words, people need to accept the vehicles.
What will that take? While much of the answer is guesswork right now, some of the common themes are safety, efficiency and predictable behavior.
Mortazavi said he sees some of the biggest challenges coming in urban settings where there are constant changes, lots of drivers and lots of rules.
“For the city driving, there is a collaboration between the vehicle, or the driver … and the people around, like pedestrians,” he said.
That collaboration involves some places where AVs might behave unexpectedly, drawing the ire of humans.
Example: An AV pulls up at a stop sign where a pedestrian is waiting to cross. The pedestrian hesitates just long enough for the AV to start moving, then steps into the intersection. The car stops. The pedestrian stops.
That’s the reason Google filed a patent in December containing ideas for ways cars could communicate with pedestrians, using everything from traffic signs to robotic hands, and it’s the reason Nissan showed off two systems for vehicle-to-human communication in October. The first involved a text display that could, for example, tell a pedestrian that she should cross the road. The second involved a light strip on the side of the car assuring a bicyclist that the vehicle “sees” where he is.
Michael Clamann, a senior research scientist at Duke University’s Humans and Autonomy Lab, sees a potential problem with that. In the context Nissan showed in its demo, he said, it would likely be fine — but what if the AV were on a four-lane road, and vehicles coming up on the left didn’t see the pedestrian? What if the AV were approaching the intersection quickly?
“Using text as a warning in an emergency situation is not a good idea,” Clamann said. “Because first the person has to be looking in that direction in order to be able to interpret it. Then the person needs a certain period of time in order to interpret it.”
AVs will also need to drive efficiently, Mortazavi said. He cited the example of a prototype AV from Google that a police officer pulled over in November for driving 11 miles per hour below the speed limit.
But that’s just one area where self-driving cars might be annoyingly inefficient. Mortazavi talked about “piggybacking,” which is when one driver takes advantage of another driver’s actions — without causing delay to anybody else — even if it might appear to be against the rules of the road. An example would be at a four-way stop, where a car might be able to go straight across the intersection out of turn because the vehicle across from it is in the intersection going straight as well.
Building a computer system that can recognize those types of opportunities might, at times, take a little more than straightforward rule-writing. Which is why Mortazavi said Nissan is working on an artificial intelligence system that can learn not only what actions are appropriate, but when and where those actions are considered appropriate.
“We’re trying to kind of extract the standard social norms and what they are and how we can translate them into something that we can introduce into our artificial intelligence — [so] when the car drives, it doesn’t look odd,” he said. “It wouldn’t be something that is actually a public nuisance.”
Clamann pointed out that public nuisance is only part of the equation.
“The first thing we need to work on … is [whether] these cars are ready,” he said. “There are a lot of technical hurdles that we need to get past before these things are even ready to be deployed.”
Right now, a big hurdle is weather. Though companies are testing in increasingly diverse environments, most testing to date has been done in sunny, dry places with well-maintained roads.
“You start moving these up to, say, downtown Boston — you’re going to get a very different set of results,” Clamann said. “Roads with lots of potholes, snowy winters, a little bit more adverse weather, and you’re going to start to see a lot more problems.”
Another issue on the horizon is how people will react when the vehicles do get into accidents. If a driver kills a person now, the criminal justice system might get involved. What happens if a machine kills a person?
“Is that better? I’m not sure,” he said.
And Clamann does think people will still die in car crashes. He cited other introductions of technology, such as machines taking over manual tasks in factories. Early on, workers who weren't used to the machines would often get mangled in the moving parts.
“When you release the autonomous cars, we’re going to go through this same evolution again, where at first lives are lost because there are errors in the system, and then over time it gets better and better,” he said.
Ultimately, it comes down to safety.
“We say … we’ll trust them as soon as we can trust them to take our kids to school,” Clamann said.