It took the California Highway Patrol around seven minutes to stop the vehicle, which had been on autopilot while the allegedly intoxicated driver slept, officers reported. The situation highlights the potential for abuse of autonomous technology.
(TNS) — When a pair of California Highway Patrol officers pulled alongside a car cruising down Highway 101 in Redwood City before dawn Friday, they reported a shocking sight: a man fast asleep behind the wheel.
The car was a Tesla, the man was a Los Altos planning commissioner, and the ensuing freeway stop turned into a complex, seven-minute operation in which the officers had to outsmart the vehicle’s autopilot system because the driver was unresponsive, according to the CHP.
The arrest of 45-year-old Alexander Samek on suspicion of drunken driving reignited questions about the uses, and potential abuses, of self-driving technology.
Reached by phone Friday afternoon, Samek, a real estate developer who runs the Kor Group, said, “I can’t talk right now,” before hanging up.
Officers observed Samek’s gray Tesla Model S around 3:30 a.m. as it traveled south at 70 mph on Highway 101 near Whipple Avenue, said Art Montiel, a CHP spokesman. When officers pulled up next to the car, they allegedly saw Samek asleep, yet the car was tracking straight, leading them to believe it was in autopilot mode.
To slow the car, the officers ran a traffic break: an officer behind Samek turned on his emergency lights and drove across all lanes of the highway in an S-shaped path, slowing the traffic behind the Tesla, Montiel said.
He said another officer drove a patrol car directly in front of Samek before gradually slowing down, prompting the Tesla to slow down as well and eventually come to a stop in the middle of the highway, north of the Embarcadero exit in Palo Alto — about 7 miles from where the stop was initiated.
Authorities said the entire operation took about seven minutes.
Officers then walked up to the Tesla and “attempted to wake up Samek by knocking on the window and giving verbal commands,” Montiel said. “After Samek woke up and got out of the Tesla, he was placed in the back of the patrol car and taken off the freeway.”
At a nearby gas station, Montiel said, Samek was given a field sobriety test before being arrested.
Tesla, whose autopilot technology assists in steering, changing lanes and parking, has seen its vehicles involved in several notable accidents in the last few years.
In 2016, a man was killed in Florida after the Model S he was driving collided with a tractor trailer while the Tesla was in autopilot mode. In January, a man was arrested on suspicion of driving under the influence on the Bay Bridge. He told officers he had been using his Tesla’s autopilot mode. Also this year, a man driving a Model X on Highway 101 in Mountain View was killed after his vehicle, which was in autopilot mode, struck a concrete divider.
The company declined to comment on Friday’s incident.
On its website, Tesla notes that its autopilot technology is not synonymous with self-driving cars or autonomous vehicles, and that drivers must keep their hands on the wheel at all times.
William Riggs, a professor of transportation, technology and engineering at the University of San Francisco, said most cars have some variation of autonomous features, and as the technology advances, people will feel more comfortable being distracted when driving.
“The software and technology is so good that humans are going to violate it,” Riggs said. “The future of autonomy really invites a lot of opportunities to disengage from the driving activity as we see more and more AI in control of vehicles.”
John Simpson, privacy and technology project director with Consumer Watchdog, saw Friday’s incident as further evidence of Tesla drivers being inappropriately led to believe the autopilot function is akin to fully autonomous driving.
In May, Simpson’s organization and the Center for Auto Safety asked the Federal Trade Commission to investigate “deceptive and unfair” practices in the advertising and marketing of the autopilot feature.
“They’ve really unconscionably led people to believe, I think, that the car is far more capable of self-driving than actually is the case. That’s a huge problem,” Simpson said. “In this case, it sounds as if it played out with some incredible work by the state police — the Highway Patrol — to get the car to stop. But it’s really amazing they were able to do that.”
Jim McPherson, a Bay Area attorney and industry analyst, said the autopilot feature is supposed to cut out if the driver doesn’t keep a hand on the wheel and exert “a minimal amount of torque.”
“But the length of the warning is variable. It could be a few seconds, it could be a minute or two,” McPherson said. “It’s really quite a miracle that this guy continued for 7 miles without autopilot cutting out.”
McPherson said he would expect the autopilot system to have alerted the driver to keep his hands on the wheel.
“Also, if he’s asleep, any more than a slight nudge of the wheel or touch of the pedal is going to cancel autopilot,” McPherson said. “So for it to get as far as it did, it’s a surprise to me.”
Riggs said the best time to use autopilot technology is during long highway trips, but those situations also pose hazards because drivers are usually less engaged than they would be in more urban settings.
“You’re more likely to have a distracted driver,” he said. “That can create a false sense of security.”
As the technology advances, many experts argue that having more autonomous vehicles on the road will lead to safer driving and fewer collisions. However, the technological progression must be accompanied by policies that clearly dictate when drivers should take the wheel, Riggs said.
“We haven’t really grappled yet with policing what happens in the vehicle,” he said. “If we throw people in vehicles and assume they don’t have to be operators, that should be transparent to them.”
©2018 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.