
Recent Incidents Fuel Debate About Autopilot Misuse

Advances in autopilot technology in some cars have largely been heralded as a safety improvement, but a series of recent incidents is forcing some to question whether the technology is just making some drivers careless.

(TNS) — There’s been a lively social media debate over technology-assisted driving since the California Highway Patrol busted a Tesla driver officers said was passed out drunk while the electric car cruised 70 mph on Autopilot down Highway 101 last month in Redwood City.

Some say technologies like Tesla’s Autopilot — which detects other cars and objects to help drivers avoid crashes but still requires their attention — encourage dangerous behavior like driving drunk or texting instead of watching the road.

But others see something miraculous: a motorist apparently passes out behind the wheel while speeding down a major highway and — with help from the CHP — the car comes safely to a stop without anybody getting hurt.

“The guy would be dead, and others maybe too if it wasn’t for AP,” Elisabeth Soechting posted on Twitter this month, using the shorthand for Tesla’s Autopilot technology.

The Nov. 30 incident wasn’t a first: The California Highway Patrol made a similar arrest Jan. 19 on the Bay Bridge. In both cases, the CHP said Tesla’s Autopilot feature in the premium electric car that retails for upwards of $74,500 appeared to have been engaged, and nobody was hurt.

The Palo Alto carmaker hasn’t officially commented on the cases, though CEO Elon Musk said Dec. 2 on Twitter that he is “looking into what happened here.” The company hasn’t disputed that Autopilot was engaged.

Autopilot isn’t the same technology used by fully automated self-driving vehicles, which are still in development. The Society of Automotive Engineers rates Autopilot at Level 2 on its zero-to-five scale, in which Level 5 is a fully autonomous self-driving car that requires no human driver.

Other premium automakers like Mercedes-Benz and BMW also offer Level 2 features, but Tesla’s Model S is perhaps the best known. Tesla’s Autopilot includes a cruise control that maintains the car’s speed in relation to surrounding traffic and features that help the driver steer within a clearly marked lane and safely change lanes.

While the latest CHP arrests may bolster the case for Autopilot’s effectiveness, the technology isn’t foolproof, and it has suffered bruising press coverage after a couple of fatal crashes.

But Tesla has defended the technology’s safety record. The company, which introduced Autopilot in 2015, said in a statement Oct. 4 that the technology has demonstrated reduced accident rates: Tesla drivers using Autopilot registered one accident every 3.34 million miles, while federal data show an accident every 492,000 miles among drivers overall, roughly one-seventh the rate per mile.

On its website, Tesla stresses that “Autopilot is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”

Drivers must first agree to “keep your hands on the steering wheel at all times” before engaging Autopilot, and if the system senses “insufficient torque” from the motorist’s hands, it sends “an escalating series of audible and visual alerts.”

“This is designed to prevent driver misuse,” the company says, “and is among the strongest driver-misuse safeguards of any kind on the road today.”
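Tesla hasn’t published the logic behind those safeguards, but the escalating behavior the company describes can be sketched in a few lines of code. The Python snippet below is a hypothetical illustration only; the torque threshold, timings, and alert tiers are invented for the example and are not Tesla’s actual implementation:

```python
# Hypothetical sketch of an escalating hands-off-wheel alert sequence.
# The tiers, timings, and torque threshold below are illustrative
# assumptions -- Tesla has not published its actual logic.
ESCALATION = [
    (5.0, "visual: flash 'Hold Steering Wheel' on the display"),
    (10.0, "audible: soft chime"),
    (15.0, "audible: loud, repeated chime"),
    (20.0, "disengage: slow gradually to a stop, turn on hazard lights"),
]

TORQUE_THRESHOLD = 0.5  # arbitrary units; below this counts as hands-off

def monitor(torque_samples, dt=1.0):
    """Scan steering-torque samples (one per dt seconds) and print each
    escalation step the moment its hands-off timer expires."""
    hands_off = 0.0  # seconds with insufficient torque
    tier = 0         # next escalation step to fire
    for torque in torque_samples:
        if torque >= TORQUE_THRESHOLD:
            hands_off, tier = 0.0, 0  # driver is holding the wheel; reset
            continue
        hands_off += dt
        while tier < len(ESCALATION) and hands_off >= ESCALATION[tier][0]:
            print(f"t+{hands_off:.0f}s -> {ESCALATION[tier][1]}")
            tier += 1

# Simulate 25 seconds of a driver who never touches the wheel.
monitor([0.0] * 25)
```

Notably, a monitor of this kind can only measure torque, not attention, which is why the workarounds described below defeat it.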

But clever drivers have posted workarounds on the internet, like stuffing a large orange into the steering wheel.

The California Highway Patrol, needless to say, wasn’t amused.

“The driver is ultimately responsible for the safe operation of the vehicle,” said CHP Officer Art Montiel. “And they should not try these ‘hacks’ or any other ways to override the ‘driver assist’ feature, since this is not how this feature was designed to be used.”

In the most recent CHP arrest, officers said they spotted a gray Tesla Model S traveling south on 101 near Whipple Avenue at 3:37 a.m. and noticed the driver “appeared to be asleep at the wheel.” The car was going about 70 mph.

After finding the driver “unresponsive” to lights and siren, the officers positioned their patrol car in front of the Tesla and began slowing “in hopes that the ‘driver assist’ feature had been activated and therefore the Tesla would slow to a stop as the patrol vehicle came to a stop,” which it did.
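That maneuver worked because a Level 2 system’s adaptive cruise control matches speed to the vehicle ahead. As a rough illustration, the minimal simulation below, with invented gains, gaps, and braking rates rather than Tesla’s control code, shows a time-gap follower coasting to a stop behind a gently braking lead car:

```python
# Toy illustration of "follow the car ahead to a stop" behavior in a
# time-gap cruise controller. All parameters are invented for the
# sketch; this is not Tesla's actual control code.
DT = 0.1          # simulation step (s)
TIME_GAP = 2.0    # desired time gap behind the lead car (s)
MIN_GAP = 5.0     # standstill distance to keep (m)
MAX_ACCEL = 3.0   # acceleration/braking limit (m/s^2)

def simulate(v_lead, v_follow, gap, v_set):
    t = 0.0
    while v_lead > 0.0 or v_follow > 0.05:
        # The patrol car ahead brakes gently to a stop (1 m/s^2).
        v_lead = max(0.0, v_lead - 1.0 * DT)
        # Follower's safe speed: close the remaining gap over TIME_GAP
        # seconds, but never exceed the driver's set speed.
        v_target = min(v_set, max(0.0, gap - MIN_GAP) / TIME_GAP)
        accel = max(-MAX_ACCEL, min(MAX_ACCEL, (v_target - v_follow) / DT))
        v_follow = max(0.0, v_follow + accel * DT)
        gap += (v_lead - v_follow) * DT
        t += DT
    print(f"Both cars stopped after {t:.0f} s, {gap:.1f} m apart")

# 70 mph is roughly 31 m/s; follower starts 60 m behind the patrol car.
simulate(v_lead=31.0, v_follow=31.0, gap=60.0, v_set=31.0)
```

The key property the officers exploited is that the follower’s target speed is tied to the gap: as the patrol car slows, the safe speed shrinks toward zero, so the Tesla stops on its own.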

Officers woke the driver, and after a field sobriety test, arrested Alexander J. Samek, 45, of Los Altos, on drunken driving charges. He has not commented publicly. A court date is set for Jan. 4.

In his Twitter post after the incident, Musk said that while he is looking into the matter, “default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop and turn on hazard lights.” Tesla, he said, “then contacts the owner.”

In the social media debate that followed, most seemed to share Soechting’s sense that the technology was a lifesaver, though several offered suggestions for updates that might frustrate drunks trying to use the system to get home. Pete Clay posted a suggestion that the car play music or vibrate seats to jolt the driver awake.

Others, however, suggested the technology encourages people to think they could just check out behind the wheel.

“Don’t enable stupid people,” Shay Smith posted.

But fans of the technology like Johnna Crider say it could save lives not only in cases where drivers ignore laws against drunken driving or texting, but also in medical emergencies.

“Hopefully not only will this encourage people to NOT drink and drive,” Crider posted on Twitter, “but maybe could help people driving who have issues such as epilepsy or medical conditions that would render them unconscious.”

©2018 the San Jose Mercury News (San Jose, Calif.). Distributed by Tribune Content Agency, LLC.