
Can a Self-Driving Car Glitch Threaten Passenger Safety?

As the market moves quickly to add assisted-driving technology to cars, questions about how the computers will function, and how they will handle emergencies, remain unanswered.

(TNS) -- No one likes a backseat driver. Nagging, nannying. Questioning every decision, constantly attempting to correct what he or she considers to be your errors of judgment.

How about an it doing the same thing? One you can’t kick to the curb?

The it in question being the backseat computer. Under the dash somewhere, actually. The one that will, in the not-far future, take over the driving rather than just second-guess yours.

It’s the self-driving or autonomous car. And it’s no longer science fiction. In fact, it’s already here. Bits and pieces of it, anyhow. Many new cars can park themselves, for instance. Others have collision avoidance systems that can completely stop the car without the driver even touching the brakes.

Next year, GM’s Cadillac division will debut vehicle-to-vehicle, or V2V, communications in some models.

The system lets equipped cars hold electronic conversations among themselves, each aware of the others' relative position and velocity, so they can anticipate and ideally avoid collisions that might otherwise occur when, for instance, car A runs a red light because its driver wasn't paying attention and strikes car B.

With V2V, the driver of car A would be safety-netted by his car. It would brake for the light and so avoid striking car B.
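How might that safety net decide when to brake? Here is a minimal sketch of the kind of arithmetic involved, assuming each car simply broadcasts its position and velocity. The message fields, thresholds and the should_brake rule are invented for illustration; real V2V messages (such as the SAE J2735 basic safety message used with dedicated short-range communications) carry far more data, and production systems are far more careful about false alarms.

```python
import math
from dataclasses import dataclass

# Hypothetical broadcast message: just the two quantities the column
# mentions, position and velocity. Real V2V messages carry much more.
@dataclass
class SafetyMessage:
    car_id: str
    x: float   # position east of a shared reference point, meters
    y: float   # position north, meters
    vx: float  # velocity east, m/s
    vy: float  # velocity north, m/s

def closest_approach(a: SafetyMessage, b: SafetyMessage) -> tuple[float, float]:
    """Time and distance of the two cars' closest approach, assuming
    each holds its current velocity."""
    rx, ry = b.x - a.x, b.y - a.y      # relative position
    vx, vy = b.vx - a.vx, b.vy - a.vy  # relative velocity
    speed2 = vx * vx + vy * vy
    if speed2 == 0.0:                  # same velocity: the gap never changes
        return float("inf"), math.hypot(rx, ry)
    t = max(0.0, -(rx * vx + ry * vy) / speed2)
    return t, math.hypot(rx + vx * t, ry + vy * t)

def should_brake(me: SafetyMessage, other: SafetyMessage,
                 horizon_s: float = 3.0, danger_radius_m: float = 2.0) -> bool:
    """Trip the safety net if the cars will pass dangerously close
    within the next few seconds. Both thresholds are illustrative."""
    t, dist = closest_approach(me, other)
    return t <= horizon_s and dist <= danger_radius_m

# Car A is about to run the red light at 20 m/s; car B is crossing.
car_a = SafetyMessage("A", x=0.0, y=-40.0, vx=0.0, vy=20.0)
car_b = SafetyMessage("B", x=-30.0, y=0.0, vx=15.0, vy=0.0)
print(should_brake(car_a, car_b))  # True: on these paths they meet in 2 seconds
```

On those numbers the two cars would occupy the same spot two seconds from now, so car A brakes for the light even though its driver never saw car B.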

These are some of the elements of the fully autonomous, self-driving car. And some of it sounds good — and may well be. But taking the driver out of the equation entirely — or relying too much on technology — can have its downsides, too.

As anyone who owns a PC knows, computers develop glitches sometimes. It’s annoying when it happens at your desk. But it could be lethal when it happens at 75 mph on the freeway.

And it’s probably more likely to happen with an autonomous car, because the computer that controls it — unlike the computer on your desk — will be subjected to extremes of heat and cold, vibration and moisture, et cetera.

Over time, something’s likely to go on the fritz. If the human driver has become a mere passenger — no longer expected or perhaps even able to actually drive the car — what will happen?

And who will be responsible? Legally speaking, the driver is currently responsible for the safe operation of his vehicle.

But how can we hold him responsible when he’s no longer the driver?

Will the manufacturer of the self-driving car be liable in that case?

How will car insurance requirements and costs change?

If the driver is no longer a driver, why should he be required to buy insurance at all? Or have a license, for that matter? When you ride the bus, you aren't required to have a special permit or carry coverage. Why wouldn't the same principle apply here?

An even bigger problem with autonomous cars is how to program them to disregard traffic laws when breaking a rule is the only way to avoid an accident.

For example, it's illegal to cross the double yellow line. But what if a child runs into the car's path and the only way to avoid hitting her is to swerve across it?

It's technically illegal to cross the double yellow, but it's the right thing to do. And a human driver would do it. An autonomous car wouldn't, because it is programmed to obey the law. Unlike humans, it cannot use judgment; it does not do nuance.
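To be fair, the programming problem the column raises can at least be stated in code. One idea from the research literature on driving policies is a rule hierarchy, in which not harming a person strictly outranks obeying a lane law. The sketch below is a toy under that assumption; the maneuvers, fields and costs are invented for illustration, not any manufacturer's actual logic, and whether a hand-written hierarchy amounts to real judgment is exactly the question at issue.

```python
from dataclasses import dataclass

# Hypothetical candidate maneuvers; a real planner scores thousands of
# candidate trajectories rather than two hand-picked options.
@dataclass
class Maneuver:
    name: str
    hits_pedestrian: bool
    crosses_double_yellow: bool

def rule_cost(m: Maneuver) -> tuple[int, int]:
    """Lexicographic cost: the first element (harming a person) strictly
    outranks the second (breaking a lane law), so legality yields to safety."""
    return (1 if m.hits_pedestrian else 0,
            1 if m.crosses_double_yellow else 0)

options = [
    Maneuver("brake in lane", hits_pedestrian=True, crosses_double_yellow=False),
    Maneuver("swerve across double yellow", hits_pedestrian=False, crosses_double_yellow=True),
]

# Tuples compare element by element, so min() picks the lesser violation.
print(min(options, key=rule_cost).name)  # swerve across double yellow
```

Encoding the hierarchy is the easy part. Deciding what belongs in it, and defending those choices after a crash, is not.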

Also, how will autonomous cars deal with cars that aren't autonomous, and vice versa? Will people who own human-controlled cars be required to turn them in, or no longer be allowed to drive them?

Technology is usually a good thing, but problems arise when technology is no longer under human control, as could happen here.

Technology that assists human drivers — that’s a great idea. But technology that pre-empts them and dumbs them down — that could be a very bad idea, indeed.

©2016 Eric Peters. Distributed by Tribune Content Agency, LLC.