
Pittsburgh Company Commits to Identifying, Fixing Bugs and Weaknesses in Autonomous Vehicle Software

It takes only one bug or bad line of code in autonomous vehicle software to potentially make the entire system go haywire.

(TNS) — Automobiles have millions of lines of computer code running everything from dashboard displays to throttle controls.

Make that car autonomous, and the computing complexity could multiply by 100.

But it still would take only one bug in the software or a bad line of code to potentially make the system go haywire, said Mike Wagner, co-founder of Edge Case Research, a Pittsburgh company that tests and simulates computer software to identify and fix bugs and other weaknesses.

Wagner, however, is concerned that not all the companies working to take our hands off the wheel are paying close attention to their software.

“No, they are not yet doing this, and yes, they need to be doing it,” Wagner said of the companies actively developing and testing autonomous vehicles.

Edge Case Research, a Lawrenceville company of about 10 people, works with some companies developing autonomous vehicles and uses automated robot assessment tools to test the robustness of the software powering self-driving cars. The company simulates failing sensors or cameras to test how the software reacts. It feeds unexpected or unnatural data into the system, such as a black pixel in an image where one shouldn't be or a speed of negative infinity, to see what the car does.

“You get right at things you're not going to find on the test track,” Wagner said. “The space of possible behaviors and the ways that the logic could execute, you want to have the computer pull apart your code.”
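The kind of fault-injection testing Wagner describes — feeding a system values no real sensor should ever produce and checking that the software rejects them — can be illustrated with a minimal sketch. This is a hypothetical example for illustration only, not Edge Case Research's actual tooling; the function name, bounds, and inputs are assumptions:

```python
import math

# Hypothetical sanity check guarding a perception pipeline input.
# Illustrative only -- not any company's actual validation code.
def validate_speed(speed_mps):
    """Reject physically impossible speed readings before they reach the planner."""
    if not isinstance(speed_mps, (int, float)):
        return False
    if math.isnan(speed_mps) or math.isinf(speed_mps):
        return False
    # Assumed plausible range for a road vehicle: 0 to 120 m/s.
    return 0.0 <= speed_mps <= 120.0

# Fault-injection probe: feed deliberately malformed readings, including the
# "speed of negative infinity" case mentioned above, and confirm each is caught.
malformed_inputs = [float("-inf"), float("nan"), -1.0, 1e9, None]
assert not any(validate_speed(v) for v in malformed_inputs)
```

The point of such automated probing is exactly what Wagner describes: a machine can enumerate far more pathological inputs and code paths than a test track ever exercises.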

Edge Case has about 25 clients and has been getting more work as interest in autonomous vehicles heats up. The company is working with the Army on autonomous technology for convoying trucks. Other clients are in defense, automotive, finance and the Internet of Things, Wagner said.

Bad software in Toyotas caused the cars to suddenly accelerate, said Phil Koopman, co-founder of Edge Case with Wagner and an expert witness in Toyota legal proceedings. Toyota recalled millions of vehicles, faced hundreds of wrongful-death and personal injury lawsuits and paid a $1.2 billion fine in 2014 in a settlement with the U.S. Department of Justice.

Bugs caused problems with military fighter jets crossing the International Date Line, computers handling leap years and sensors on a rocket engine, Wagner said. He attributed the hacking of a Jeep in 2015 to faulty software. Charlie Miller and Chris Valasek, the pair who hacked the Jeep, eventually were hired by Uber.

Google just started testing for bugs across all of its open source software, a sign that major companies are beginning to acknowledge the need for robustness testing, Wagner said.

“We definitely have a cultural disconnect. The folks in the robotics world don't necessarily think about these kinds of issues. They are more concerned, and perhaps rightfully so, in building the right kind of algorithm,” Wagner said. “Right on the heels of it, when you're ready to deploy it, safety engineering says you need to test the robustness of it. You have to test the fault tolerance of it.”

Major companies working on autonomous cars have said they do pay attention to the integrity of their software. General Motors acquired Cruise Automation, a San Francisco-based autonomous vehicle technology company, to help it develop the software inside the self-driving Chevy Bolt, Harry Lightsey, GM's executive director of public policy on emerging technologies, told the Tribune-Review. GM announced in December it would immediately begin testing the autonomous Bolts on Michigan roads and begin production of the cars in early 2017.

The car company has 40 test vehicles on roads every day, Lightsey said.

“We're running the software through simulations. We're running it on the road, trying to present it with as many scenarios as we possibly can to make sure that all the glitches are exposed and fixed,” Lightsey said. “And they are making corrections, and the system is learning itself. The system that you take out on Day 2 is not the system you took out on Day 1.”

Uber hired people who worked on autopilot software and people familiar with the risk and safety concerns of space travel, Raffi Krikorian, software director at Uber's Advanced Technology Center in Pittsburgh, told the Tribune-Review in November.

Ford, which aims to have a fleet of autonomous cars for ride-sharing on the road by 2021, also is paying close attention to software development and testing as it develops self-driving vehicles, a company spokesman said.

“We focus on the security of our customers before the introduction of any new technology feature by instituting policies, procedures and safeguards to help ensure their protection,” Alan Hall wrote to the Tribune-Review in an email.

Wagner and Koopman, however, weren't comforted by internal testing by major auto manufacturers. They said past bad software in cars points toward the need for an external review.

©2017 The Pittsburgh Tribune-Review (Greensburg, Pa.) Distributed by Tribune Content Agency, LLC.