Building Smarter Sensors for Robot Cars

With autonomous tech becoming more mainstream, businesses are racing to build the best solution.

(TNS) — To drive a car, you need to see the world around you. But computers are blind, so autonomous cars must rely on other ways to perceive their surroundings. Lidar sensors, a laser form of radar, have emerged as a powerful way for robot cars to navigate.

Lidar is so crucial to the self-driving industry that dozens of companies have sprung up in the past year to develop the sensors. It stands at the heart of the Waymo vs. Uber lawsuit, with Waymo, the self-driving unit of Google parent Alphabet, alleging that ride-hailing company Uber stole its lidar designs, a claim that could cost Uber billions.

“Lidar helps cars see very fine-grained information about what the world looks like,” said Raj Rajkumar, a Carnegie Mellon University professor and a leading autonomous-vehicles researcher.

“Lidar can do the job today. Computer vision can’t,” said Brad Templeton, a Silicon Valley entrepreneur who was an early strategy and engineering consultant on Google’s self-driving project. “Someday, computer vision will be good enough. Someday, lidar will be much cheaper. That someday for lidar is certain and soon. The someday for cameras is unknown.”

For now, the number of self-driving cars in the world is in the hundreds. Lidar sensors were a $230 million market last year, but as autonomous cars go mainstream, automotive lidar sales worldwide should hit $2.5 billion by 2026, according to IHS Markit.

A year ago, automotive lidar meant a $75,000 spinning object the size of a KFC bucket, but the sensors are rapidly getting smaller, cheaper and more reliable — and learning to “see” farther away.

Until recently, only a handful of companies made lidar sensors, with one — Velodyne Lidar in Morgan Hill — the clear market leader. That led to a backlog, as the scores of companies making self-driving cars waited months to get their hands on the crucial components.

The sensors still cost thousands of dollars apiece. But, as Templeton notes, that doesn’t matter much for now: robot-car makers are focused on ensuring their first models are as safe as possible, and they aren’t yet producing vehicles at scale.

This month alone, three Bay Area companies have emerged with lidar sensors of their own. Meanwhile, some major players developing robot cars are buying up lidar makers to bring development in-house. General Motors’ Cruise Automation bought Strobe, while Ford’s Argo.AI bought Princeton Lightwave, for instance. Ford, Baidu, Daimler and Samsung have invested millions into lidar makers Velodyne and Quanergy. Other startups, like San Francisco’s Civil Maps, make use of the data lidar sensors generate, and stand to benefit as costs come down.

And Velodyne says it has dramatically revved up the performance of its sensor, announcing a lidar model with 128 laser beams, double the previous amount, that it says will be in test cars next year. It produces its sensors in a San Jose megafactory that it opened earlier this year and said would be able to crank out a million sensors in 2018. It has not announced how much the new model will cost.

Lidar works like radar, but instead of bouncing radio waves off objects, lidar pulses out laser beams and measures their reflections to determine distances to objects.
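
As a rough illustration of the principle, and not any vendor’s actual firmware, the core arithmetic is a time-of-flight calculation: a pulse’s round-trip time, multiplied by the speed of light and halved, gives the distance to whatever reflected it. The sketch below assumes a single, idealized pulse.

```python
# Minimal time-of-flight sketch for a single, idealized lidar pulse.
# Distance is half the round-trip travel time multiplied by the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def distance_from_return(round_trip_seconds: float) -> float:
    """Distance in metres to the object that reflected the pulse."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A reflection arriving about one microsecond after the pulse left
# implies an object roughly 150 metres away.
print(f"{distance_from_return(1e-6):.1f} m")  # ≈ 149.9 m
```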

“Our human eyes can make sense of a picture because we know what the world looks like, so our brains can map a 2-D picture to a 3-D world,” Rajkumar said. “But cars are limited.”

Lidar on robot cars works in conjunction with other sensors, namely cameras and radar. One major robot-car player eschews lidar. Tesla CEO Elon Musk says his cars can drive autonomously with just cameras and radar. “I am very skeptical of that statement,” Rajkumar said.

Here’s a rundown on three new companies:

• Ouster. “With lidar, there are a lot of promises made and few products introduced to market,” said Angus Pacala, CEO and co-founder of this San Francisco startup.

With $27 million in backing (investors include Cox Enterprises and Ford Chairman Bill Ford) and two sprawling industrial buildings in the Mission District, Ouster plans to change that. It is shipping initial samples now, Pacala said.

After a three-month rehab of one of its buildings, it will have a clean-room factory able to produce 1,000 lidar sensors a month. Its workforce, now about 40 people, will be 100 by mid-year, he said.

Pacala co-founded Sunnyvale’s Quanergy Systems, which also makes lidar sensors. But Ouster’s approach is totally different, he said.

With his co-founder Mark Frichtl, who was his mechanical engineering classmate at Stanford, Pacala developed custom-built lidar semiconductors, reducing the need for hundreds of electronic components to work in concert. “We redesigned all that onto a single silicon chip,” he said. “It’s low cost and way more compact.”

Ouster’s sensors are about the size and weight (9 ounces) of a full coffee mug. Initially, pricing will be $12,000 per sensor.

• AEye. Pleasanton’s AEye says its lidar sensors fully integrate video-camera images and overlay real-world color on their 3-D data.

“We want to create hardware and software that can perceive the environment as good as a human does,” said CEO and co-founder Luis Dussan, a former aerospace engineer.

AEye said its sensors have the intelligence to select the most-important objects identified by the camera — say, nearby bicyclists and cars, but not trees and light poles — and target them in lidar to create faster, smarter sensing.
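
A hedged sketch of what that camera-guided targeting could look like in code: camera detections are filtered to a set of high-priority classes, and only those regions are handed to the lidar for a denser scan. The Detection type and class labels here are illustrative assumptions, not AEye’s actual interface.

```python
# Illustrative sketch of camera-guided lidar targeting: keep only detections
# in high-priority classes and return their regions for a denser lidar scan.
# The Detection type and class labels are assumptions, not AEye's real API.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str                                  # e.g. "car", "bicycle", "tree"
    box: tuple[float, float, float, float]      # (x_min, y_min, x_max, y_max) in image pixels

HIGH_PRIORITY = {"car", "truck", "bicycle", "pedestrian"}

def lidar_targets(detections: list[Detection]) -> list[tuple[float, float, float, float]]:
    """Regions the lidar should revisit with a denser scan."""
    return [d.box for d in detections if d.label in HIGH_PRIORITY]

frame = [
    Detection("bicycle", (120, 200, 180, 320)),
    Detection("tree", (400, 50, 520, 400)),
    Detection("car", (600, 220, 900, 450)),
]
print(lidar_targets(frame))  # boxes for the bicycle and the car, not the tree
```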

“We can see not only the truck in front of a (self-driving car), but the license plate and the writing on the side,” he said.

It plans to use contract manufacturers, including some in the Bay Area, to produce its sensors. The sensors are currently the size of a tissue box; soon they’ll be as small as two decks of cards.

AEye has pilot versions available and will announce customers in early January. Dussan said they include a major automaker and a construction-vehicle maker.

The 45-person company has $19 million in funding. Investors include Kleiner Perkins, Intel Capital and Airbus Ventures.

• Innovusion. Los Altos’ Innovusion also says it fuses video-camera images with lidar. On a recent test drive, it showed how that gave its images much higher resolution than lidar alone, albeit in a narrower field of view.

It will have production samples in the first half of next year, said CEO Junwei Bao.

“For most startups, it’s hard to find customers,” Bao said. “With lidar, customers beg you for it.”

Bao previously spent 15 years at Chinese giant Baidu, where he worked on optical sensors and shepherded its investment in Velodyne, he said.

For lidar, “You have to be able to see faraway and dark objects,” he said. Innovusion said its sensors can detect dark objects up to 500 feet away, “which allows cars to react and make decisions at freeway speeds and during complex driving situations.”
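
For scale, a back-of-the-envelope check (the 65 mph figure below is an assumed example, not from Innovusion) shows roughly how much reaction time 500 feet of range buys at freeway speed:

```python
# Rough check of how much reaction time ~500 feet of detection range gives
# at an assumed freeway speed of 65 mph.
FEET_PER_METRE = 3.28084
MPH_TO_M_PER_S = 0.44704

detection_range_m = 500 / FEET_PER_METRE      # ≈ 152 m
speed_m_per_s = 65 * MPH_TO_M_PER_S           # ≈ 29 m/s

print(f"{detection_range_m / speed_m_per_s:.1f} s to react")  # ≈ 5.2 s
```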

Innovusion raised a seed round of $2.75 million last year and said it is raising additional millions. Its investors include China’s Banyan Capital, whose founding partner, Bin Yue, sits on Innovusion’s board. Innovusion declined to state how many employees it has; three people currently identify themselves as working at the company on LinkedIn.

©2017 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.