(TNS) -- In the race to develop self-driving cars, Mountain View startup Drive.ai is focused on making the vehicles think more like humans do, drawing on a kind of artificial intelligence known as deep learning, which trains computers to adapt to different circumstances.
“When you make a self-driving car’s brain, there are two approaches. One is traditional rules-based robotics and the other is deep learning,” said CEO Sameep Tandon, who co-founded Drive.ai with seven others, mainly alumni of Stanford’s Artificial Intelligence Lab. Deep learning “can adapt to harder environments and handle more-nuanced scenarios in everyday driving — who has the right of way at a stop sign? Is it safe to make a right turn on red? It’s hard to write algorithms for all scenarios.”
Drive.ai is creating a hardware and software kit to retrofit commercial vehicles, such as delivery trucks or ride-hailed taxis — “any business with fleets of vehicles,” Tandon said — for self-driving. In the next few weeks, it will announce partners interested in using its technology. It is still hashing out a business model, such as whether it will sell kits directly, operate a service to install them or have customers pay per mile.
Pricing is likewise up in the air, although the company said its kit uses off-the-shelf components to keep costs low.
The 2-year-old startup has been testing its “versatile decision making” approach. Its four self-driving cars have been on Bay Area public roads since April, when it received a testing permit from the California Department of Motor Vehicles. “We’re focused on harder cases for self-driving cars in urban and suburban” environments, Tandon said.
One case he’s proud of: the cars’ ability to function in rain. The company recently released a video showing a car handling itself during a downpour. Rain is tricky for autonomous cars because it interferes with sensors — water splashes on cameras, and lidar, a sensor that maps its surroundings with reflected laser pulses, gets confused by reflections from the wet ground and raindrops.
Drive.ai is unveiling some of its technology on Thursday. For instance, it has developed software for quickly annotating all the data collected by autonomous-car sensors, such as taking a camera-generated image of a traffic light and tagging it red, green or yellow. The software’s training is also progressing rapidly, Tandon said.
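To make the tagging step concrete, here is a minimal sketch of what an annotation record and a crude auto-labeling heuristic might look like. The names, the record structure and the hue-threshold rule are all illustrative assumptions, not Drive.ai’s actual tooling; a real pipeline would run a trained classifier rather than a color threshold.

```python
# Hypothetical sketch of a sensor-data annotation step; structure and
# thresholds are assumptions for illustration, not Drive.ai's API.
from dataclasses import dataclass

@dataclass
class Annotation:
    frame_id: str   # which camera frame the tag belongs to
    sensor: str     # which sensor produced the frame, e.g. "front_camera"
    label: str      # "red", "yellow" or "green"

def tag_traffic_light(frame_id: str, dominant_hue: float) -> Annotation:
    """Toy heuristic: map the light's dominant hue (degrees, 0-360)
    to a state. A production system would use a trained model."""
    if dominant_hue < 20 or dominant_hue > 340:
        label = "red"
    elif 40 <= dominant_hue <= 70:
        label = "yellow"
    elif 90 <= dominant_hue <= 150:
        label = "green"
    else:
        # Ambiguous frames get routed to a human annotator instead.
        raise ValueError("ambiguous hue; send frame for human review")
    return Annotation(frame_id, "front_camera", label)
```

The point of tooling like this is throughput: every frame a fleet records needs a label before it can train a deep-learning model, so even partial automation of the tagging step compounds quickly.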
Another advance: The computing power needed in the car is being minimized.
“On the back end we have a room full of servers to do simulations of different scenarios,” Tandon said. But the actual vehicle can operate with roughly the same computational power as a cell phone. “We’re taking what we train on a supercomputer and making it work on (the equivalent of) a cell phone,” he said. “We have a desktop PC in our (self-driving) cars with one CPU and one GPU (graphics processing unit); it’s basically a consumer PC.”
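The article doesn’t say how Drive.ai shrinks its models, but one common technique for moving a network trained on server-class hardware onto a modest in-car computer is knowledge distillation: a small “student” model is trained to match the softened outputs of the large “teacher.” The sketch below is a generic illustration of that idea, not the company’s method.

```python
# Generic knowledge-distillation sketch (an assumption, not Drive.ai's
# disclosed technique): a small student learns the teacher's soft outputs.
import math

def softmax(logits, temperature=1.0):
    """Turn raw scores into probabilities; a higher temperature
    produces a softer, more informative distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# The big teacher network's raw outputs for one training example...
teacher_logits = [8.0, 2.0, 1.0]

# ...become "soft targets" the compact student is trained to reproduce.
soft_targets = softmax(teacher_logits, temperature=4.0)

def distillation_loss(student_logits, targets, temperature=4.0):
    """Cross-entropy between the student's softened output and the
    teacher's soft targets; training minimizes this."""
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(targets, student_probs))
```

A student whose outputs agree with the teacher incurs a lower loss than one that disagrees, which is what lets a network trained on “a room full of servers” be squeezed down to cellphone-class compute.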
Last year, Drive.ai said that it had $12 million in backing without disclosing its investors. Tandon said it will have more funding news when it announces partnerships.
The 50-person company faces some Goliath rivals, including the major automakers, Tesla and Waymo, which was spun off by Google. But it still may successfully carve out a niche, said Brad Templeton, a Silicon Valley entrepreneur who was an early strategy and engineering consultant on Google’s self-driving project. “None of (the big firms) are going after local delivery vans, certainly not retrofit,” he said in an email.
In ride-hailing, Drive.ai will face plenty of rivals. Uber is pouring resources into robot taxis, including a $300 million partnership with Volvo and its acquisition of self-driving truck startup Otto. Lyft received a $500 million investment last year from General Motors, and Reuters recently reported that the partnership will have thousands of test self-driving Chevrolet Bolts on the road next year.
“It’s whoever is first to safety that has a good shot at that market,” Templeton said. “My prediction is that companies will use all available tools — lidar, neural networks, radar, etc. Cost will not be the issue.”
For Tandon, the excitement is still palpable, even after almost a decade in robotics.
“Autonomous driving will be the first robot that people will interact with on a day-to-day basis,” he said. “We’re making a robot that people in the real world — parents, friends, family — could use. We’re motivated to put out a technology that’s safer and makes transportation more convenient.”
©2017 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.