CONCORD, Calif. — Machines today can do incredible things. Computer algorithms are advanced enough to best world champions in chess or a multi-week champion on Jeopardy! Challenge that same chess-playing computer to a game of checkers, however, and complications arise.
While a computer may excel at one task, applying that same computing power to an equally or even less complex problem remains one of the largest hurdles for artificial intelligence (AI) developers. During a discussion on how AI will affect mobility at GoMentum Station's Redefining Mobility Summit on March 30, a series of speakers took this question head-on.
During a typical drive, hundreds of variables are constantly in motion: lights are changing, pedestrians are crossing the street out of turn, emergency vehicles are signaling other drivers to get out of the way. The immense amount of computing power a vehicle needs in order to take in and appropriately react to all of these constantly changing factors illustrates why advanced computing is necessary for a vehicle to safely transport its passengers.
Artificial intelligence can be used not only to help vehicles understand their outside environments, but also to help drivers inside the vehicle stay alert and be aware of their surroundings. Because fully autonomous vehicles are at least three years away from public deployment, Nvidia is not waiting to use its AI to help drivers.
Earlier this year, the company showed off its Co-Pilot function, which helps drivers stay safe by providing helpful alerts. Facial detection software lets the computer know when drivers are not paying attention to the road and there is an event that calls for interaction, explained Tim Wong, a member of the company's autonomous vehicle team. The software also can recognize who is in the vehicle and remember which custom settings the driver prefers.
“The vehicle can recognize that it is me, adjust the seat, air temperature and what is playing on the radio depending on what time I am driving,” said Wong, adding that the Co-Pilot system also has improved the voice recognition for vehicles. “Speaking to a car is an awful experience,” he said, noting that lip reading software can understand what humans are saying with a 91 percent accuracy.
While these are relatively simple accomplishments, Nvidia says they are only a demonstration of what the technology is capable of.
Although some still believe that AI is more than 50 or 100 years away, two experts say that this is not the case. Dr. Ching-Yao Chan, program leader with California PATH at UC Berkeley, and Dr. Teresa Escrig, business and integration architecture executive at Accenture, said that the AI market is already beginning to materialize. Chan pointed to the $1 billion investments in AI made by Ford, through its acquisition of Argo AI, and by Toyota, through its development of a research center.
The four most common responses when discussing AI, according to Escrig, are that:
Chan explained that while full deployment remains elusive, we are in a period of rapid evolution. Throughout history, explained Chan, “there are long static periods, followed by short spurts of rapid development.”
While road-embedded sensors can help shift some of the computing burden off the vehicles themselves, the trend has been to load the vehicle with the extra computing power, said Adam Kell, investment partner at Comet Labs, a venture capital firm focused on integrating AI into transportation.
The constant theme among all the speakers was that we need to start planning for the future now. “What we do today,” Chan said, “will have a huge effect on the next 50 years of automated transportation.”
Ryan McCauley was a staff writer for Government Technology magazine from October 2016 through July 2017, and previously served as the publication's editorial assistant.