Getting Smarter: Deep Learning Driving Intelligent Vehicle Development

Forget “2001: A Space Odyssey” – deep learning soon will enable artificially intelligent computers that will revolutionize automobiles and how we interact with them.

Bob Gritzinger, Editor-in-Chief

October 6, 2017

Gary Silberg, partner and automotive sector analyst at KPMG. Photos by Joe Wilssens

NOVI, MI – Computers already can be programmed to drive cars autonomously, but deep learning will be necessary to enable vehicles capable of understanding all the nuances involved in getting safely from Point A to Point B.

Deep learning, a form of machine intelligence, is where the most advanced development work is taking place, with the goal of a vehicle that can flawlessly determine a course of action in context, says Gary Silberg, partner and automotive sector analyst at KPMG.

“The competitive advantage of the companies going forward is in deep learning,” Silberg says. “It is profound and it is important.”

Silberg, Jason Monroe, manager-Advanced User Experience Design at FCA US, and Ajay Divakaran, technical director-Vision and Learning Laboratory at SRI International, discuss the automotive applications of machine intelligence on a panel moderated by Autoline host John McElroy at the 2017 WardsAuto User Experience Conference.

Silberg says deep learning is capable of guiding a vehicle on snow-covered roads or on lanes without clear markings, but solving tougher “corner cases” where vehicles, people and bicycles all come together will require high-level object identification and decision making.
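
To make that distinction concrete, here is a minimal, hypothetical sketch, in Python, of how labeled detections from a perception stack might feed a simple driving decision in a mixed car-pedestrian-bicycle scene. The Detection fields, thresholds and rules are invented for illustration and are not any automaker's planner:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "car", "pedestrian", "bicycle"
    confidence: float  # detector score, 0.0-1.0
    distance_m: float  # estimated range to the object, in meters

def plan_action(detections: list[Detection]) -> str:
    """Toy decision rule layered on object identification. A real planner
    would reason about trajectories, right-of-way and uncertainty; this
    only shows labeled detections becoming a driving decision."""
    vulnerable = {"pedestrian", "bicycle"}
    for det in detections:
        # Vulnerable road users close to the vehicle always take priority.
        if det.label in vulnerable and det.distance_m < 15.0:
            return "yield"
        # Unsure what the object is? Slow down rather than guess.
        if det.confidence < 0.5:
            return "slow"
    return "proceed"

print(plan_action([Detection("bicycle", 0.82, 9.5),
                   Detection("car", 0.97, 40.0)]))  # -> yield
```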

Human and machine interaction is key to creating smarter machines, says FCA’s Monroe. That collaboration can help users accept machine errors while teaching the machine to understand human preferences.

Artificial intelligence can aid drivers by alerting them to hazards along a regular drive route, recommending coffee spots, suggesting phone numbers of family or friends normally contacted during a drive or “knowing” the exact location to drop someone off instead of simply arriving at a general location, Monroe says.

For instance, in FCA’s Portal concept revealed earlier this year at CES, the vehicle could identify individual users and set seating position and media options to their preferences. A Jeep driven in an urban environment would offer significantly different options from one headed out on an off-road expedition, tailoring its responses and suggestions based on how the vehicle is being used.
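
As a loose illustration of that kind of personalization, the sketch below keys preferences to a recognized occupant and suggestions to driving context. It is hypothetical Python; the profile and suggestion data are invented for the example, not FCA’s system:

```python
# Hypothetical per-user, per-context personalization in the spirit of the
# Portal concept; all identifiers and data here are invented.
PROFILES = {
    "occupant_041": {"seat_position": 7, "playlist": "morning-news"},
    "occupant_077": {"seat_position": 3, "playlist": "road-trip-rock"},
}

SUGGESTIONS_BY_CONTEXT = {
    "urban":    ["parking finder", "coffee stops", "traffic reroute"],
    "off_road": ["trail maps", "tire-pressure check", "campsite lookup"],
}

def configure_cabin(occupant_id: str, context: str) -> dict:
    """Apply a recognized occupant's stored preferences, then tailor
    suggestions to how the vehicle is being used."""
    profile = PROFILES.get(occupant_id,
                           {"seat_position": 5, "playlist": "default"})
    return {**profile, "suggestions": SUGGESTIONS_BY_CONTEXT.get(context, [])}

print(configure_cabin("occupant_041", "urban"))
```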

“We can use artificial intelligence and machine learning to streamline and enhance experiences,” Monroe says.

A human-oriented computer will need to rely on compound gesture recognition, using a camera to see facial expressions while microphones capture vocal tone and nuance, combining the information to better understand human behavior, SRI’s Divakaran says.
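
A rough sketch of that compound, multimodal idea, assuming simple late fusion of a camera-derived score and a microphone-derived score; the weights, thresholds and scores are invented placeholders, not any SRI system:

```python
def fuse_affect(face_score: float, voice_score: float,
                w_face: float = 0.6, w_voice: float = 0.4) -> str:
    """Late fusion: blend per-modality estimates of, say, driver
    frustration (each 0.0-1.0) into a single behavioral read.
    A real system would feed this from trained vision/audio models."""
    combined = w_face * face_score + w_voice * voice_score
    if combined > 0.7:
        return "frustrated"  # e.g. simplify prompts, hold notifications
    if combined < 0.3:
        return "relaxed"
    return "neutral"

# Camera reads a furrowed brow (0.9); voice sounds moderately tense (0.6).
print(fuse_affect(0.9, 0.6))  # 0.78 -> "frustrated"
```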

“The computations should be modeled on human behavior,” he says. “‘I’m trying to perceive as a human would and I’m trying to react as a human would.’ That is how an ideal assistant technology should work.”

While major work remains to be done in artificial intelligence, the possibilities will be much clearer within the next five years, Divakaran says. The panelists agree smart vehicles will provide many constructive benefits to society, but those positives will require learning and modeling appropriate human behavior.

“There are wonderful people in the world and not so wonderful people in the world,” notes Monroe, and teaching a computer to differentiate between the two will be the hard part.

“Humans don’t always behave the way we think they do,” he says.

[email protected] @bobgritzinger

About the Author

Bob Gritzinger

Editor-in-Chief, WardsAuto

Bob Gritzinger is Editor-in-Chief of WardsAuto and also covers Advanced Propulsion & Technology for Wards Intelligence.
