Jason Johnson of Harman (Joe Wilssens Photography)
“How do I want an autonomous driver to drive? I don’t want it to drive like my mother-in-law or my mother. I want it to drive like me,” Harman’s Jason Johnson says.

How Will Artificial Intelligence Handle Mom, Mother-in-Law?

At this week’s WardsAuto User Experience Conference, a Harman executive raises a delicate question about whether it makes sense to use artificial intelligence to model and adapt to all driving styles – both conservative and overly aggressive.

NOVI, MI – Exhibiting almost human attributes, artificial intelligence can do some amazing things.

At this week’s WardsAuto User Experience Conference, panelists from Harman and Nuance, two companies employing AI in many areas, laid out a compelling picture of a future in which next-generation cars get us safely to our destination while understanding our preferences, coping with our mood swings and knowing whether we are too distracted to take control of a self-piloted vehicle as a freeway exit approaches.

But Jason Johnson, director-User Experience Design for Harman’s Connected Car division, says artificial intelligence, a vital building block for autonomous vehicles, raises a delicate question about whether it makes sense to model and adapt to all driving styles.

As an example, Johnson refers to his mother, a very conservative and nervous driver, and his mother-in-law, an aggressive driver he describes as “very scary.” He says it’s difficult to drive with either one of them as a passenger.

“If I’m driving really slow and my mother-in-law is in the car, then she yells at me. If I’m driving too aggressively when my mother’s in the car, then she yells at me, too,” Johnson says.

“How do I want an autonomous driver to drive? I don’t want it to drive like my mother-in-law or my mother. I want it to drive like me,” he says. “That’s the deep understanding we can get with artificial intelligence. Understanding the occupant.”

What if the occupant wants to drive dangerously slow or twice the speed limit? Will autonomous vehicles have the fortitude to refuse the occupant’s command? Isn’t the point of autonomous vehicles, ostensibly smarter than humans, to save us from our mistakes by not replicating our bad behavior?

Both Johnson and Adam Emfield, principal user experience manager at voice-control specialist Nuance, say these are tough questions yet to be resolved with regard to public policy and insurance liability.

People looking to autonomous vehicles primarily as a mode of public transportation might be more tolerant of a vehicle that drives conservatively and makes safe decisions for all occupants, Emfield says.

“But if it’s my car, it better do what I want it to. It’s kind of that mindset for a lot of people,” he says. “And I don’t think we have the right answer for how to settle it out yet.”

Specific performance attributes for autonomous vehicles most likely will vary among automakers and brands. “It’s not a one-size-fits-all,” Johnson says. “If it’s an autonomous (Chevrolet) Corvette, maybe it would not be nagging you as much.”

Johnson describes the first time he found himself behind an autonomous vehicle, on Detroit’s US-10, the Lodge freeway. “The posted speed limit is 55 mph (89 km/h). No one drives 55 on the Lodge. But I’m stuck behind this car.”

Despite the legal and social hurdles lying ahead, Johnson says he is confident the technology will move forward. “We have all the technology to enable whichever way we need to go,” he says.

Emfield and Johnson were panelists on a session dedicated to “Creating a Happier, Less Stressful Life with AI in the Cockpit.”

At Nuance, Emfield leads DRIVE Lab, which opened in suburban Detroit one year ago to focus more intently on voice controls and AI as they are applied to next-generation vehicles.

It won’t be long, he says, before a “virtual assistant” can identify if the driver is surprised, angry in traffic, sleepy or relaxed and happy during a leisurely weekend drive in the country.

Part of DRIVE Lab’s mission is to understand these emotions, what causes them and, most importantly, how a vehicle’s virtual assistant can help the driver cope.

He refers to a 2016 study that identifies these as the most prevalent stress inducers, in order: Being stuck in traffic, finding parking spaces, getting lost and feeling the pressure of being on the road.

When digging deeper to find out how drivers want the vehicle to help with these issues, Emfield’s team found a gender split among respondents: Women want a virtual assistant that will help improve their emotional state by putting on their favorite music and by communicating ahead to turn on the house lights and preheat the oven.

Male respondents, on the other hand, said they want a virtual assistant that helps increase their productivity behind the wheel by helping with work tasks or ensuring the route home avoids traffic.

But Emfield (pictured below, left) knows the research still has much to reveal because so many other emotional situations “don’t fit into neat little boxes.” While humans are wired to recognize and respond to these situations, “so far machines, at least a lot of the mainstream ones,” are not, he says.

AI is intended to do some of the thinking for the user – in this case the driver.

“This gets even more powerful if you can think like another human, someone who knows you intimately well, knows your preferences, your quirks, the differences between your mother and mother-in-law as they interact with you,” Emfield says. “We can model your behavior over time and apply what we’ve learned about you.”

Adam Emfield of Nuance, left (Joe Wilssens Photography)

As an example, he says a driver may have programmed a parking deck as a destination, one that closes about the time the vehicle will arrive. An intelligent virtual assistant will offer other parking options instead.

“We make sure that we’re doing the thinking for you and making it so we don’t even show you results that aren’t going to be relevant or practical for you as a user in an individual situation,” Emfield says.
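To make that filtering idea concrete, here is a minimal sketch in Python. It is only an illustration of the behavior Emfield describes, not Nuance’s implementation; the option list, the 30-minute buffer and the walking-distance tiebreaker are all assumptions introduced for the example.

```python
from datetime import datetime, timedelta

# Hypothetical candidate parking options near the destination, each with a
# closing time and a walking distance. Purely illustrative data.
parking_options = [
    {"name": "Main St. deck",  "closes": datetime(2018, 10, 2, 18, 0), "walk_min": 3},
    {"name": "Riverfront lot", "closes": datetime(2018, 10, 2, 23, 0), "walk_min": 8},
    {"name": "Library garage", "closes": datetime(2018, 10, 2, 22, 0), "walk_min": 5},
]

def usable_parking(options, eta, buffer_minutes=30):
    """Drop lots that will be closed (or nearly closed) when the car arrives,
    then suggest the shortest walk first."""
    cutoff = eta + timedelta(minutes=buffer_minutes)
    open_lots = [o for o in options if o["closes"] >= cutoff]
    return sorted(open_lots, key=lambda o: o["walk_min"])

eta = datetime(2018, 10, 2, 17, 45)  # vehicle's estimated arrival time
for lot in usable_parking(parking_options, eta):
    print(f"Suggest {lot['name']} ({lot['walk_min']}-minute walk)")
```

Run against this data, the assistant never mentions the deck that closes at 6 p.m.; only the two lots the driver could actually use are offered.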

He plays a video of the all-new ’19 Mercedes A-Class, which will go on sale in the U.S. early next year with MBUX, the Mercedes-Benz User Experience multimedia system.

Nuance supplies the voice controls, which adapt to the user, employ natural speech recognition, can be prompted with, “Hey Mercedes,” and will respond to simple commands or questions such as, “What’s my current fuel range?” or “Change ambient light to green please,” or “I’m feeling sick. Take me to a hospital.”

The MBUX system even shows a sense of humor. Asked to tell a joke, it responds, “Sorry, my engineers were Germans.”

Harman’s Connected Car division also is focused on the driver’s mindset to ensure he or she is ready and attentive to take the wheel in a dangerous situation or when exiting the freeway.

Johnson refers to the “annoying coffee cup” icon that often appears in instrument clusters today when the driver has been on the road a long time, even if he or she isn’t tired at all.

“The solution is occupant monitoring,” he says. “More specifically, we look at micro-fluctuations in the pupil. From that information, we also understand the context, and we can understand the deeper meaning and understand the user’s mood and then also if they’re drowsy. We can combine that with other sensing and provide more valuable feedback to the driver in each situation.”

Harman refers to this as “neurosense,” a way to provide cognitive insights and understand that a driver should not be notified that windshield wiper fluid needs to be topped off while negotiating a construction zone or during some other stressful time.
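A toy sketch of that prioritization logic, in the spirit of the wiper-fluid example, might look like the following. This is an assumption for illustration only, not Harman’s neurosense code; the workload score and thresholds are invented.

```python
# Illustrative notification gate: hold routine alerts during stressful moments.
LOW, MEDIUM, HIGH = 1, 2, 3

def should_notify(priority, driver_workload, drowsy):
    """Decide whether to surface an alert right now.
    driver_workload: 0.0 (relaxed cruise) to 1.0 (e.g. negotiating a construction zone).
    drowsy: True if occupant monitoring suggests the driver is fatigued."""
    if priority == HIGH:
        return True                    # safety-critical alerts always get through
    if drowsy and priority == LOW:
        return False                   # don't add maintenance nags to a tired driver's load
    return driver_workload < 0.5       # defer routine messages until workload drops

# The wiper-fluid reminder waits out the construction zone...
print(should_notify(LOW, driver_workload=0.9, drowsy=False))   # False
# ...and appears later on an easy stretch of road.
print(should_notify(LOW, driver_workload=0.2, drowsy=False))   # True
```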

Johnson says his team has equipped a BMW test car with infrared cameras as a proof of concept for collecting user data and aggregating it in the cloud.

The “handover” from driver to autonomous vehicle (and vice versa) may appear to be straightforward – either the driver is engaged or isn’t. But Johnson sees a “spectrum of engagement” and refers again to his mother, who is averse to new technology and will be “scared for her life” in an autonomous vehicle.

“I think she will fall on the spectrum where she will be monitoring and making micro-adjustments reflecting that – ‘you’re too close to the side, move over,’” Johnson says.

“She needs to be able to give that feedback to the autonomous vehicle in the same way she gave that feedback to me when she was teaching me how to drive.”
