Between now and the day autonomous vehicles are deemed safe enough to operate on the world’s highways, all of the sophisticated technologies being used in concert to “drive” them will be tested rigorously and improvements will be incorporated.
When the fully actualized robo-car gets the green light, the initial application appears to be ticketed for mobility services – a term that covers everything from ride-hailing companies such as Uber and Lyft to traditional taxis and buses.
When these various fleets are put into motion, imagine the world of opportunity that opens up for people who otherwise wouldn’t be able to get around, such as the elderly and physically impaired. In America alone, NHTSA says 53 million people have some form of disability. Imagine how the liberation of millions of people who previously couldn’t get around could impact the economy in a positive manner.
This also will open up myriad possibilities for what happens inside the vehicles, especially in connection with ride-hailing operations. Passengers can elect to be productive, entertained or even to zone out. The environment of the cabin can be used as an office, a library or a living room. But to achieve those options, there has to be an upgraded interface between the car and the passengers inside.
As a rudimentary example, this type of sensing currently is provided by the taxi driver. We sometimes forget that, in addition to providing an efficient, safe trip, the driver is collecting and processing “data” and making constant decisions regarding who should be picked up and even who should be kicked out for unruly behavior. The taxi driver also is sensitive to whether objects are left behind in the vehicle, whether passengers properly and completely disembark and whether the cabin is clean enough and ready for the next customer. For autonomous vehicles, sensing technology will provide similar observations and decisions based on what is happening inside the vehicle.
As we all know, the ability of a vehicle to be driven via technology instead of a human being depends on the collection and processing of data at speeds we cannot even comprehend. This data then is fed just as quickly into the algorithms that generate the actions or “decisions” that control the vehicle. The same concept applies to sensors inside a cabin.
Safety, Creature Comforts and Monetization
As such, there are three aspects that in-cabin sensors must address. The first, obviously, is safety. The industry always will be responsible for the safety of passengers, and sensor applications already exist that are connected to seatbelts, airbags and so on. The second is creature comforts: How comfortable are people inside the vehicle? Are the conditions warm enough or cool enough? Is there enough light to allow them to do whatever they want to do? Sensors will be part of the customization of the user experience. The third is monetization: How can ride-sharing companies and manufacturers make money from direct services provided to a captive audience (with consent, of course)?
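The three roles described above can be thought of as a simple rules layer sitting on top of the raw sensor readings. The sketch below is purely illustrative: the signal names, thresholds and actions are assumptions invented for this example, not the behavior of any real in-cabin product.

```python
# Illustrative sketch: mapping hypothetical in-cabin sensor readings to the
# three roles discussed above (safety, creature comforts, monetization).
# All signal names, thresholds and actions are invented for illustration.
from enum import Enum


class Role(Enum):
    SAFETY = "safety"
    COMFORT = "comfort"
    MONETIZATION = "monetization"


# Each rule: (signal name, predicate on the reading, role, resulting action).
RULES = [
    ("seatbelt_fastened", lambda v: v is False,
     Role.SAFETY, "remind passenger to buckle up"),
    ("cabin_temp_c", lambda v: v < 18 or v > 26,
     Role.COMFORT, "adjust climate control"),
    ("ambient_light_lux", lambda v: v < 100,
     Role.COMFORT, "raise reading light"),
    ("consented_to_offers", lambda v: v is True,
     Role.MONETIZATION, "show in-ride content"),
]


def evaluate(readings: dict) -> list:
    """Return (role, action) pairs for every rule its reading triggers."""
    actions = []
    for signal, predicate, role, action in RULES:
        if signal in readings and predicate(readings[signal]):
            actions.append((role.value, action))
    return actions


print(evaluate({"seatbelt_fastened": False, "cabin_temp_c": 16}))
# → [('safety', 'remind passenger to buckle up'),
#    ('comfort', 'adjust climate control')]
```

The point of the sketch is the separation of concerns: safety rules always run, while comfort and monetization rules only act on signals that are present (and, for monetization, explicitly consented to).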
The fact is, the use of anonymized data by third parties is going to be crucial to the monetization of the adjunct services that can be provided either by ride-sharing firms or by the manufacturers of autonomous vehicles. Some sort of “personalization” of these services probably would even be expected, especially if passengers elect for entertainment as part of their ride-sharing experience. I even suspect the younger generations, nurtured by iPhones and elaborate video games, will find data privacy almost trivial by the time driverless cars are a functioning economic reality.
But none of this removes the responsibility of any company, car maker or supplier in the autonomous-vehicle industry to be very, very respectful of user data. Nobody wants data to be compromised or used for purposes the user has not agreed to. The expectation should be that data not absolutely required to provide value to the passenger will not be used for other purposes. It goes without saying the proper safeguards should be part of the debate and written into any legislation that ultimately regulates autonomous vehicles.
In-cabin sensors should be designed to collect the minimum amount of data needed for each feature they support, and that data should be collected and delivered as anonymously as possible. Suppliers can build in flexibility by developing a modular platform, which would let a user opt out of data collection for any in-cabin sensing feature they find undesirable without affecting the other features they, or other end users, find valuable.
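The two principles in that paragraph, per-feature data minimization and per-feature opt-out, can be sketched as a small registration layer. Everything here is a hypothetical illustration: the class names, feature names and frame fields are assumptions made for the example, not any vendor's actual API.

```python
# Hypothetical sketch of a modular in-cabin sensing platform with
# per-feature data minimization and opt-out. All names are illustrative.
from dataclasses import dataclass
from typing import Any, Callable, Dict, Tuple


@dataclass
class SensorFeature:
    name: str
    # Each feature declares the minimum fields it needs from the raw frame.
    required_fields: Tuple[str, ...]
    handler: Callable[[Dict[str, Any]], Any]


class CabinPlatform:
    def __init__(self) -> None:
        self.features: Dict[str, SensorFeature] = {}
        self.opted_out: set = set()

    def register(self, feature: SensorFeature) -> None:
        self.features[feature.name] = feature

    def opt_out(self, feature_name: str) -> None:
        # Disabling one feature leaves every other feature untouched.
        self.opted_out.add(feature_name)

    def process(self, raw_frame: Dict[str, Any]) -> Dict[str, Any]:
        results = {}
        for name, feature in self.features.items():
            if name in self.opted_out:
                continue
            # Data minimization: each handler sees only the fields it declared,
            # never the full raw frame.
            view = {k: raw_frame[k]
                    for k in feature.required_fields if k in raw_frame}
            results[name] = feature.handler(view)
        return results


platform = CabinPlatform()
platform.register(SensorFeature(
    "seatbelt_alert", ("seatbelt_fastened",),
    lambda d: "alert" if not d.get("seatbelt_fastened", True) else "ok"))
platform.register(SensorFeature(
    "left_object", ("objects_detected",),
    lambda d: bool(d.get("objects_detected"))))

platform.opt_out("left_object")
frame = {"seatbelt_fastened": False,
         "objects_detected": ["phone"],
         "face_image": "..."}
print(platform.process(frame))  # → {'seatbelt_alert': 'alert'}
```

Note that the `face_image` field in the frame never reaches any handler, because no registered feature declared it, and opting out of `left_object` has no effect on `seatbelt_alert`.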
A lot of what will ultimately happen inside the cabins of autonomous vehicles is still to be decided. Some automakers are more proactive, some are reactive, but none of them is indifferent. They are open and curious regarding how sensor technology can protect drivers and passengers by constantly scanning and tracking occupants and objects anywhere in the vehicle. Sensors can provide a kind of virtual model of what’s happening inside the vehicle, and a proper understanding of this reality can be used to benefit manufacturers. The basics already are there and are being used in today’s human-operated automobiles. But we are entering a new era of mobility in which in-cabin sensors, and more importantly in-cabin intelligence, will be able to support future advances in autonomous vehicles.
Gil Dotan is CEO and co-founder of Guardian Optical Technologies, which manufactures advanced sensor technology combining 2D, 3D and motion analysis in a single unit, enabling vehicles to become “passenger aware.”