Experts from AAA, AEye, Amazon Web Services, Honda and Toyota provide insight into the path to Level 4 autonomy during a CAR MBS session titled “Tapping Intelligence to Leap from ADAS to Autonomous.”
Panelists included: Reuben Sarkar, CEO, American Center for Mobility; Greg Brannon, Director-Automotive Engineering and Industry Relations, AAA National; Vijitha Chekuri, Autonomous Business Development Leader, Amazon Web Services; Nick Sitarski, Vice President-Integrated Vehicle Systems Div., Toyota Motor North America R&D; Indu Vijayan, Director-Product Management of Automotive, AEye; and Ehsan Moradi Pari, Lead Scientist, Honda Research Institute USA.
Q: Is moving to higher levels of automation an evolution, or a series of distinct steps?
The panel consensus is that moving from ADAS to Level 4 is a jump rather than an evolution because the two technologies do not use the same sensors, systems or AI, and the levels of complexity and liability are quite different. Therefore, development in most companies is along parallel paths, with separate teams that ideally develop alongside each other and share insights that can benefit each other.
Sitarski: “It is a distinct step due to removing the human from the loop. Level 4 systems will first be seen in commercial fleets due to the extremely high cost of sensors and software. There needs to be a way to amortize the cost, and this means commercial fleets as opposed to consumer vehicles.”
Vijayan: “In an L2 system the best sensor system is the human. Therefore, going from ADAS to Level 4 is a major step in development: trying to replace the human senses and decision-making with sensors and AI. This is a big jump.”
Brannon: “I’m not sure they move along a continuous development path. Based on testing by AAA, lane keeping and lane centering are where ADAS systems struggle the most, especially when they rely on cameras and lane markings. It’s a big step from improving lane keeping to becoming fully autonomous.”
Q: What is missing from ADAS in terms of sensors and compute compared to Level 4?
The panel agrees that L4 systems have much higher requirements in terms of edge computing, data usage, number of sensors and sensor capabilities, as well as cost.
Vijayan: “To be most effective and ensure the highest level of safety, sensors need to have edge computing capabilities and intelligence to look far ahead when driving on highways. Then they need to be able to sense all around in closer proximity while in urban settings.”
Pari: “Honda is in a unique position with multiple types of vehicles, including motorcycles, which have less compute than cars. Edge computing can improve safety and help reduce crashes and fatalities for both cars and motorcycles.”
Sitarski: “Level 2 is a mass market that is meant to provide convenience that the average consumer can afford. When you get into more sophisticated Level 4 systems where you use more sensors and compute to replace the human, that will first happen in commercial fleets.”
Chekuri: “While ADAS produces a lot of data, we see the scale of data usage increasing dramatically for customers with Level 4 technologies. The speed of development, testing, validation and delivery to the market is also quite rapid.”
Q: With the widespread deployment of ADAS technologies, are you getting the data that you need?
Sitarski: “At Toyota, the privacy of our customer data is critical and we need to be very careful how we use their data. Customer privacy is number one.”
Pari: “We use both road data and synthetic data. Synthetic data is very useful, but real data has more attributes and features regarding human driving behavior.”
Q: With vehicles becoming more hardware- and software-focused, and software being released over time, what can OEMs do to make it more robust?
Sitarski: “Years ago, vehicles contained many discrete systems that were intertwined. When managing that across different suppliers, it is important that the suppliers work as a team so the systems work well together.”
Vijayan: “There is a need for collaboration in developing these technologies. Teams need to work together to ensure robustness.”
Chekuri: “It is important to initially separate software and hardware development and testing to reduce costs, improve quality and speed up development.”
Q: What’s it going to take to get consumers more comfortable with the technology?
Brannon: “Communication is the key for consumers to understand what is happening with OTA updates and how they affect their vehicles. But consumers don’t necessarily care or want to know what is happening. Therefore, the communication needs to happen in multiple ways.”
Sitarski: “The industry needs to be transparent and honest about what these systems are capable of and not capable of. This will improve confidence and safety.”
Pari: “The bridge from L2 to L4 needs to be a step-by-step approach for consumers to get used to the technology. Training is important so drivers know the capabilities and limitations of the systems. This will create more confidence and safer operation.”
Q: What about skipping Level 3?
Most panelists agree that Level 3 is problematic, due to driver inattention, the length of time to regain focus on driving (up to 25 seconds, according to Brannon) and the critical element of liability when something goes wrong.
Q: What is the difference between urban and rural autonomy?
Panelists say diversity is the main difference, with urban areas presenting a much more diverse and chaotic environment. Therefore, training AI to handle the diversity in urban areas is the challenge.
- Moving from ADAS to Level 4 autonomy is a major step forward rather than an evolutionary path.
- The sensors, AI and edge computing required for Level 4 are a jump from ADAS because the human is being replaced.
- Development, simulation, testing and verification for ADAS and for Level 4 are handled by parallel teams in most companies.
- Early Level 4 applications will likely be for commercial fleets due to expensive sensors and software.
- Communication is key in getting consumers more comfortable with autonomous technologies.