As a car-mad child of the ’70s, I recall reading about new capabilities and features being deployed in high-end and luxury cars and wondering how and when these features would ever become available to the mass market.
But over the decades, features such as antilock brakes, electronic stability control, cruise control and automatic emergency braking have waterfalled from luxury models into the wider mainstream. This is a great testament to the innovation and cost focus of the automotive industry and its suppliers, especially of electronics systems.
But capabilities such as touchscreen-based entertainment systems, digital instrument clusters with head-up displays, blindspot monitoring, lane-keep assist, night vision systems, adaptive driving beams and high-speed wireless connectivity are still out of many global consumers’ reach today.
Informative and immersive in-vehicle experiences and enhanced driver and road user safety capabilities should be available to a greater consumer base.
With recent advances in computing capability, and the cost and performance benefits of semiconductor integration, we are entering an era where automakers can start thinking about advanced driver assistance systems (ADAS) and the digital cockpit not as optional features, but as an opportunity to move toward Vision Zero – a multinational road-safety approach that seeks to eliminate all traffic fatalities and severe injuries – while also delivering an immersive experience for drivers and passengers.
Bringing immersive, affordable in-vehicle experiences and safety capabilities to all vehicles will be enabled by silicon integration. The consolidation of individual electronic control units (ECUs) driven by evolving system architectures, together with the move toward software-defined functionality in vehicles, is leading to a new in-vehicle computing paradigm, with its attendant functional safety requirements defined by the industry’s ISO 26262 standard.
The traditional infotainment system and instrument cluster are evolving from single entertainment or driver information functions to becoming an integrated and immersive “digital cockpit” capability. Functions such as driver and cabin monitoring, digital rear- and side-view mirrors, vehicle surround view and head-up displays are being subsumed into the digital cockpit. Similarly, driver safety augmentation capabilities such as vision and hazard monitoring, lane keeping, adaptive braking and dynamic acceleration are consolidating into fewer ADAS domain controller ECUs.
These evolving trends have two key implications: First, the computing performance needed for this consolidation of cockpit and safety functions increases tremendously, and second, because these features are increasingly performing critical safety functions or relaying safety-critical information and alerts to the driver, ISO 26262-based safety capability is becoming mandatory for multiple vehicle electronic systems.
Increasing the digital cockpit ECU’s performance by simply deploying a system-on-chip (SoC) with a higher-performance CPU – or a CPU with multiple cores – can address these inexorably growing computing demands, but it’s an incredibly power-inefficient approach (especially as we move to a world of electrified and battery-driven powertrains).
Instead, through the intelligent integration of heterogeneous computing elements such as the CPU, GPU, image signal processor (ISP) and neural processing unit (NPU) in the SoC, individual software and application workloads can be deployed to the compute element best suited to a specific task.
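To make the idea concrete, here is a minimal, purely illustrative sketch of workload-to-element dispatch. The element names mirror the compute blocks mentioned above, but the affinity table, task names and `dispatch` function are hypothetical assumptions for illustration, not any real SoC or vendor API.

```python
from enum import Enum, auto

class Compute(Enum):
    """Heterogeneous compute elements found in a digital cockpit SoC."""
    CPU = auto()   # general-purpose control and system logic
    GPU = auto()   # parallel graphics and rendering workloads
    ISP = auto()   # image signal processing for camera pipelines
    NPU = auto()   # neural-network inference (e.g. driver monitoring)

# Illustrative affinity table mapping each workload to the element
# best suited to process it (hypothetical task names).
AFFINITY = {
    "instrument_cluster_rendering": Compute.GPU,
    "surround_view_camera_pipeline": Compute.ISP,
    "driver_drowsiness_inference": Compute.NPU,
    "vehicle_network_control": Compute.CPU,
}

def dispatch(task: str) -> Compute:
    """Route a workload to its best-suited element, defaulting to the CPU."""
    return AFFINITY.get(task, Compute.CPU)
```

Routing inference work to an NPU rather than burning CPU cycles on it is the power-efficiency argument in miniature: for example, `dispatch("driver_drowsiness_inference")` resolves to `Compute.NPU`, while an unlisted task falls back to `Compute.CPU`.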
Specifically, in the case of the SoC, by supporting sufficient malfunction or failure diagnostic capability at the silicon level, and by following and documenting ISO-based systematic SoC development processes, the path to ASIL B certification at the ECU system level can be eased for solution integrators.
Samsung’s Exynos Auto V9 digital cockpit SoC is a great example of how heterogeneous computing has been enabled with ASIL B safety for next-generation processing: Arm-based multi-core CPUs and multi-cluster GPUs are combined with an NPU to meet the computing demands of both rich in-vehicle experiences and safety functions such as driver monitoring.
In the face of growing consumer demand, an extensive technology ecosystem to support automaker in-vehicle infotainment (IVI) and ADAS development has never been so vital. Developers of future immersive and safe in-vehicle electronic systems are looking for the hardware and software capabilities to bring their vision to life while doing so cost- and power-efficiently and addressing critical industry safety standards. This needs unprecedented levels of partnership and collaboration.
And it’s worth highlighting that while these new capabilities are being deployed in more vehicle models each year, Arm’s foundational semiconductor IP technologies enabling them are developed up to six to seven years in advance to account for the long and stringent development cycles of automotive electronics.
We are doing this by developing our next-generation IP to deliver performance with power efficiency and with safety enablement as a priority. And as these electronic systems become increasingly software-defined, we continue collaborating across the industry to ensure the right developer tools and software ecosystems are in place so these new capabilities can be widely deployed.
Through the strength of industry partnership from both a hardware and software perspective, I am confident we can maximize the benefits of these new vehicle technologies, make them more accessible to mass markets and make greater strides toward Vision Zero.
Chet Babla (above) is vice president-automotive for Arm, a processor design company whose technology powers more than 180 billion chips that are the brains for everything from sensors to smartphones to supercomputers.