At the end of 2021, Dan Loop, vice-president and general manager of Automotive Edge Processing at NXP Semiconductors, wrote an article in All About Circuits entitled “Understanding ADAS Limitations and V2X Communication Technology”.
He claims that ADAS in general increase road safety for the driver. However, he finds that they are mostly isolated from their surroundings, largely because they don’t communicate with other vehicles, roadways, or other kinds of smart city infrastructure. He adds: “Overall, the main limitations of driver-assistance systems are their cost and limited availability beyond higher-end models.” ADAS is costly, and so entry-level vehicles don’t tend to have it because carmakers have an eye on remaining competitive. Yet there is a need for ADAS to proliferate, because ADAS, and indeed V2X communication, work better when there are more participants. The challenge for OEMs is therefore to develop and offer more cost-effective ADAS to the lower end of the market “to increase the safety and comfort of all road users.”
In response to some questions from TU-Automotive about the limitations of ADAS, Vehicle-to-Vehicle (V2V), and Vehicle-to-Everything (V2X) communications, Loop commented: “Today, V2X systems have limited interactions with ADAS systems. ADAS relies on vision and radar/lidar and ultrasonics to determine obstacles and path planning, while V2X relies on messages sent from other entities, including vehicles or infrastructure. This means the V2X system is essentially limited by a network effect that relies on the adoption of the technology to reach a critical mass in order to become useful. ADAS is limited by the cost of implementation of the technologies in vehicles.”
Uptake of ADAS
Matthew Avery, chief research strategy officer at Thatcham Research, nevertheless comments that his organization has been very impressed with the uptake of ADAS over the last decade. “We have got to the point where it used to be a high-end option but now it’s becoming a standard, even on mass market vehicles like Fiestas and Clios.” He explains that this shift is being driven by Euro NCAP’s five-star requirements: “ADAS fitment to cars and vans will also be mandated by the European General Safety Regulation 2 rules. The benefits of ADAS in terms of crash-avoidance and Vulnerable Road User protection are clear. However, ADAS is not without its limitations.”
These weaknesses include the fact that radars can often struggle with identifying objects. “If there is too much reflectivity, there is no recognizable radar return and the system can’t identify what it is,” he reveals. So, for example, if a car is hidden behind a metal pillar, or if a wheelie bin is obscuring a parked car, the system will find it difficult to ‘see’ that vehicle.
He adds: “Cameras have limitations too; they can be susceptible to ‘atmospheric distortion’, in other words, if the driver can’t see clearly, nor can the system.” This issue is often caused by bad weather such as fog, snow, mist, or heavy rain. Automotive manufacturers, he says, have been finding ways to work around this issue by combining sensor types: the camera identifies the object, while the radar works out where that object is in order to calculate the vehicle’s path to avoid it.
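The camera-plus-radar workaround Avery describes can be sketched in a few lines: the camera supplies what an object is, the radar supplies where it is, and the two are paired by bearing. This is a minimal illustration, not any manufacturer’s actual fusion logic; the detection formats and the matching threshold are assumptions.

```python
# Minimal sketch of camera/radar fusion: the camera classifies objects,
# the radar localizes them. Detections are paired by nearest bearing.
# Formats and the 3-degree matching threshold are illustrative assumptions.

def fuse(camera_detections, radar_returns, max_bearing_diff_deg=3.0):
    """Pair each camera detection (label, bearing_deg) with the nearest
    radar return (bearing_deg, range_m) to get a labelled position."""
    fused = []
    for label, cam_bearing in camera_detections:
        best = min(radar_returns,
                   key=lambda r: abs(r[0] - cam_bearing),
                   default=None)
        if best and abs(best[0] - cam_bearing) <= max_bearing_diff_deg:
            fused.append((label, best[1], best[0]))  # (label, range_m, bearing_deg)
    return fused

cams = [("pedestrian", -10.0), ("car", 5.0)]
radar = [(-9.2, 18.0), (4.5, 42.0), (30.0, 60.0)]
print(fuse(cams, radar))  # each camera label now carries a radar position
```

The design choice mirrors the article’s point: neither sensor alone yields a labelled, localized obstacle, but the combination does.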
Avery also points out that: “New sensors are being introduced for automated driving, such as Lidar sensors which are quite expensive, but very accurate in terms of Near Field Discrimination. At close quarters they can accurately tell where objects are.”
So, how concerning are the limitations and what is being done to resolve them? Loop comments: “Legislation in multiple regions is driving V2X adoption. For ADAS, cost pressure and innovation are taking cost out of the systems and some OEMs are narrowing their focus to use only specific technologies for their ADAS platforms. With automation, you are going to need all three sensors and, even then, that combination of sensors can still struggle in conditions affected by snow or fog, especially when it comes to identifying white lines.”
The sensors are both costly to fit and even costlier to repair. “Manufacturers do try to reduce the costs of these systems. However, in some cases this can impair efficiency,” he says. Much of this could be improved if V2V and V2X had common standards; at present, they don’t. Another issue is network latency, which means the systems don’t gain “a very accurate understanding of where objects are in relation to each other.” He nevertheless thinks that a combination of these sensors will improve and enable automated driving in the future.
Sensors: “constantly improving”
In fact, Avery says the sensors are constantly improving. They include solid-state high-resolution radar sensors, and new camera units that combine three cameras: a wide-angle optic to see what’s next to the car and a telephoto optic to look at long distances in front of it. He elaborates: “The distance the sensor can see restricts the speed of the vehicle, as the faster your vehicle, the longer the range required to be able to react in time. So, we are starting to see radars with longer ranges and better resolutions.
“More complicated sensors will inevitably cost more. The solution is to combine sensors efficiently. Additionally, the issue of repair still needs to be addressed. For instance, we want self-aligning sensors that don’t require unnecessary and expensive calibration, and we’d like sensors placed in areas where they are not susceptible to parking knocks and low-speed shunts.”
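Avery’s point that sensor range restricts vehicle speed follows from the standard reaction-plus-braking-distance model: the sensor must see at least as far as the vehicle travels while the system reacts and then brakes to a stop. The sketch below uses illustrative values for reaction time and deceleration; real systems are validated against their own figures.

```python
# Sketch: minimum sensor detection range needed to stop in time at a
# given speed, using reaction distance + braking distance.
# The 1.5 s reaction time and 6 m/s^2 deceleration are assumptions.

def required_sensor_range_m(speed_kmh: float,
                            reaction_time_s: float = 1.5,
                            deceleration_ms2: float = 6.0) -> float:
    """Minimum detection range (metres) to react and brake to a stop."""
    v = speed_kmh / 3.6                      # km/h -> m/s
    reaction_distance = v * reaction_time_s  # distance covered before braking
    braking_distance = v ** 2 / (2 * deceleration_ms2)
    return reaction_distance + braking_distance

for speed in (50, 100, 130):
    print(f"{speed} km/h -> {required_sensor_range_m(speed):.0f} m")
```

Because braking distance grows with the square of speed, doubling the speed far more than doubles the range the sensor must cover, which is why highway-speed automation pushes toward longer-range radar.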
To what extent are the sensors and systems really “mostly isolated from their surroundings,” and why does this pose an issue? Well, there is an issue with ground-based sensors: they need line of sight to pick up information, and they need to communicate with the traffic using a common protocol and communications language.
Avery reveals that sensors placed at the roadside are more limited than sensors placed high up on gantries, and that the most complete solutions are cloud-based. “You have to ensure that you have a good signal, which is critical for ADAS and automated driving,” he says. To ensure good communication with V2X, all vehicles on the highway have to speak a common language, and any communications blackspots have to be removed.
He also asks: “With V2X, can all cars speak in a common language? Can the sensor communicate with all vehicles, or are there blackspots? How timely is the signal and can the vehicle react to the message?” Furthermore, vehicles that are close together will find that the information they need to be safe on the road often arrives too late. When that happens, the sensors and the systems simply can’t do their job.
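The timeliness problem raised here can be quantified roughly: by the time a broadcast position arrives, the sender has moved on, so the stale-position error is simply closing speed multiplied by latency. The speeds and latency below are illustrative assumptions, not measured V2X figures.

```python
# Sketch: positional error introduced by network latency in V2X.
# A vehicle's broadcast position is stale by the time it is received;
# the error equals closing speed x latency. Values are assumptions.

def stale_position_error_m(closing_speed_kmh: float, latency_ms: float) -> float:
    """Distance (metres) travelled between sending and receiving a message."""
    return (closing_speed_kmh / 3.6) * (latency_ms / 1000.0)

# Two vehicles approaching head-on at a combined 200 km/h, 100 ms latency:
print(f"{stale_position_error_m(200, 100):.1f} m")
```

At urban speeds and low latencies the error is centimetres; at highway closing speeds it reaches several metres, which is why close-together vehicles may receive safety-critical information too late to act on it.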
How will ADAS, as well as V2V and V2X communications, improve over the next five years, and to what extent will the limitations be resolved? Loop believes that once critical mass with V2X has been achieved, the data will become useful enough to be fused with other ADAS data. This will impact other ADAS technologies by reducing their requirements, because the position, the speed and the distance of all vehicles and infrastructure around them will be known. However, safety could be jeopardized when one vehicle is not broadcasting messages, creating a data gap and requiring ADAS to recover by using other technologies.
Avery concludes: “Work is ongoing at the UN and at the EU on common V2V communications standards and protocols. One of the issues is that the car and truck manufacturers have two different systems, creating a need for harmonization and common standards for these systems to work. ADAS sensors are improving over time, becoming more complicated and capable.” He also believes that it’s important to reach critical mass – perhaps over the next 5 to 10 years. Mass market adoption will create economies of scale, reducing the price point of the technology to make it more accessible.