YPSILANTI, MI – It’s not often a business sector asks to be subject to federal regulations, but given several high-profile accidents with vehicles using autonomous or semi-autonomous technologies, that day seems to have arrived for autonomous-tech suppliers.
“I think when it comes down to making sure we are making this technology as safe as we can possibly make it, we have to have confidence when you put it out there that it has passed a gauntlet of tests,” Jada Smith, vice president-advanced engineering and external relations for Aptiv, tells media here at an automotive lidar backgrounder.
Smith advocates for smart rule-making, though, noting certification shouldn’t just be a matter of racking up miles on vehicles with autonomous technology; regulators also need to determine, with a certain amount of repeatability, which scenarios those vehicles have successfully accomplished. Suggested scenarios include an unprotected left turn (yielding to oncoming traffic at a green light) or crossing a busy pedestrian crosswalk.
“Those types of things are what we have to make sure we have supreme confidence in before we (offer autonomous vehicles to the masses),” she says.
Efforts have been made toward autonomous-technology regulation – the AV Start Act is in the U.S. Senate and the Self Drive Act is in the U.S. House of Representatives.
The bipartisan bills are stuck in neutral, however. The AV Start Act is being held up by senators concerned about the safety implications of allowing autonomous vehicles beyond those in test fleets on public roads. The Self Drive Act, meanwhile, has been criticized for prohibiting vaguely defined “unreasonable restrictions” at the state and local level, giving automakers or suppliers an opening to sue states or cities they believe are interfering with the rollout of autonomous vehicles.
Kaice Reilly, a lidar analyst at Zenuity, the Autoliv-Volvo joint venture, wants to see standards set for lidar, which uses pulsed laser light and sensors to measure the distance from a vehicle to objects, animals and people and is seen as a key technology enabling self-driving vehicles. He calls the lidar business the “Wild West” right now, with dozens of companies proclaiming specifications that may or may not be trustworthy.
“There are no standards (for lidar) at this point in time,” he says. “I feel like it would be nice to completely understand what it means when somebody says, ‘I can pick up a target at 250 m (820 ft.)’ and have some type of certification that says, ‘This lidar does all the things it says it does.’”
Angus Pacala, CEO and founder of lidar-maker Ouster, agrees with Reilly. He notes that while no company in any industry is crazy about the idea of regulations, because they make it more difficult to build a product, safety and performance standards for lidar are necessary.
“Our products are going on to self-driving cars in a pretty large scale, and it’s totally unregulated. All that’s keeping people safe is … (being able to) rely on the ethical running of the companies that are involved.”
The lack of federal regulations is not the only concern; so are outdated federal and state rules that could stymie the AV sector. For instance, a 1971 New York law requiring a driver to keep one hand on the steering wheel at all times has been cited for Audi’s reluctance to equip the A8 sedan in the U.S. with its Traffic Jam Assist technology. The SAE Level 3 system A8 buyers can opt for in Europe allows hands to be kept off the wheel unless there is an emergency, a step beyond the Level 2 Cadillac Super Cruise and Tesla Autopilot systems in the U.S., which require a driver to be ready to take over steering at all times.
Autonomous-technology safety concerns don’t spring only from poor regulation, but also from exaggerated marketing by manufacturers, say panelists here.
Aptiv’s Smith says marketing is a “big, big aspect” of people’s comfort levels with autonomous technology. She notes the term Autopilot “implies that you can turn your mind off, but that is not a mind-off system. Even a Level 4 system with a safety driver is a mind-on system, because the safety driver is ready to take over at any moment.”
“When it comes to consumers, and consumer acceptance and consumer marketing, we need to make sure we’re educating consumers on the technology, what they can and cannot do,” Smith says.
Tesla vehicles with Autopilot engaged have been involved in several high-profile accidents, some in which the driver of the Tesla was killed.
And in March, a Volvo in Uber’s autonomous test fleet struck and killed a pedestrian in Tempe, AZ. A report last month blamed a software glitch for the car’s failure to stop before hitting the woman.
Jim Schwyn, a chief technical officer for Valeo, maker of the A8’s lidar (claimed to be the first laser scanner in automotive volume production), says achieving high levels of safety in autonomous vehicles will be the result of their components – lidar, radar and cameras – working together and interacting.
“It’s an ongoing optimization process,” he says.
Both Schwyn and Zenuity’s Reilly like the idea of taking baby steps in bringing higher levels of autonomous technology to market.
Reilly advocates for using very short-range lidar, with detection ranges of 5 cm to 30 cm (2 ins. to 12 ins.), at very low speeds.
“You want to have it on a vehicle, but there’s a lot of lessons we have to learn about how to make the perception software work correctly,” he says.
“So do we try a fully autonomous vehicle at 40 mph (64 km/h), or do we try it at 15 (24 km/h)?” Reilly continues. “I vote for 15, because the consequences should be less. The risk is less. So I would say let’s get people used to these systems at lower speeds, lower risks, and build from there.”