Ability to see like humans one of driverless cars’ biggest challenges

Oz-China Research Partnership to Help Vehicles ‘See’

The Australian government’s Commonwealth Scientific and Industrial Research Organization’s Data61 and Chinese technology company ZongMu will work to prevent vehicle-pedestrian collisions by equipping vehicles with computer vision.

Australian researchers say while the auto industry is seeing huge technological advances that will bring driverless cars closer to reality, there is still much to refine before people can ride in a car with no steering wheel at all.

A new partnership between the government’s Commonwealth Scientific and Industrial Research Organization’s Data61 and ZongMu Technology, a Chinese self-driving company, aims to close that gap. They will work to solve the problem of avoiding vehicle-pedestrian collisions by equipping vehicles with computer vision.

The CSIRO’s Data61 – Australia’s largest data-innovation group – says giving autonomous vehicles “human” sight is one of the big technical challenges.

Most self-driving cars use laser sensors, and while lasers are good for detecting objects and their distance from the vehicle, they can’t read and follow street signs or distinguish between the road and a sidewalk.

Data61 and ZongMu are working to give vehicles computer vision, allowing a machine to see and understand its environment the way humans do and react to hazards.

The Smart Vision Systems Group at Data61, led by Nick Barnes, will work with ZongMu to develop algorithms to estimate the space between objects according to the vehicle’s motion and predict the potential hazards of moving objects.
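The article does not publish the partnership's algorithms, but the idea of predicting hazards from object distance and vehicle motion can be illustrated with a common building block, time-to-collision. The function below is a toy sketch under that assumption, not Data61 or ZongMu's actual method:

```python
def time_to_collision(distance_m: float, ego_speed_mps: float,
                      object_speed_mps: float = 0.0) -> float:
    """Seconds until the vehicle reaches the object, or infinity if the gap is opening.

    Hypothetical illustration: closing speed is the vehicle's speed minus the
    object's speed along the same line of travel.
    """
    closing_speed = ego_speed_mps - object_speed_mps
    if closing_speed <= 0:
        return float("inf")  # object is moving away or matching speed
    return distance_m / closing_speed

# A pedestrian 10 m ahead, vehicle travelling at 5 m/s (18 km/h):
print(time_to_collision(10.0, 5.0))  # 2.0 seconds to react
```

A real system would estimate distance and relative velocity per object from the camera feed over successive frames; the division above is only the final step.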

The market for self-driving vehicles is expected to jump from $42 billion in 2025 to nearly $77 billion by 2035 as more organizations compete to develop a truly autonomous Level 5 vehicle – a car that can handle all tasks and drive anywhere.

“Computer vision is the technology that allows autonomous vehicles to determine the difference between what is pavement and what is a drivable road,” Barnes says.

“Unlike laser sensors which rely on a series of points to identify hazards, computer vision offers richer information and a deeper understanding of road scenes through 3D image analysis, enabling safer automated driving.”
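One simple way 3D analysis can separate a drivable road from pavement, as Barnes describes, is the small height step at a curb. The sketch below assumes a grid of estimated surface heights (in metres) recovered from 3D image analysis and a typical curb height of about 10 cm; both the grid format and the threshold are illustrative assumptions, not details from the article:

```python
CURB_HEIGHT_M = 0.10  # assumed typical curb step; not from the article

def drivable_mask(height_grid, road_level=0.0, curb=CURB_HEIGHT_M):
    """Return a parallel grid of booleans: True where a cell is at road level,
    False where it sits a curb-height step above it (e.g. a sidewalk)."""
    return [[(h - road_level) < curb for h in row] for row in height_grid]

heights = [
    [0.00, 0.01, 0.12],  # road, road, sidewalk (raised by a curb)
    [0.00, 0.02, 0.13],
]
print(drivable_mask(heights))
# [[True, True, False], [True, True, False]]
```

This is the kind of dense, per-location judgment a sparse set of laser points cannot easily make on its own.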

Data61 senior research scientist Shaodi You says the technology would allow autonomous vehicles to quickly react to any hazards at a distance of 32.8 ft. (10 m) or further to avoid collisions.
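Whether reacting at 10 m is enough to avoid a collision depends on speed. As a back-of-envelope check, not from the article, stopping distance is roughly v²/(2a); the deceleration figure of 7 m/s² below is an assumed dry-road braking rate:

```python
def stopping_distance_m(speed_mps: float, decel_mps2: float = 7.0) -> float:
    """Approximate braking distance: v^2 / (2a). The 7 m/s^2 default is an
    assumed dry-road deceleration, for illustration only."""
    return speed_mps ** 2 / (2 * decel_mps2)

for kmh in (30, 50):
    v = kmh / 3.6  # convert km/h to m/s
    d = stopping_distance_m(v)
    print(f"{kmh} km/h -> {d:.1f} m ({'within' if d <= 10 else 'beyond'} 10 m)")
```

At urban speeds around 30 km/h the braking distance falls comfortably inside 10 m; at 50 km/h it does not, which is why earlier detection at greater distances still matters.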

“Our technology will allow self-driving cars to more quickly detect and avoid hazards, understand and obey road rules and to determine their exact location in relation to other moving vehicles and landmarks in a given environment,” You says.

“The laser sensors used by the majority of companies are prohibitively expensive. On the other hand, the computer vision algorithms we’re developing with ZongMu cost one-tenth the amount and will allow commercial and truly autonomous cars to reach the road in a much shorter time frame.”

Shanghai-based ZongMu produces advanced driving-assistance-systems technology used in vehicles to enhance driver and road safety.

ZongMu CEO Tang Rui says the company is bringing cutting-edge AI-based algorithms into automotive-grade computing platforms to make self-driving cars a commercial reality.

“Our self-driving technology is already being used by China’s leading car makers, but Data61’s expertise in computer vision will be imperative to our goal of bringing self-driving cars to market,” Tang says.

Data61 will work with ZongMu from research through to development with the final product available to the company’s customers in China and internationally, including original equipment manufacturers and partners in the mobility service industry.
