Warning issued over drive-assist functions

China Daily, August 23, 2021

Experts are urging car owners to cut their overreliance on smart vehicles' driving-assist functions after a 31-year-old driver died in a crash in Southeast China's Fujian province when his Nio car failed to detect another vehicle ahead.

A passenger in the Nio car said the vehicle was in pilot mode when it crashed on a highway on Aug. 14. A police investigation is ongoing.

Nio said the pilot mode is a driving-assist feature, not autonomous driving, and that drivers should keep their hands on the steering wheel and their eyes on the road at all times.

Cui Dongshu, secretary-general of the China Passenger Car Association, said drivers should not rely on driving-assist functions.

"The death is heartbreaking and sounds a warning to us. Autonomous driving will not be achieved in one go. We need to educate drivers and take measures to prevent them from becoming over-reliant on such functions," said Cui.

Cui said many car owners are unaware of the potential risks such functions pose, which can lead to serious problems if they over-rely on them.

Many drivers confuse driving-assist functions with autonomous driving, and when the functions fail, the consequences can be fatal.

Cui said carmakers are obligated to inform drivers that they must be ready to take over the vehicle at any time if needed.

Some carmakers are overstating their achievements in driving-assist functions by using words such as "auto" in their driving-assist systems, such as Tesla's Autopilot.

Li Xiang, founder of new energy vehicle startup Li Auto, suggested that carmakers adopt a standard naming strategy when it comes to their driving-assist systems.

For functions at Level 2, words including "auto" should be banned, said Li. The Society of Automotive Engineers International classifies driving automation into six levels, from Level 0, meaning no automation, to Level 5, meaning full automation.

Currently, the only automated vehicles allowed on public roads, outside designated test areas, are Level 2 ones, which are capable of some automation but require drivers to sit behind the wheel, ready to take over at any time.

In the United States, auto safety regulators have started a formal safety probe into Tesla's Autopilot after a series of crashes involving Tesla models and emergency vehicles. The probe covers 765,000 vehicles equipped with Autopilot built since 2014.

The National Highway Traffic Safety Administration said it had reports of 17 injuries and one death in the 11 crashes, including the December 2019 crash of a Tesla Model 3 that left a passenger dead after the vehicle collided with a parked fire truck in Indiana.

Four of the 11 crashes occurred this year, prompting the administration to open a preliminary evaluation of Autopilot in 2014-21 Tesla Models Y, X, S and 3. The vehicles involved were "all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control", the administration said.

The NHTSA, which closed an earlier investigation into Autopilot in 2017 without taking any action, has come under fire for failing to ensure the safety of the system, which handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods.

The National Transportation Safety Board has criticized Tesla's lack of system safeguards for Autopilot and the NHTSA's failure to ensure the safety of Autopilot, according to Reuters.
