Tesla is in hot water again, raising more safety concerns after a Model S car crash in China was blamed on the car's "self-driving" technology.

The Tesla Model S in question caused a minor accident in Beijing, and the autopilot mode is apparently to blame, at least indirectly. The owner alleges that the sales staff who sold him the car portrayed the autopilot function as "self-driving."

Selling the feature as "self-driving" is a hefty overstatement of its actual capabilities, as the technology requires drivers to stay alert and keep their hands on the wheel at all times. Autopilot is designed to take control of braking and steering in some scenarios, not to take over altogether.

After reviewing the data, Tesla confirmed that the car was indeed in autopilot mode but said the driver should have maintained control of the vehicle. In this case, Tesla says the driver's hands were not detected on the steering wheel.

This is the first known incident of its kind involving a Tesla in China, but it's certainly not the first Tesla car crash. A few months ago, Tesla faced the first fatal accident involving its autopilot technology, drawing a firestorm of criticism over automated driving efforts.

The latest accident was not fatal, but it certainly adds more fuel to the fire. As Reuters reports, 33-year-old Luo Zhen, a tech programmer, was driving to work when he activated the Tesla Model S autopilot function, as he frequently does on highways. Luo filmed the whole thing with a dashboard camera and said the vehicle hit another car parked halfway off the road. The accident did not cause any injuries, but it did scrape both cars.

The Tesla driver did not have his hands on the wheel and therefore did not steer to avoid the collision with the parked car, a Tesla spokesperson told Reuters. However, the driver insists that the technology was promoted to him not as assisted driving but as self-driving, which contradicts Tesla's official description of the feature.

The auto-steer function of the vehicle is not intended to drive the car by itself. On the contrary, it's designed to assist the driver, who must keep their hands on the steering wheel, maintain control at all times and be prepared to take over at any point, should anything go wrong with the auto-steer technology.

Luo did not take these precautions, but he blames the crash on the Tesla autopilot system nonetheless, arguing that Tesla's sales staff strongly pitched the feature as "self-driving."

A closer look into these claims found that four other Tesla drivers in China, not connected to Luo or each other, were under the same impression — that the technology was self-driving. This means that, despite Tesla's efforts to explain that its autopilot mode does not equal fully autonomous driving, its sales staff still sold the technology as "self-driving" rather than as an advanced driver assistance system (ADAS).

According to the people surveyed, the sales staff described the autopilot system as "self-driving" in Chinese, although Tesla typically refrains from using this term in English. Moreover, the salespeople reportedly demonstrated the technology by taking their hands off the wheel, further suggesting that it would be OK to do so.

"They all described it as being able to drive itself," said one Tesla Model S owner from Shanghai.

In that context, Tesla might want to provide more training to its sales staff on exactly what its autopilot mode can and cannot do. Right now, consumers who trust Tesla's sales staff could put their lives in danger by thinking the car is self-driving when, in fact, it can only assist them.
