Tesla Model Y Track Mode
(Photo: Craventure Media on Unsplash)

A Tesla Model Y allegedly crashed while in "Full Self-Driving" (FSD) beta mode in Brea, California, southeast of Los Angeles, on Nov. 3. This may be the first reported incident involving Tesla's controversial driver-assist feature.

Tesla Model Y Got 'Severe Damage'

According to The Verge, the incident is most likely the first to involve the company's controversial driver-assist feature, which Tesla buyers had long awaited. Luckily, no one was injured in the crash. The vehicle, however, was reportedly "severely damaged" in the aftermath.

The crash was reported directly to the National Highway Traffic Safety Administration (NHTSA), which already has multiple overlapping investigations into Tesla's Autopilot system. The incident report appears to have been filed by the owner of the Tesla Model Y.

NHTSA on Tesla

An NHTSA spokesperson did not immediately respond to The Verge's request for comment. According to the report, the vehicle was in FSD Beta mode while making a left turn. The car went into the wrong lane and was struck by another driver in the adjacent lane.

The car gave an alert halfway through the turn, so the driver tried to turn the wheel to avoid entering the wrong lane, as noted by Futurism. However, the car took control and forced itself into the incorrect lane.

Car Made an 'Unsafe Maneuver'

The car reportedly made an unsafe maneuver that put everyone involved at risk and was severely damaged on the driver's side. Tesla's decision to test its "Full Self-Driving" driver-assistance software with untrained vehicle owners on public roads has attracted considerable criticism and scrutiny.

Throughout the beta program, the company has rolled out, and at times retracted, software updates meant to improve the system and address bugs. A number of video clips uploaded online show Tesla owners using the FSD beta with varying degrees of success.

FSD is Not an Autonomous Driving System

Some clips show the driver-assist system confidently handling complex driving scenarios. Others, however, show the car drifting into the wrong lane or making other serious mistakes.

Despite its name, FSD is not an autonomous driving system. Drivers are still required to remain vigilant, keeping their eyes on the road and their hands on the steering wheel.


Level 2 Automated Driving System

Vehicles with automated driving systems that still require human supervision are classified as Level 2 under the Society of Automotive Engineers' taxonomy.

The United States government has taken a renewed interest in Tesla. Recently, it announced an investigation into incidents in which Tesla cars operating on Autopilot crashed into parked emergency vehicles.


This article is owned by Tech Times

Written by Urian B.

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.