Tesla's Full Self-Driving Beta 8.3 update, which CEO Elon Musk planned to release in April 2021, has been on hold for "significant architectural changes," as reported by Inside EVs on Monday, March 22.
Replying to a query from Whole Mars Catalog on March 20, Musk said that the changes, including a "fundamental improvement to pure vision," made further testing of version 8.x of limited value.
Given significant architectural changes, including fundamental improvements to pure vision, there is limited value to testing 8.x. Hoping to upload V9.0 & button next month.— Elon Musk (@elonmusk) March 20, 2021
Musk added that instead of the Beta version 8.3 update, the EV manufacturer will roll out version 9.0 in its place.
The Beta version 9.0 is slated to be uploaded along with the "download beta button" around April 2021.
Tesla's FSD Beta Expansion Ambition
FSD is an optional software upgrade that allows Tesla vehicles to drive themselves with little to no intervention from their drivers.
Musk initially said on March 7 that the company is adding the "download beta button" function to Tesla vehicles due to the high demand for the FSD Beta.
He also encouraged Tesla vehicle owners who purchased the FSD option to join the development program and let their car drive itself to their heart's content.
On March 13, Musk announced on Twitter his plans to expand FSD Beta to more than 2,000 owners, as well as the revocation of some beta drivers' access for not paying enough attention to the road.
FSD Beta Safety Concerns
Road & Track revealed that the FSD Beta requires split-second intervention and constant human supervision, making it "potentially dangerous."
In the report published on March 19, Road & Track concluded that the FSD Beta program is technologically limited and morally dubious.
Using a Model 3 with FSD Beta 8.2 installed, the Road & Track team observed the EV making "embarrassing mistakes" and engaging in "extremely risky, potentially harmful driving."
These included the EV breaking a number of traffic laws, such as attempting an illegal lane change, crossing a solid line, and suddenly disengaging while attempting a left turn.
The EV also reportedly tried to make a right turn at a red light, showing that the feature requires the driver's intervention to prevent it from doing something illegal and dangerous.
Tesla FSD: Is it safe?
The US National Highway Traffic Safety Administration (NHTSA) has been scrutinizing Tesla's FSD feature since 2019 over road safety concerns.
The latest investigation was prompted by a Tesla vehicle crashing into a tractor-trailer on March 11.
According to Reuters' report on March 16, the NHTSA dispatched a Special Crash Investigation (SCI) team to identify the cause of the accident, which left two people critically injured.
On March 17, Detroit Police Assistant Chief David LeValley said in a press release that the driver admitted he had turned off the Autopilot feature at the time of the crash.
"All the indications we have at this point are that the vehicle was not in Autopilot mode, that the driver was in control of the vehicle at the time of the crash," LeValley said.
This article is owned by Tech Times.
Written by Lee Mercado