A Tesla recruiter lost his life in a 2022 crash while reportedly using his electric vehicle's (EV) Full Self-Driving (FSD) feature. The fatal accident occurred on a Colorado mountain road when Hans von Ohain's Tesla Model 3 veered off the road, struck a tree, and caught fire.

Surviving passenger and friend Erik Rossiter told emergency responders that the victim had activated the "auto-drive feature" before the crash. The Washington Post reported that if the incident is officially linked to FSD, it would be Tesla's first fatality connected to the feature. Although von Ohain was legally intoxicated, officials are investigating what role Tesla's software played in the crash.

Tesla Driver-Assistance Features Under Scrutiny

Since 2021, federal officials have recorded over 900 crashes involving Tesla driver-assistance systems, at least 40 of which caused serious or fatal injuries. Most involved Tesla's Autopilot, which is intended for highway use; Full Self-Driving, designed for a broader range of situations, had not previously been linked to any fatalities.


Tesla labels FSD a "beta" product yet touts its importance to road safety, having released it to some 400,000 customers. Tesla's user guides list scenarios in which FSD may not operate as intended, and the EV manufacturer stresses that the driver must remain in control. The reported crashes cast doubt on the safety of autonomous driving and driver-assistance technology, underscoring the need for regulatory attention and a clearer understanding of how this evolving technology behaves on public roads.

Though Tesla has not yet commented on the incident, the company warns that Autopilot, Enhanced Autopilot, and Full Self-Driving Capability are designed for use by a fully attentive driver with hands on the wheel, ready to take control at any moment, according to Business Insider. "While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous," Tesla stated.


Is Tesla Deceiving the Public With FSD?

Hans von Ohain, a Marine veteran and Tesla employee since 2020, believed in the potential of Tesla's Full Self-Driving technology and frequently used the feature. Given Elon Musk's claims about the technology's capabilities, his widow, Nora Bass, expressed concern about the false sense of security FSD provides, per The Post.

Bass highlighted her struggle to secure legal representation because of her husband's intoxication. She nonetheless maintained that Tesla shares responsibility for his death, emphasizing that regardless of how intoxicated Hans was, Musk had claimed the car could drive itself and was essentially better than a human driver. "We were sold a false sense of security," she said.

Tesla's self-driving claims are also under government investigation. Despite Elon Musk's repeated promises about autonomous vehicles, Tesla's record is now marred by FSD-related tragedies. One of several lawsuits against Tesla is led by Nabilah Hussain, a lawyer who alleges that the EV maker advertised a fully self-driving capability that does not exist. "Tesla is marketing and trying to sell you a product that doesn't exist," Hussain said in the report.

Tesla has not officially acknowledged von Ohain's death, raising questions about the company's responsibility to, and communication with, its employees and their families. The incident highlights persistent technical and ethical concerns about integrating autonomous driving technology into everyday use.

