A warning to all owners of self-driving cars out there! New research shows that "phantom" images, such as road signs flashed on roadside billboards, can confuse a vehicle's autopilot feature. In some cases, hackers could deliberately hijack billboards on main roads, which might lead to accidents or a total loss of control of the car.

Tesla Autopilot gets tricked!

(Photo: Alexander Koerner/Getty Images)
A driver presents a Cruising Chauffeur, a hands free self-driving system designed for motorways during a media event by Continental to showcase new automotive technologies on June 20, 2017 in Hannover, Germany. The company presented new clean diesel technology, cable-less and other advances in electric car charging, smartphone technology for rental cars, driverless car advances and robotic taxi services.

Warnings about using autopilot or self-driving features on your vehicle are nothing new.

Reports cite technology malfunctions, more accidents on main roads, and even the possibility of hackers taking control of vehicles as reasons people are urged not to rely on the feature.

But in 2020, the technology remains prevalent, especially with so many people buying themselves a Tesla.

Because of this, academics from Israel's Ben-Gurion University of the Negev set out to test whether Tesla's Autopilot is actually reliable and safe to use.

Wired reported that the researchers were able to trick Tesla's Autopilot by flashing a few frames of a stop sign for less than half a second on an internet-connected billboard.

They call these "phantom" images: flickering signs on the road that can confuse a self-driving car about when to stop, go, or turn.
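The core weakness the researchers exploited is that a sign visible for even a fraction of a second can be enough to trigger a reaction. As a rough illustration only (this is not Tesla's actual pipeline, and all frame rates and thresholds below are assumptions), the sketch contrasts a naive per-frame detector with a simple temporal-persistence filter that demands a sign stay visible for half a second before acting on it:

```python
# Illustrative sketch (NOT Tesla's real perception stack): why a sign
# flashed for a split second can fool a purely per-frame detector, and
# how a temporal-persistence check could reject such "phantoms".
# The 30 fps camera rate and the 0.5 s threshold are assumptions.

FPS = 30  # assumed camera frame rate

def naive_detector(frames):
    """Reacts as soon as a stop sign appears in any single frame."""
    return any(frames)

def persistence_filter(frames, min_frames):
    """Accepts a sign only if it is seen in `min_frames` consecutive frames."""
    run = best = 0
    for seen in frames:
        run = run + 1 if seen else 0
        best = max(best, run)
    return best >= min_frames

# A "phantom" stop sign flashed on a billboard for ~0.1 s (3 of 60 frames):
phantom = [False] * 30 + [True] * 3 + [False] * 27

# A real roadside stop sign stays in view for a couple of seconds:
real = [True] * 60

print(naive_detector(phantom))                # True: per-frame detector is fooled
print(persistence_filter(phantom, FPS // 2))  # False: 0.5 s rule rejects the flash
print(persistence_filter(real, FPS // 2))     # True: the real sign still passes
```

A persistence rule like this trades a slight reaction delay for robustness against split-second flashes, which is the general direction countermeasures to phantom attacks take.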

Related Article: Watch: Here's How Tesla's Autopilot Works in Heavy Rain

How does it work?

(Photo: Spencer Platt/Getty Images)
The inside of a Tesla vehicle is viewed as it sits parked in a new Tesla showroom and service center in Red Hook, Brooklyn on July 5, 2016 in New York City. The electric car company and its CEO and founder Elon Musk have come under increasing scrutiny following a crash of one of its electric cars while using the controversial autopilot service. Joshua Brown crashed and died in Florida on May 7 in a Tesla car that was operating on autopilot, which means that Brown's hands were not on the steering wheel.

Imagine, for example, a Tesla set to Autopilot, driving smoothly in the middle of the night along a main road lined with flashing billboards.

Some of these huge advertisements feature warnings to drivers, including road signs. Once a self-driving car sees such a sign, it can become confused about which action to take.

Based on the experiment, hackers could also confuse vehicles deliberately by hijacking the billboards themselves.

Once the system gets confused, the driver may completely lose control of the car, leading to traffic jams or, worse, accidents.

Though this is plausible, Tesla clarifies on its website that the Autopilot feature is only "intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment."

This means every driver should still keep their hands on the wheel, with or without the feature engaged.

Also Read: [VIRAL] TikTokers Use Tesla's Autopilot Without Driver While Singing Justin Bieber's 'Baby' in Backseat; Now, They Are Dealing With The Backlash

This article is owned by Tech Times

Written by Jamie Pancho

ⓒ 2021 TECHTIMES.com All rights reserved. Do not reproduce without permission.