As Google continues to test its driverless cars, it has run into an unusual problem: its vehicles are too safe, and perhaps play by the rules too much.

A New York Times report Tuesday (September 1) illustrated how Google's self-driving cars may be programmed to be too strict a stickler for the rules: in one test, a car slowed as it approached a crosswalk to let a pedestrian cross and was rear-ended by a human-driven sedan. Since 2009, Google's cars have been involved in 16 accidents, mostly fender-benders, and the tech company blames human drivers, not its autonomous vehicles, as the culprits.

During testing with Times journalists, Google's driverless car demonstrated how it can err on the overly cautious side. The Times reports that, in one instance, the car swerved sharply in a residential neighborhood to avoid a poorly parked car because its sensors couldn't tell whether that car might pull into traffic.

In another instance, the car was approaching a red light in moderate traffic when it sensed a vehicle coming from the other direction going over the speed limit. The Google car jerked sharply to the right to avoid a collision, when in reality the other driver was merely speeding a little, as human drivers often do, and still stopped in time.

Being overly safe is not only a Google concern. The Times report says excessive caution is an issue across the autonomous vehicle field, as automakers grapple with how to integrate these cars onto roads where human drivers regularly go against the grain of a straightforward driver's ed manual.

"The real problem is that the car is too safe," Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles, told the Times. "They have to learn to be aggressive in the right amount, and the right amount depends on the culture."

The concern is that Google's cars, as well as other autonomous vehicles, are so cautious that their evasive maneuvers won't align with what human drivers are doing on the road, creating confusion and even accidents.

"It's always going to follow the rules," Tom Supple, a Google safety driver, told the Times during a recent test drive. "I mean, almost to a point where human drivers who get in the car and are like 'Why is the car doing that?'"
