A top U.S. computer science expert has warned that humans could be defenseless against robots that are designed to kill and able to think for themselves.

Stuart Russell of the University of California, Berkeley said that such deadly machines could be the final stage of the ongoing technological march toward lethal autonomous weapons systems (LAWS), which have been described as warfare's third revolution after gunpowder and nuclear arms.

In a comment article published in Nature on May 27, Russell said that weapons controlled by artificial intelligence rather than by humans are no longer a distant prospect: they could be practically feasible not in decades but in as little as a few years.

He argued that these machines would be limited not by their intelligence but by physical constraints such as firepower, mobility and speed.

Autonomous weapons systems can select and engage targets without human intervention, and they become deadly when those targets include humans. Such machines may include, for instance, armed quadcopters that can hunt down and eliminate enemy combatants.

LAWS do not include remotely piloted drones or cruise missiles, for which humans still make the targeting decisions.

Artificial intelligence (A.I.) and robotics components that exist today already provide capabilities such as perception, mapping, motor control, navigation and long-term planning; they only need to be combined.

The technology found in self-driving cars, for instance, could be combined with the tactical control capability of DeepMind's DQN system to support search-and-destroy missions.

Humanitarian law currently has no provisions for this kind of technology, and it remains to be seen whether the international community would support a treaty limiting or banning LAWS.

Russell has already called on his peers, scientists specializing in A.I. and robotics, as well as on professional scientific organizations, to take a stand on LAWS, just as biologists did on the use of disease agents in war and physicists did on nuclear weapons.

Russell said that LAWS could violate fundamental principles of human dignity by letting machines choose whom to kill, for example when they are tasked with eliminating anyone who exhibits threatening behavior.

"The potential for LAWS technologies to bleed over into peacetime policing functions is evident to human-rights organizations and drone manufacturers," Russell said.

Photo: Gage Skidmore | Flickr
