Robots to shape wars of the future

A soldier with the 25th Infantry Division prepares to launch a Raven unmanned aerial vehicle in Paktika province, Afghanistan.

(MILITARY TIMES) Robots may one day be more effective than human soldiers on the battlefield, and they may have a sense of ethics, even a sense of guilt, says a robotics expert who has conducted a study with the support of the Army’s research office.

Ethical robots that can use lethal force on the battlefield would adhere to international law and the rules of engagement, Ronald C. Arkin of the Mobile Robot Laboratory at the Georgia Institute of Technology told Army Times on Dec. 15. Arkin describes how this could work in his 2009 book “Governing Lethal Behavior in Autonomous Robots.”

Here’s what the future of robots may hold: human soldiers may eventually be unable to keep up with “humane-oids” in the battle space, Arkin says. Future developments may produce robotic sensors better equipped than soldiers to maintain situational awareness and to quickly process information about situations in which lethal force might be used.

Mr. Spock can relate to certain advantages of having robots in the combat zone: they are free of human emotion, which can distort judgment, and they don’t express anger or frustration. But Arkin envisions robots designed with some constructive capabilities, if not exactly feelings: remorse, compassion and, yes, guilt.

According to a research paper co-written by Arkin and colleague Patrick Ulam, robots designed to feel guilt would operate this way: each robot carries an “ethical adaptor,” each weapon system it carries is grouped according to its destructive power, and each group of weapons is associated with a specific guilt threshold.

Highly destructive weapons would have lower guilt thresholds than less destructive ones. When the guilt level exceeds a weapon’s threshold, that weapon system is deactivated, with the intent of reducing collateral damage. Arkin’s example: an unmanned aerial vehicle carries three weapon systems, GBU precision-guided bombs, Hellfire missiles and a chain gun. The UAV engages the enemy with a GBU and finds the attack resulted in civilian casualties.

The ethical adaptor then determines that the guilt level should be raised, and the GBU is deactivated. Next, the UAV fires a Hellfire, and there is more collateral damage, so the guilt level is raised again. This time the guilt reaches the maximum allowed and all weapon systems are deactivated, according to the paper. There is a loophole: the operator can override the guilt subsystem.
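As described, the mechanism amounts to a simple threshold model. Here is a minimal sketch in Python of how such an adaptor might behave; the class names, thresholds and guilt increments are illustrative assumptions based on the article’s description, not code or values from Arkin and Ulam’s paper.

```python
# Hypothetical sketch of the "ethical adaptor" described above.
# All names and numbers are assumptions for illustration only.

GUILT_MAX = 100  # at this level every weapon system is deactivated


class WeaponSystem:
    def __init__(self, name, guilt_threshold):
        self.name = name
        # More destructive weapons get LOWER thresholds,
        # so they are locked out first as guilt accumulates.
        self.guilt_threshold = guilt_threshold
        self.active = True


class EthicalAdaptor:
    def __init__(self, weapons):
        self.weapons = weapons
        self.guilt = 0

    def report_collateral_damage(self, severity):
        """Raise the guilt level after an engagement causes civilian harm,
        then deactivate any weapon whose threshold has been exceeded."""
        self.guilt += severity
        for weapon in self.weapons:
            if self.guilt > weapon.guilt_threshold or self.guilt >= GUILT_MAX:
                weapon.active = False

    def operator_override(self):
        """The 'loophole': a human operator may reset the guilt subsystem."""
        self.guilt = 0
        for weapon in self.weapons:
            weapon.active = True


# Walk through the UAV scenario from the article.
uav = EthicalAdaptor([
    WeaponSystem("GBU precision-guided bomb", guilt_threshold=20),  # most destructive
    WeaponSystem("Hellfire missile", guilt_threshold=50),
    WeaponSystem("chain gun", guilt_threshold=80),                  # least destructive
])

uav.report_collateral_damage(30)  # GBU strike causes civilian casualties: GBU locked out
uav.report_collateral_damage(70)  # Hellfire strike causes more; guilt hits the maximum

for w in uav.weapons:
    print(w.name, "active" if w.active else "deactivated")
# With these illustrative numbers, all three systems end up deactivated,
# matching the paper's scenario as relayed in the article.
```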

Robots in combat might even snitch on soldiers. When working on a team with humans, the robots may have the potential to objectively monitor “ethical behavior in the battlefield by all parties” and report infractions, Arkin said.

The technology may be ready for ethical, autonomous robots to be fielded for certain types of combat operations within 10 to 20 years, provided there is sufficient study and funding, Arkin said. Further study is needed on the ethical component and the capability of discriminating between the enemy and noncombatants, he said. The Army hasn’t announced whether it plans to use such robots.

http://militarytimes.com/news/2009/12/army_robots_122709w/
