Killer Robots + Ancient Rules of War = Trouble

October 7th, 2014 - by admin

WNYC Radio – 2014-10-07 23:29:39

http://www.wnyc.org/story/killer-robots-ancient-rules-war-trouble/

Killer Robots + Ancient Rules of War = Trouble
We have the war machines, but we still don’t have a modern code of war

WNYC Radio

“When a robot gets blown up, that’s another life saved.”
— Mark Belanger, iRobot

NEW YORK (October 1, 2014) — Can replacing human soldiers with robot warriors save lives and make war more humane? We try to find out in this episode. But as we learn, the laws of war are not written in computer code. Modern warfare is not ready for killer robots that “decide” without human input.

In this episode, we hear from the people making the robots as they show off their lethal products. We meet a former fighter pilot who touts the value of automation and welcomes lawyers sitting side by side with soldiers. Several experts tell us about the terrifying moral risks of letting machines think too far ahead of people in battle.

We learn that lives could be saved and war could be made less atrocious if — and it is a huge if — the technology can advance side by side with the antiquated laws. In the end, we hear from the activists who want autonomous lethal weapons banned before they march on the enemy. A UN body has just begun to consider it.

Quotes heard in this episode:
“Maybe we can make war — as horrible as it sounds — less devastating to the non-combatants than it currently is.”
— Ronald Arkin, director of the Mobile Robot Lab at Georgia Tech

When to unleash the machines: “They must do better than human beings before they should be deployed in the battlefield.”
— Ronald Arkin

On why Las Vegas could be considered a target: “With Napoleonic-era combat, you knew where the battlefield was, right? With modern warfare, modern conflict, you really don’t know where the battlefield is.”
— Brad Allenby, Arizona State University

“Robotics has been trying to do visual recognition for. . . a bit more than 50 years and we can just about tell the difference between a lion and a car. So the idea of putting one of these things onto a battlefield. . . and thinking it should discriminate between [innocent people] and insurgents is just insane.”
— Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield in the U.K.

“In today’s warfare, a drone pilot is looking on a screen, talking to potentially five to ten other people looking at that same screen, one of which is a lawyer.”
— Missy Cummings, Duke professor and former fighter pilot

About autonomous lethal weapons: “These machines for the foreseeable future would fail to meet the requirements of international law.”
— Peter Asaro, International Committee for Robot Arms Control

“The preemptive ban is the only thing that makes sense.”
— Stephen Goose, of Human Rights Watch

A version of this story won the German Prize for Innovation Journalism. It aired on Deutschlandfunk, reported by Thomas Reintjes with help from Philip Banse.

HOSTED BY: Manoush Zomorodi
EDITORS: Alex Goldmark
CONTRIBUTORS: Thomas Reintjes
If you like this episode, why not share it with that friend of yours who always posts about military issues? To get future audio downloads of our program direct to your phone or computer, subscribe to the New Tech City podcast on iTunes, Stitcher, or via RSS. It just takes a second.

Posted in accordance with Title 17, Section 107, US Code, for noncommercial, educational purposes.