The New York Times – 2014-11-12 10:54:11
(November 11, 2014) — On a bright fall day last year off the coast of Southern California, an Air Force B-1 bomber launched an experimental missile that may herald the future of warfare.
Alone, without human oversight, the missile decided which of three ships to attack, dropping to just above the sea surface and striking a 260-foot unmanned freighter. The test was deemed a military success. But the design of this new missile and other weapons that can pick targets on their own has stirred protests from some analysts and scientists, who fear that an ethical boundary is being crossed.
Arms-makers, they say, are taking the first steps toward developing robotic war machines that rely on software, not human instruction, to decide what to target and whom to kill. The speed at which these weapons calculate and move will make them increasingly difficult for humans to control, critics say — or to defend against.
And some scientists worry that these weapons, though intended to reduce indiscriminate killing and automate armed conflict, could one day make war more thinkable, even more likely.
On Thursday, representatives from dozens of nations will meet in Geneva to consider whether development of these weapons should be restricted by the Convention on Certain Conventional Weapons. Christof Heyns, the United Nations special rapporteur on extrajudicial, summary or arbitrary executions, last year called for a moratorium on the development of these weapons altogether.
The Pentagon itself has issued a directive requiring high-level authorization for the development of weapons capable of killing without human oversight. But some scientists say fast-moving technology has already made the directive obsolete.
“Our concern is with how the targets are determined, and more importantly who determines them,” said Peter Asaro, a co-founder and vice chairman of the International Committee on Robot Arms Control, a group of scientists that advocates restrictions on the use of military robots. “Are these human-designated targets? Or are these systems automatically deciding what is a target?”
Some arms-control specialists say that requiring only “appropriate” human control of these weapons is too vague a standard, one that could speed the development of new targeting systems that automate killing.
Heyns, of the United Nations, said that nations with advanced weapons should agree to limit their weapons systems to those with “meaningful” human control over the selection and attack of targets. “It must be similar to the role a commander has over his troops,” Heyns said.
Systems that permit humans to override the computer’s decisions may not meet that criterion, he added. Weapons that make their own decisions move so quickly that human overseers soon may not be able to keep up.