UN Expert Calls for Halt in Military Robot Development


Nick Cumming-Bruce / The New York Times – 2013-06-01 01:17:17

GENEVA (May 30, 2013) — A United Nations expert called Thursday for a global moratorium on the testing, production and use of armed robots that can select and kill targets without human command.

“War without reflection is mechanical slaughter,” said Christof Heyns, the United Nations special rapporteur on extrajudicial, summary or arbitrary executions.

“A decision to allow machines to be deployed to kill human beings worldwide — whatever weapons they use — deserves a collective pause,” he told the Human Rights Council in Geneva.

No countries use such weapons, but the technology is available or soon will be, Mr. Heyns told the council.

The United States, Britain, Israel and South Korea already use technologies that are seen as precursors to fully autonomous systems. Little is known about Russian and Chinese progress in developing them.

“My concern is that we may find ourselves on the other side of a line and then it is very difficult to go back,” Mr. Heyns said in an interview. “If there’s ever going to be a time to regulate or stop these weapons, it’s now.”

Mr. Heyns urged the council to set up a high-level panel to report within a year on advances in the development of “lethal autonomous robotics,” to assess whether existing international laws are adequate for controlling their use.

Preparations to introduce armed robots raise “far-reaching concerns about the protection of life during war and peace,” Mr. Heyns said in a report on lethal autonomous robotics he submitted to the council. “This includes questions of whether robots will make it easier for states to go to war.”

Some states active in developing such weapons have committed not to deploy them for the foreseeable future, Mr. Heyns said. He pointed to a United States Defense Department directive issued in November that barred the use of lethal force by fully autonomous weapons for up to 10 years, unless specifically authorized by senior officials, and that acknowledged the risk of technology failures. Mr. Heyns said this provided important recognition of the need for caution.

Addressing the council, however, he said, “It is clear that very strong forces — including technology and budgets — are pushing in the opposite direction.”

His initiative comes as nongovernmental organizations and human rights groups campaign to ban fully autonomous weapons before they are deployed, much as blinding laser weapons were banned pre-emptively. Discussions are under way with a number of governments that may be willing to take the lead in drafting a treaty to outlaw the weapons, Steve Goose, arms division director of Human Rights Watch, told journalists in Geneva this week.

Supporters of the robots say they offer a number of advantages: they process information faster than humans, and they are not subject to fear, panic, a desire for revenge or other emotions that can cloud human judgment. Robots can be used to acquire more accurate battlefield data that can help to target fire more precisely and in the process may save lives.

A report by Human Rights Watch and the Harvard Law School cites a United States Air Force assessment that “by 2030 machine capabilities will have increased to the point that humans have become the weakest component in a wide array of systems and processes.”

Human rights groups dispute the ability of robots to meet the requirements of international law, including the ability to distinguish between civilians and combatants and to assess proportionality — whether the likely harm to civilians during a military action exceeds the military advantage gained by it. Moreover, in the event that a killer robot breaches international law and causes civilian casualties, it is unclear who could be held responsible or punished.

“It is possible to halt the slide toward full autonomy in weaponry before moral and legal boundaries are crossed,” Mr. Goose said in a statement this week, “but only if we start to draw the line now.”


Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns
UN General Assembly

Human Rights Council, Twenty-third session
Agenda item 3: Promotion and protection of all human rights, civil, political, economic, social and cultural rights, including the right to development
A/HRC/23/47

Summary
Lethal autonomous robotics (LARs) are weapon systems that, once activated, can select and engage targets without further human intervention. They raise far-reaching concerns about the protection of life during war and peace.

This includes the question of the extent to which they can be programmed to comply with the requirements of international humanitarian law and the standards protecting life under international human rights law.

Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised, and because robots should not have the power of life and death over human beings.

The Special Rapporteur recommends that States establish national moratoria on aspects of LARs, and calls for the establishment of a high-level panel on LARs to articulate a policy for the international community on the issue.
