UN, EU Support Ban, but Major Military Powers Are Investing Heavily
(March 29, 2019)—With some activists warning that the advent of armies of vicious killer robots could be just 3-4 years away, a large number of nations are trying to get out in front of that, with an eye toward a global ban on such robots.
The EU and UN both heavily support such a ban, and Germany is seen as a major proponent. Yet several nations seem keen to resist the idea, as they envision powerful armies of metal men crushing their enemies.
The US and Russia, unsurprisingly, are leading the opposition to the global ban, calling it premature. In practice this really just means they oppose any limitations that would prevent them from building killer robots. Britain, Israel, and Australia are also opposed.
These are the usual suspects whenever a global ban is proposed on something that could be used to commit war crimes. And while Britain’s Defence Ministry denied any plans to build “fully autonomous” killer robots of its own, it announced this week that it is developing killer drone swarms with theoretically full autonomy.
The US is sure to be the leader in the field, at least early on: America’s huge fleet of attack drones is likely to eventually be modified to remove the need for a “button pusher” to authorize strikes, simply letting the drone decide who lives and who dies.
Since the US provides little oversight of how drone strikes are decided, the near-term difference may be negligible: merely something else killing indiscriminately, with no consequences.
Yet human ethics remain, at least at the far end of the spectrum, a limiting factor on how many people the fleet can kill. Given the Pentagon’s proclivity for high body counts in recent air wars, simulating that ethical restraint in the AI of the new killer robots will certainly be a low priority, or one eschewed entirely in the name of a more efficient, and therefore merciless, killer.
Germany Urged to Champion Global Treaty to Ban ‘Killer Robots’
Andrea Shalal / Reuters
BERLIN (March 30, 2019) – Nobel Peace Prize laureate Jody Williams and other activists warned on Thursday that fully autonomous weapons could be deployed in just 3-4 years and urged Germany to lead an international campaign for a ban on so-called “killer robots”.
Williams, who won the Nobel in 1997 for leading efforts to ban landmines, told reporters Germany should take bold steps to ensure that humans remained in control of lethal weapons. “You cannot lead from the rear,” she said.
Critics fear that the increasingly autonomous drones, missile defence systems and tanks made possible by new artificial intelligence could turn rogue in a cyber-attack or as a result of programming errors.
German Foreign Minister Heiko Maas called last week for action to ensure human control of lethal weapons, but is pushing a non-binding declaration rather than a global ban, given opposition by the United States, Russia and China.
The United Nations and European Union have called for a global ban, but discussions so far have not yielded a clear commitment to conclude a treaty.
Activists from over 100 non-governmental groups gathered in Berlin this week to press Maas and the German government to take more decisive action, after the governing coalition twice endorsed a ban on fully autonomous weapons, in its 2013 and 2018 coalition accords.
They rallied at Berlin’s Brandenburg Gate, with a life-sized robot telling onlookers: “Not all robots will be friendly. Stop killer robots now.”
“If Germany showed leadership and got behind it, we’d soon have the rest of Europe behind it,” said Noel Sharkey, a leading roboticist and co-founder of the Campaign to Stop Killer Robots.
He said it was only a matter of years before fully autonomous weapons could be deployed in battle given rapid advances in artificial intelligence and other technologies.
Reporting by Andrea Shalal; Editing by Mark Heinrich
UK, US and Russia among Those Opposing Killer Robot Ban
LONDON (March 29, 2019) — The UK government is among a group of countries that are attempting to thwart plans to formulate and impose a pre-emptive ban on killer robots.
Delegates have been meeting at the UN in Geneva all week to discuss potential restrictions under international law on so-called lethal autonomous weapons systems, which use artificial intelligence to help decide when, and whom, to kill.
Most states taking part—and particularly those from the global south—support either a total ban or strict legal regulation governing their development and deployment, a position backed by the UN secretary general, António Guterres, who has described machines empowered to kill as “morally repugnant”.
But the UK is among a group of states—including Australia, Israel, Russia and the US—speaking forcefully against legal regulation. As discussions operate on a consensus basis, their objections are preventing any progress on regulation.
The talks come as the UK military is ploughing tens of millions of pounds into autonomous weapons, most recently announcing on Thursday a £2.5m project for “drone swarms” controlled with the help of next-generation autonomy, machine learning, and AI.
The talks in Geneva are taking place under the convention on certain conventional weapons. First enacted in 1983, the convention is intended to restrict the use of weapons “that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately”. It already covers landmines, booby traps, incendiary weapons, blinding laser weapons and clearance of explosive remnants of war.
“We urgently need a ban on killer robots,” said Ben Donaldson, head of campaigns at the United Nations Association—UK. “The majority of states get it. A rapidly growing proportion of the tech community get it. Civil society gets it. But a handful of countries including the UK are blocking progress at the UN. The UK needs to listen to this growing coalition and join calls for a preemptive ban.”
Responding to the criticism, a Ministry of Defence spokesperson said: “The United Kingdom does not possess fully autonomous weapon systems and has no intention of developing them. We believe a preemptive ban is premature as there is still no international agreement on the characteristics of lethal autonomous weapons systems.”
The issue of human control is at the heart of discussions about killer robots, according to the British military, and its negotiators have sought to focus debates at the UN on building consensus on what that means. Britain’s negotiating team says that no UK offensive weapons systems will be capable of attacking targets without human control and input.
They are arguing against a preemptive ban on the basis that it could jeopardise their ability to exploit any potential military advantages they could gain by imbuing weapons with AI.
“What’s being said is that current humanitarian law is enough,” said Taniel Yusef, international adviser for the Women’s International League for Peace and Freedom, who is in Geneva lobbying for a ban. “But robots can’t make ethical and legal decisions.”
Those backing legal controls say the UK’s position masks potential for the development and deployment of weapons with significant levels of autonomy. Military commanders already possess weapons that, once launched, can identify their own targets within a limited area, they point out, and the potential with AI is expanding such uses over a wider area for longer.
“It then becomes more difficult to assert that it’s the commander that has really made the decision or whether the attack was made without much human involvement at all,” said Richard Moyes, managing director of Article 36, a UK-based non-profit organisation that campaigns for more control over new weapons technologies.
“The UK should be under some pressure on this issue. There are officials in the UK who are quite thoughtful on this stuff and I feel that the posture the government takes in the talks is quite unhelpful. They are being a brake on movement towards agreement rather than positively pushing forward.”