A Rogue Killer Drone ‘Hunted Down’ a Human Target Without Being Instructed to, UN Report Says
(May 29, 2021) — A “lethal” weaponized drone “hunted down a human target” without being told to for the first time, according to a UN report seen by the New Scientist.
The March 2020 incident saw a KARGU-2 quadcopter autonomously attack a human during a conflict between Libyan government forces and a breakaway military faction, led by the Libyan National Army’s Khalifa Haftar, the Daily Star reported.
The Turkish-built KARGU-2, a deadly attack drone designed for asymmetric warfare and anti-terrorist operations, targeted one of Haftar’s soldiers while he tried to retreat, according to the paper.
The drone, which can be directed to detonate on impact, was operating in a “highly effective” autonomous mode that required no human controller, the New York Post said.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report from the UN Security Council’s Panel of Experts on Libya said.
This is likely the first time drones have attacked humans without instructions to do so, according to Zak Kallenborn, a national security consultant who specializes in unmanned systems and drones.
Kallenborn, however, has concerns about the future of autonomous drones. “How brittle is the object recognition system?” he said. “How often does it misidentify targets?”
Jack Watling, a researcher on land warfare at the Royal United Services Institute (RUSI), told the New Scientist that this incident demonstrates the “urgent and important” need to discuss the potential regulation of autonomous weapons.
Human Rights Watch has called for an end to so-called “killer robots” and is campaigning for a “preemptive ban on the development, production, and use of fully autonomous weapons,” according to a report by the charity.
Drones May Have Attacked Humans Autonomously for the First Time
The New Scientist
Explosive-carrying quadcopters deployed during an engagement between rival factions in the Libyan civil war are thought to have deliberately crashed into targets without being ordered to by a human controller
(May 27, 2021) — Military drones may have autonomously attacked humans for the first time ever last year, according to a United Nations report. While the full details of the incident, which took place in Libya, haven’t been released and it is unclear if there were any casualties, the event suggests that international efforts to ban lethal autonomous weapons before they are used may already be too late.
The robot in question is a Kargu-2 quadcopter produced by STM.
Killer AI Drones ‘Hunted Down Humans Without Being Told to’ UN Warns
(May 29, 2021) — An autonomous weaponised drone “hunted down” a human target last year and is thought to have attacked them without being specifically ordered to, according to a report prepared for the United Nations.
The news raises the spectre of Terminator-style AI weapons killing on the battlefield without any human control.
The drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, was deployed in March 2020 during a conflict between Libyan government forces and a breakaway military faction led by Khalifa Haftar, commander of the Libyan National Army.
The Kargu-2 is fitted with an explosive charge and the drone can be directed at a target in a kamikaze attack, detonating on impact.
The report from the UN Security Council’s Panel of Experts on Libya, published in March 2021, was obtained by New Scientist magazine.
In one passage the report details how Haftar’s forces were “hunted down” as they retreated by Kargu-2 drones that were operating in a “highly effective” autonomous mode that required no human controller.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report says.
There is no record of how many casualties, if any, the AI war machines inflicted.
Zak Kallenborn, at the National Consortium for the Study of Terrorism and Responses to Terrorism in Maryland, says this could be the first time that drones have autonomously attacked humans.
He says this development is cause for serious concern, given that AI systems cannot always interpret visual data correctly.
“How brittle is the object recognition system?” Kallenborn asks. “… how often does it misidentify targets?”
Jack Watling, at UK defence think tank the Royal United Services Institute, told New Scientist that the drones are in something of a grey area when it comes to regulation of AI weapons, because only the drones’ controllers would know whether the machines were being remotely controlled at the time of the attack.
“This does not show that autonomous weapons would be impossible to regulate,” he says. “But it does show that the discussion continues to be urgent and important. The technology isn’t going to wait for us.”
Home-made Russian Drone Designed for Mass Killings
FPSRussia (April 23, 2012)