Zachary Kallenborn, an affiliate researcher who studies drone warfare, terrorism and weapons of mass destruction at the University of Maryland, said the report suggested that for the first time a weapons system with artificial intelligence capability had operated autonomously to find and attack humans.
“What is clear is that this drone was used in the conflict,” said Mr. Kallenborn, who wrote about the report in the Bulletin of the Atomic Scientists. “What is not clear is whether the drone was allowed to select its target autonomously and whether the drone, while acting autonomously, harmed anyone. The U.N. report strongly implies, but does not state, that it did.”
But Ulrike Franke, a senior policy fellow at the European Council on Foreign Relations, said the report does not say how independently the drone acted, how much human oversight or control there was over it, or what specific impact it had in the conflict.
“Should we talk about more autonomy in weapon systems? Definitely,” Ms. Franke said in an email. “Does this instance in Libya seem like a groundbreaking, novel moment in this discussion? Not really.”
She noted that the report stated that the Kargu-2 and “other loitering munitions” attacked convoys and retreating fighters. Loitering munitions, which are simpler, self-contained weapons designed to hover on their own over an area before crashing into a target, have been used in several other conflicts, Ms. Franke said.
“What is not new is the presence of loitering munitions,” she said. “What is also not new is the observation that these systems are quite autonomous. The degree of autonomy is difficult to determine — and autonomy is poorly defined anyway — but we do know that several manufacturers of loitering munitions claim that their systems can act autonomously.”
The report suggests that the “race to regulate these weapons” is failing, a potentially “catastrophic” development, said James Dawes, a professor at Macalester College in St. Paul, Minn., who has written about autonomous weapons.