Rogue Killer Drone ‘Hunted Down’ Human Target Without Being Told To, UN Reports

A weaponised drone reportedly attacked a human target without being instructed to do so.
The news comes from a report issued by the United Nations (UN) Security Council's Panel of Experts on Libya, which found that a military drone had autonomously hunted down a human target, in what may be the first incident of its kind.
The incident is said to have taken place in Libya in March last year; more than a year on, it remains unclear whether there were any casualties.

The lethal drone, a KARGU-2 quadcopter, reportedly attacked a Libyan National Army soldier during a clash between the army and the Libyan Government's forces, Business Insider reports.
The soldier was attempting to retreat when the drone, which is designed for asymmetric warfare and anti-terrorist operations, targeted them.
The KARGU-2 can be fitted with an explosive charge and directed at a target, detonating on impact, according to The Daily Star.
Part of the report, obtained by New Scientist, read:
The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.
Zak Kellenborn, a national security consultant who specialises in unmanned systems and drones, believes this was the first autonomous attack of its kind. He has since raised concerns about the future of self-controlling drones, questioning how often such drones may misidentify their targets.
Jack Watling, of the UK defence think tank Royal United Services Institute, partially agreed with Kellenborn's concerns, saying the discussion surrounding autonomous technology 'continues to be urgent and important', and adding, 'The technology isn't going to wait for us.'
However, Watling added that last year’s incident ‘does not show that autonomous weapons would be impossible to regulate’.

While Watling remains optimistic, Human Rights Watch called for these so-called ‘killer robots’ to be banned altogether.
A report issued by the organisation states that it is 'call[ing] for a pre-emptive ban on the development, production, and use of fully autonomous weapons'.
The report added, ‘There are serious doubts that fully autonomous weapons would be capable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, while they would threaten the fundamental right to life and principle of human dignity.’
If you have a story you want to tell, send it to UNILAD via [email protected]
Topics: Technology, Now