• UN Secretary-General António Guterres has called for a global ban on lethal autonomous weapon systems — machines capable of taking human lives without human oversight — describing them as “politically unacceptable” and “morally repugnant”.
• AI-driven drones, often dubbed “killer robots”, are reshaping warfare. The rising spectre of unmanned drones and other autonomous weapons is adding fresh urgency to long-standing fears of machines raining down death from the skies, deciding for themselves whom to attack.
• Machines that have the power and discretion to take human lives without human control should be prohibited by international law, he said.
Drone attacks in Ukraine
• In its report, the Independent International Commission of Inquiry on Ukraine said that Russian armed forces have committed murders of civilians using drones, amounting to crimes against humanity.
• Since July 2024, Russian forces have repeatedly killed and injured civilians in an area stretching more than 100 kilometres along the right bank of the Dnipro River in Kherson Province.
• Nearly 150 civilians have been killed and hundreds more injured as a result of the drone attacks in Kherson city and 16 localities in the Ukrainian-controlled areas.
• The drone operators used video feeds transmitted in real time by the cameras embedded in the drones, focused on targets that were visibly civilian, and dropped explosives on them.
• Ambulances, which have special protection under international law, have been targeted and struck by drones.
• The use of drones to target civilians and civilian objects is a violation of the fundamental principle of international humanitarian law.
Lethal Autonomous Weapon Systems
• While there is no internationally accepted definition of Lethal Autonomous Weapon Systems (LAWS), the term broadly refers to weapons, such as advanced drones, that select targets and apply force without further human intervention.
• The most common weapons with autonomous functions are defensive systems, such as anti-vehicle and anti-personnel mines, which, once activated, operate autonomously based on trigger mechanisms.
• Newer systems employing increasingly sophisticated technology include missile defence systems and sentry systems, which can autonomously detect and engage targets and issue warnings.
• Other examples include loitering munitions (also known as suicide, kamikaze or exploding drones), which contain a built-in warhead (munition) and wait (loiter) over a pre-defined area until a target is located, either by an operator on the ground or by automated onboard sensors, and then attack it.
• These systems first emerged in the 1980s. However, their functionalities have since become increasingly sophisticated, allowing for, among other things, longer ranges, heavier payloads and the potential incorporation of artificial intelligence (AI) technologies.
• Autonomous capabilities can be provided through pre-defined tasks or sequences of actions based on specific parameters, or through using artificial intelligence tools to derive behavior from data, thus allowing the system to make independent decisions or adjust behavior based on changing circumstances.
• AI can also be used in an assistance role in systems that are directly operated by a human.
• There are substantial concerns that autonomous weapon systems violate international humanitarian and human rights laws by removing human judgement from warfare.
What is the position of UN on LAWS?
• UN Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, was the first to raise the alarm about lethal autonomous weapons systems, in a report to the Human Rights Council in 2013.
• UN Member States have considered regulations for autonomous weapons systems since 2014 under the Convention on Certain Conventional Weapons (CCW) which deals with weapons that may violate humanitarian law.
• Since 2018, United Nations Secretary-General António Guterres has maintained that lethal autonomous weapons systems are politically unacceptable and morally repugnant, and has called for their prohibition under international law.
• In his 2023 New Agenda for Peace, the Secretary-General reiterated this call, recommending that States conclude, by 2026, a legally binding instrument to prohibit lethal autonomous weapon systems that function without human control or oversight, and which cannot be used in compliance with international humanitarian law, and to regulate all other types of autonomous weapons systems.
• In the absence of specific multilateral regulations, the design, development and use of these systems raise humanitarian, legal, security and ethical concerns and pose a direct threat to human rights and fundamental freedoms.
• United Nations independent experts have also expressed concerns regarding lethal autonomous weapons systems.
• Among States, there is broad consensus on what is known as a “two-tiered” approach: prohibitions on certain types of autonomous weapon systems, combined with regulations on all others.
• However, there are still other sticking points. For example, it remains unclear what precisely characterises an autonomous weapon system and what it would look like to legislate “meaningful human control”.
• Recently, however, there has been increased urgency around this issue, in part due to the quickly evolving nature of artificial intelligence, algorithms and, therefore, autonomous systems overall.