May 21, 2020
United Nations Secretary-General António Guterres has flagged the imperative of banning killer robots in his report on the protection of civilians in armed conflict. UN Security Council members will hold an open debate on the report findings next Wednesday, 27 May. This is the first annual protection of civilians report since 2013 to highlight concerns over killer robots.
The Campaign to Stop Killer Robots urges states to heed the UN leader’s call to quickly agree on the “limitations and obligations that should be applied to autonomy in weapons.” States should launch negotiations now on a new international treaty to prohibit fully autonomous weapons and retain meaningful human control over the use of force.
In his 2013 report, UN Secretary-General Ban Ki-moon became the first UN leader to raise serious questions over the prospect of weapons systems that, once activated, can select and engage targets without further human intervention. He called for the urgent commencement of multilateral discussion on such questions.
States heeded that call at the end of that year, when they agreed to begin discussing questions relating to lethal autonomous weapons systems at the Convention on Conventional Weapons (CCW) in Geneva. The CCW talks have made some progress since then to identify key issues of concern regarding autonomy in weapons systems. Several states have committed not to acquire or develop lethal autonomous weapons systems and 30 states have called for a ban on such weapons systems.
Yet, the CCW discussions have yielded little in the way of a lasting multilateral outcome due to the opposition by a handful of military powers, most notably Russia and the United States, which firmly reject proposals to negotiate a new international treaty or protocol.
The 2013 protection of civilians report acknowledged “important concerns” over the capacity of lethal autonomous weapons systems to operate in accordance with international humanitarian and human rights law. Seven years on, Guterres finds there are still important doubts about whether attacks using such weapons would conform with international humanitarian law. Like his predecessor, Guterres also cites “fundamental moral and ethical issues in allowing technology to decide to take human life.”
In the 2020 report, Guterres finds that “all sides appear to be in agreement that, at a minimum, retention of human control or judgement over the use of force is necessary.” He notes that “a growing number of Member States have called for a prohibition of LAWS.”
Since November 2018, the UN Secretary-General has repeatedly expressed his desire for a new international treaty to ban killer robots. In January 2020, Guterres warned that such weapons systems “are bringing us into unacceptable moral and political territory.” At the Human Rights Council in February, he called for specific measures to protect human rights from emerging technologies, repeating his call for a ban treaty to “ensure that autonomous machines are never given lethal capacity outside human judgment or control.”
As Security Council president this month, Estonia will chair a virtual session on 27 May to hear views on the report and its recommendations. All five permanent UN Security Council members are investing heavily in the development of weapons systems with autonomous functions. Unlike France, the Russian Federation, the United Kingdom, and the United States, China says that existing international humanitarian law governing fully autonomous weapons must be strengthened through the development of a new international treaty.
Most of the ten non-permanent Security Council members have expressed serious concern over lethal autonomous weapons systems and participate in the CCW talks, notably Belgium, Estonia, Germany, South Africa, and Tunisia. In April, Germany’s foreign minister Heiko Maas called such weapons systems “a red line we should never cross,” because “letting machines decide over life and death of human beings goes against ethical standards and undermines human dignity.”
Last October, Indonesia delivered a statement on behalf of the 120+ member Non-Aligned Movement that reiterated the urgent need for a legally binding instrument stipulating prohibitions and regulations on lethal autonomous weapons systems.
The other non-permanent Security Council members, namely the Dominican Republic, Niger, Saint Vincent and the Grenadines, and Viet Nam, have not elaborated their views on killer robots.
Extracts of the killer robots content in the 2020 and 2013 Protection of Civilians reports follow.
2020 Protection of Civilians report
37. It is also important to move expeditiously to address concerns over the implications posed by developments in the area of lethal autonomous weapon systems (LAWS). Autonomous weapons are generally considered to be systems that are enabled to select and attack a target – whether a person or an object – without human intervention. While LAWS are not specifically regulated by IHL treaties, it is undisputed that any autonomous weapon system must be capable of being used, and must be used, in accordance with IHL. There are, however, important doubts on how the use of LAWS to carry out attacks can conform to IHL. There are also fundamental moral and ethical issues in allowing technology to decide to take human life.
38. A growing number of Member States have called for a prohibition of LAWS. Others believe that the application of existing IHL is sufficient to regulate their use. All sides appear to be in agreement that, at a minimum, retention of human control or judgement over the use of force is necessary. It is imperative that Member States, with the support and active participation of the United Nations and other international organizations, civil society and the private sector, quickly reach common understanding on characteristics, as well as on agreed limitations and obligations, that should be applied to autonomy in weapons.
2013 Protection of Civilians report
28. The proliferation of drone technology and the increasing resort to such weapons systems will also further sharpen the asymmetry that exists in many conflicts between State and non-State parties. As technology allows one party to become increasingly removed from the battlefield, and the opportunities to fight against it are reduced, we may see technologically inferior parties increasingly resort to strategies intended to harm civilians as the most accessible targets. Moreover, drone technology increases opportunities to conduct attacks that might otherwise be considered unrealistic or undesirable through other forms of air power or the deployment of ground troops. As the ability to conduct attacks increases, so too does the threat posed to civilians.
29. In the future, these concerns, and others, may apply also to the use of autonomous weapons systems, or what are known as “killer robots”, which, once activated, can select and engage targets and operate in dynamic and changing environments without further human intervention. Important concerns have been raised as to the ability of such systems to operate in accordance with international humanitarian and human rights law. Their potential use provokes other questions of great importance: is it morally acceptable to delegate decisions about the use of lethal force to such systems? If their use results in a war crime or serious human rights violation, who would be legally responsible? If responsibility cannot be determined as required by international law, is it legal or ethical to deploy such systems? Although autonomous weapons systems as described herein have not yet been deployed and the extent of their development as a military technology remains unclear, discussion of such questions must begin immediately and not once the technology has been developed and proliferated. It must also be inclusive and allow for full engagement by United Nations actors, ICRC and civil society.