October 30, 2020
Concerns over killer robots featured prominently at the high-level opening segment of the historic 75th session of the United Nations General Assembly this year.
Pope Francis warned that lethal autonomous weapons systems would “irreversibly alter the nature of warfare, detaching it further from human agency.” He called on states to “break with the present climate of distrust” that is leading to “an erosion of multilateralism, which is all the more serious in light of the development of new forms of military technology.”
The Pope’s UNGA address marks the first time that he has commented explicitly on killer robots, indicating the Vatican may be preparing to intensify its work in this regard. The Holy See first called for a ban on lethal autonomous weapons systems in May 2014.
In his UNGA address, Austria’s foreign minister Alexander Schallenberg echoed the UN Secretary-General’s strong concerns over giving “machines the power to decide who lives and who dies.” Schallenberg said, “We have to act now, before the survival of civilians in a conflict zone is determined by an algorithm and before all constraints laid down in international humanitarian law become redundant and decisions are taken by killer-robots without any human control or ethical concerns.”
Schallenberg invited all states to Vienna in 2021 to participate in an international meeting “to address this urgent issue.” Earlier in 2020, Brazil convened an international meeting to discuss how to address autonomous weapons systems, while Germany held the first virtual meeting on the subject at the beginning of the COVID-19 pandemic. Japan is also scheduled to hold a meeting on killer robots in December 2020.
UNGA First Committee
The 75th session of the UNGA First Committee on Disarmament and International Security, which opened on 9 October, was held virtually as well as in person at a physical distance. At the opening, UN High Representative for Disarmament Affairs Izumi Nakamitsu warned that the global pandemic has not slowed “the ceaseless development of new technology” and urged nations to make diplomatic progress, especially to ensure that “humans remain in control of weapons and the use of force.”
Approximately three dozen states raised killer robots in their statements to the 75th session of the UNGA this year—listed below—while dozens more aligned themselves with statements by the European Union, Non-Aligned Movement, and Nordic group. Many states that spoke expressed concern at the prospect of lethal autonomous weapon systems and drew attention to the need for multilateral action.
Some reiterated their strong desire for a new international treaty to prohibit and restrict killer robots. Brazil described a legally binding instrument as “the best option to ensure human control” and cautioned that the opportunity to adopt one is narrowing quickly. China recommended the CCW continue its deliberations on killer robots “with the purpose of negotiating a legally-binding international instrument.” The Philippines identified the need for “a robust and future-proof” international treaty on killer robots.
Several states, most from Europe, urged continued CCW deliberations on killer robots and welcomed the set of guiding principles developed in 2018 and 2019 to guide the CCW talks. Several explicitly urged states to make progress on killer robots concerns by the CCW Sixth Review Conference in December 2021, including Brazil, Finland, France, Iceland, the Netherlands, and the EU.
Albania and Iceland spoke on the issue of killer robots for the first time at UNGA, pushing the number of states that have commented on killer robots to a total of 99.
The International Committee of the Red Cross (ICRC) warned that “erosion of human control over the use of force creates clear risks for civilians and combatants who are no longer fighting, challenges related to compliance with IHL, and fundamental ethical concerns about leaving life-and-death decisions to sensors and software.” The ICRC said that “internationally agreed limits on autonomous weapons must be established with some urgency.”
The Campaign to Stop Killer Robots urged states to launch negotiations on an international treaty to ban fully autonomous weapons, in order to “ensure that future technologies are developed and used to promote peace and respect for each other’s inherent dignity.”
During UNGA First Committee, 37 states referred to killer robots in their statements: Albania, Australia, Austria, Brazil, Bulgaria, China, Colombia, Costa Rica, Cuba, Denmark, Ecuador, Estonia, Finland, France, Germany, Greece, Iceland, India, Ireland, Italy, Japan, Liechtenstein, Nepal, Netherlands, North Macedonia, Peru, Philippines, Poland, Portugal, Republic of Korea, San Marino, Slovakia, Spain, Sri Lanka, Sweden, Switzerland, Turkey, and Venezuela. There were group statements by the European Union, Non-Aligned Movement, and Nordic countries, as well as by the Campaign and the ICRC. Relevant extracts are provided below.
Albania
(16 October) We need to continue our deliberations on issues such as Lethal Autonomous Weapons Systems, outer space and cybersecurity and counter the threats posed by the illicit trafficking of small arms and light weapons.
Australia
(12 October) New or emerging technologies with advanced artificial intelligence (AI) and enhanced autonomous functions are becoming increasingly prevalent in both civilian and military sectors. Australia recognises the potential value and benefits that AI brings to military and civilian technologies. Australia values the work of the Group of Governmental Experts on Lethal Autonomous Weapons Systems that has been considering the technical, legal and international security implications of the potential development of autonomous weapons.
Austria
(12 October) As we all know, the advances in technology and artificial intelligence do not only extend to our homes, but also extend to their application in weapons systems, with potentially unacceptable consequences. We cannot allow lethal autonomous weapon systems (LAWS) to be developed and deployed in armed conflicts, using lethal force without human control over critical functions. This would fundamentally undermine international humanitarian law and ethical standards. We concur with the Secretary General’s assessment that such a scenario would be “politically unacceptable and morally repugnant” – the time to prevent it through a legally binding norm is now. To address the issue of LAWS in more detail, Austria will organize an international conference in 2021 – we invite all of you to participate, hoping the global health situation allows for such a conference to take place.
Brazil
(12 October) The historical window for adopting an appropriate legal framework to regulate the issue of lethal autonomous weapons systems is narrowing quickly. Brazil believes that a legally binding instrument is the best option to ensure human control over critical functions in autonomous systems, which is paramount to prevent violations of international law. Throughout last year, Brazil has organized or co-sponsored a number of initiatives aimed at promoting dialogue and further common understanding on the issue of LAWS, including a table-top exercise on the Human Element and Autonomous Weapons Systems for Latin America and the Caribbean, carried out by UNIDIR in early September with the support of the Brazilian Foreign Ministry, as well as the Rio Seminar on Autonomous Weapons Systems, last February. Those numerous initiatives prompted by Brazil should provide valuable input for debates in the GGE/LAWS, and for its recommendations to the 2021 Review Conference of the CCW.
Bulgaria
(16 October) The Republic of Bulgaria actively participates in the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems with a view to developing and adopting an effective and comprehensive normative and operational framework for control on their production, use and transfer. We believe that the 11 Guiding principles are an excellent basis in this regard.
China
(12 October) China also supports continued in-depth discussions on Lethal Autonomous Weapons Systems (LAWS) within the framework of Convention on Certain Conventional Weapons (CCW), with the purpose of negotiating a legally-binding international instrument.
Colombia
(12 October) We regret that the Conference on Disarmament has not made any progress so far. This places us in a scenario of lack of regulation and normativity in the face of the vertiginous development of artificial intelligence and other technologies applied to the design and development of new types of weapons. New technologies must be implemented under the precepts of the principle of humanity. If they are accepted and used, their deployment must prioritize the mitigation of superfluous damage or unnecessary suffering for those involved in an armed conflict and, of course, for protected persons.
Costa Rica
(9 October) Human control over the use of force must remain constant. We call for a strengthening of our work on the Convention on Certain Conventional Weapons (CCW), as a lack of common rules and understandings regarding the application of international law increases the risk that we will one day face autonomous weapons systems that cannot be used in accordance with humanitarian principles and international law. Costa Rica reiterates its support for the work of the CCW to adopt a legally binding instrument in this regard.
Cuba
(12 October) We will continue to advocate the adoption, as soon as possible, of a Protocol banning lethal autonomous weapons, even before they are manufactured on a large scale. In addition, regulations are required for the use of weapons with certain autonomous capabilities; in particular military combat drones, which are causing a high number of civilian casualties.
Denmark
(19 October) Denmark supports the work of the GGE on Lethal Autonomous Weapons Systems (LAWS), in particular the 11 guiding principles. In our work on these principles we should in particular aim to develop an understanding of the type and degree of human-machine interaction.
Ecuador
(12 October) We reject the increasing use and improvement of artillery UAVs, as well as lethal autonomous weapons. The militarization of artificial intelligence presents challenges for international security, transparency, control, proportionality, and accountability.
Estonia
(15 October) We support the efforts to universalize and strengthen the Convention on Certain Conventional Weapons. Regarding emerging technologies in the area of Lethal Autonomous Weapons Systems, we welcome the agreement on the 11 Guiding Principles last year and the start of the 2020 Group of Governmental Experts on LAWS. We should aim to elaborate on how international law, in particular international humanitarian law, applies to weapon systems with autonomous functions, explore how states can implement mechanisms of command and control, individual accountability, and verify that LAWS are used in accordance with international law. We are convinced that the CCW is the appropriate forum for such discussions.
Finland
(9 October) In the work of the Group of Governmental Experts on Lethal Autonomous Weapon Systems, our aim is an effective normative and operational framework, adopted by consensus by all parties to the process. It is an ambitious aim, but one that Finland will fully strive for. The 11 Guiding Principles are an excellent basis on which States can continue building a practical outcome. We welcome the continuation of the work of the GGE and the constructive discussions that took place at the September meeting. The aim is still to achieve concrete results by 2021. With patience and flexibility on all sides, we will be able to reach an outcome all parties can commit to. We should strive for nothing less.
France
(16 August) In the field of conventional armaments, the issue of weapons systems based on emerging technologies in the field of SALA [lethal autonomous weapons systems] is of major importance and we welcome the adoption of the “Eleven Guiding Principles” in 2019 by the High Contracting Parties to the Convention on Certain Conventional Weapons. We stand ready to work, in view of the CCW Review Conference in 2021, to further develop these guidelines and to clarify how they can serve as a basis for the development of a robust but also universally accepted operational and normative framework.
Germany
(12 October) We continue to support the work in the framework of the Convention on Certain Conventional Weapons [CCW] towards a normative and operational framework on Lethal Autonomous Weapons Systems [LAWS] and welcome the productive spirit of this year’s meeting of the Group of Governmental Experts in Geneva. We hope for an inclusive continuation of the GGE process bearing in mind the current restrictions.
Greece
(16 October) There have been numerous developments in the Conventional Weapons Disarmament realm since the 74th First Committee. I would like to briefly highlight the constructive deliberations, in the context of the CCW, on the Lethal Autonomous Weapons Systems (LAWS) under the chairmanship of Ambassador Ljupco Jivan Gjorgjinski of North Macedonia. We believe that further discussions regarding the aspects of the normative and operational framework to address the challenges deriving from the incorporation of emerging technologies in the area of LAWS, should be based on the implementation of the 11 agreed Guiding Principles at a national level. In this context we believe that CCW provides the appropriate framework since it seeks to strike the balance between military necessity and humanitarian concerns.
(9 October) In addition to new technologies, artificial intelligence is another issue that we cannot ignore, and for this reason we believe that the Conference of States Parties to the Convention on Certain Conventional Weapons is an appropriate forum to continue working on the subject, especially in the creation of an instrument that prohibits the commonly called killer robots or lethal autonomous weapons.
Iceland
(9 October) New challenges and frontiers in the field of disarmament, including increasing activities in outer space and lethal autonomous weapons, need to be coherently addressed, drawing on existing international law, norms, and conventions. Any translation of the growing interest in activities in outer space into arms race or, for that matter, weaponization of space is unacceptable. We are encouraged by the work of the GGE on the lethal autonomous weapons, in particular its consensus on the 11 Guiding Principles. We hope for concrete results based on these principles in time for the CCW Review Conference in late 2021.
India
(14 October) India attaches high priority to the CCW which serves as one of the most important legal instruments to address the legitimate defence requirements of States while striking a balance with humanitarian concerns. In this context, India has been actively participating in the deliberations of the GGE on LAWS within the framework of the CCW and looks forward to concrete recommendations to be agreed by consensus.
Ireland
(12 October) The fast-paced nature of technological advancements raise a number of difficult legal, political, military and ethical questions. It is our view that working to achieve consensus within the UN system remains the most effective way to address the emerging challenges posed by Lethal Autonomous Weapons Systems. Therefore, it is crucial that those countries most likely to develop these new weapon systems meaningfully engage as we work toward the development of a normative framework. We will continue to work within the Group of Governmental Experts as we address the considerable ethical, moral and legal dilemmas posed by LAWS.
Italy
(19 October) We especially welcome the substantive outcomes of the work of the GGE LAWS, with the endorsement of the eleven guiding principles. Bearing in mind the importance of applying IHL to all weapons systems, we believe it is crucial to reach consensus on the possible elements of a normative and operational framework.
Japan
(16 October) Japan welcomes the ongoing discussions on emerging technologies in the area of LAWS within the CCW framework, and will also continue its contribution to the international rule-making effort.
Liechtenstein
(9 October) Liechtenstein supports a new regulatory framework for lethal autonomous weapons systems, in the form of legally binding standards to ensure a human component in the decision-making processes of such systems. An element of meaningful human control across the entire life cycle of lethal autonomous weapons systems is essential and helps to ensure compliance with applicable law, including international humanitarian law. The Alliance for Multilateralism has made an important political commitment to advance this agenda with its declaration on Lethal Autonomous Weapons Systems (LAWS), which Liechtenstein fully supports. In these times of strong nationalist tendencies and hostility towards cooperative and multilateralist approaches to disarmament, such initiatives are welcome vehicles to advance our common objectives. The history of the United Nations is shaped by coalitions of the willing – a reconciliatory perspective on the UN’s 75th anniversary against the difficult political odds of today.
Nepal
(16 October) Nepal supports the international normative frameworks to regulate the use of frontier technologies including drones and lethal autonomous weapons.
Netherlands
(9 October) Secondly, new technologies come with great opportunities. Cyberspace, artificial intelligence and technological developments in outer space come with many societal and economic benefits. However, these dual-use technologies can generate security challenges too. Malicious cyber operations disrupting our societies are a real, credible threat. Likewise, we reject the development of fully autonomous weapons systems, which are not under meaningful human control…The Netherlands also reiterates the essential role of multilateralism concerning Lethal Autonomous Weapon Systems (LAWS) by pointing out that good progress has been made within GGE on LAWS under the Convention on Certain Conventional Weapons (CCW) in 2019. The eleven guiding principles reflect that there is a common understanding among State parties that humans must have some form of control over autonomous weapons to ensure compliance with International Humanitarian Law and International Human Rights Law. In light of the Sixth Review Conference of the CCW in 2021, discussions need to move forward so that we can continue to make progress on issues like ‘human-machine’ interaction.
North Macedonia
(12 October) Since the representative of North Macedonia is chairing the GGE on LAWS, we deem it necessary to briefly address the issue of emerging technologies in the area of lethal autonomous weapon systems. In the past years at the meetings held, including the last one held in September 2020, many states have elaborated their views and all have acknowledged the central importance of human control over the use of force. We would like to highlight the need for urgent progress in this area. We are hoping that more considerable results could be achieved at the second set of meetings, which should convene in November 2020.
Peru
(9 October) In the current context, new and rapidly proliferating technologies are modifying contemporary conflicts, creating new challenges to international humanitarian and human rights law, as well as to the maintenance of international peace and security. I am referring in particular to unmanned aerial vehicles (UAVs) that are used as a weapon of war by covert armed forces and non-state actors, and to lethal autonomous weapons systems (SAAL). In the first case, we consider it imperative that the international community regulate their use, transfer and proliferation; in the second, there is an urgent need to define lethal autonomous weapons systems and to identify their characteristics as a starting point for a process regulating them.
Philippines
(9 October) The Philippines sees the CCW as the appropriate framework to address potential threats arising from lethal autonomous weapons systems (LAWS), including possible acquisition by armed non-state actors. There is a need for a robust and future-proof legally-binding instrument to address these threats.
Poland
(12 October) Emerging and disruptive technologies enable new methods and means of warfare, raising fundamental questions that cut across traditional concepts of international relations and international law. From a security perspective, there are concerns about the ability of new weapons to destabilize security relations and increase unpredictability. This might be the case for example with new sophisticated hypersonic weapons or anti-satellite systems. There are concerns over the potential of new technologies to be used to conduct malicious activities that fall short of traditional thresholds for use of armed force. Due to the rapidly evolving nature of technology-related challenges, legally binding instruments might not provide us with adequate solutions. We should rather look for more pragmatic solutions, starting with increased transparency and confidence building measures.
Portugal
(19 October) On new and emerging threats related to Lethal Autonomous Weapons (LAWS), cyberspace and militarization of outer space, we should encourage the implementation of norms of responsible state behaviour, transparency and the respect for international law, human rights and international humanitarian law.
Republic of Korea
(9 October) The ROK is also fully committed to the Convention on Certain Conventional Weapons (CCW), which strikes a balance between security concerns and humanitarian considerations. Particularly, we welcome the progress made by the Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS) within the framework of the CCW through the adoption of the 11 guiding principles. We hope that the collective efforts through the GGE process will continue until a consensus on normative and operative frameworks on emerging technologies is reached.
San Marino
(19 October) San Marino is also concerned about the consequences of the application of technology and artificial intelligence in weapons systems, which raise serious legal and ethical doubts. We therefore need to cooperate and address the emerging challenges related to Lethal Autonomous Weapons Systems.
Slovakia
(19 October) Rapid technological development poses new challenges for the international community, particularly in the area of Lethal Autonomous Weapons Systems (LAWS). Slovakia supports the work of the Group of Governmental Experts (GGE) on LAWS in the framework of the Convention on Certain Conventional Weapons (CCW). The agreed 11 Guiding Principles are a welcomed step forward. The in-depth productive expert discussions within the Group showed that further progress on this issue is possible and needed. To move forward towards aspects of the normative and operational framework on emerging technologies in the area of LAWS, we believe we need to further deepen the discussions and seek consensus on relevant elements of LAWS, including the human-machine interaction.
Spain
(12 October) In relation to lethal autonomous weapons systems, we continue to propose to establish a code of conduct including measures of transparency, trust-building and exchange of information and best practices, with possible advances related to artificial intelligence.
Sri Lanka
(14 October) Having initiated State-level discussions on Lethal Autonomous Weapons Systems during its Presidency of the CCW in 2015, which paved the way to the Group of Governmental Experts (GGE) in 2016, Sri Lanka supports the ongoing discussions within the framework of the CCW GGE on Lethal Autonomous Weapons Systems (LAWS), and encourages the continuation of the GGE process. While recognizing the positive benefits that could accrue from the dual-use nature of the technology, new technological developments in Artificial Intelligence (AI) and the development of LAWS devoid of any human control have created unprecedented risks and challenges for humanity. In our view, the centrality of human control is fundamental. Legal clarity on the exact parameters of prohibitive and permissive limitations through the adoption of a new legally binding instrument is the only way forward which would provide clear legal limitations on autonomous weapon systems while complementing and strengthening the existing IHL norms. Sri Lanka encourages the States Parties to the CCW to deepen and fast-track the discussion within the GGE to urgently address the issues of possible development and deployment of LAWS to ensure that our efforts on this very important issue are not overrun by the fast-moving realities on the ground.
Sweden
(14 October) The ever more rapid technological development poses new challenges within the area of disarmament, non-proliferation and arms control. It is important to find effective solutions that prevent unwanted development – the area of LAWS is no exception. Sweden is of the strong conviction that IHL continues to apply fully to all weapon systems and that human control over the use of force always must be upheld. Sweden supports the work of the GGE on LAWS. The CCW is the central forum in which to continue discussions on these issues. The 11 Guiding Principles are a welcomed step forward that should form the basis for further progress. We need to continue to seek consensus around the central elements of LAWS, not least those related to what constitutes human control.
Switzerland
(12 October) The Group of Governmental Experts on Autonomous Weapons is doing essential work in seeking to establish an operational and normative framework applicable to these weapons, guided in this by the requirement to comply with International Humanitarian Law. The GGE should work towards a common understanding on how to ensure the necessary human control over autonomy in weapons systems.
Turkey
(15 October) Issues such as Improvised Explosive Devices (IEDs) and Lethal Autonomous Weapons Systems (LAWS) continue to be important.
Venezuela
(15 October) However, in light of the context described above, and given the palpable challenges to international law, including international humanitarian and human rights law, as well as to the maintenance of international peace and security, we also express our concern about the growing use and improvement of lethal autonomous weapons systems, including unmanned aerial vehicles, which were used in 2018 in a failed assassination attempt against President Nicolás Maduro in our country. Hence, we advocate in favor of discussions in which the ethical, moral, technical and legal considerations of these emerging technologies are addressed, with a view to the eventual adoption of a legally binding instrument that provides an operational and regulatory framework in this matter.
European Union
(9 October) We welcome the outcome of the 2019 Meeting of High Contracting Parties to CCW, notably the progress of endorsing the 11 Guiding principles and the start of the 2020 Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS). Ahead of the Sixth Review Conference of the CCW to be held in 2021, we will support the GGE in the clarification, consideration and development of aspects of the normative and operational framework for emerging technologies in the area of LAWS. We emphasise that human beings must make the decisions with regard to the use of lethal force, exert control over lethal weapons systems they use, and remain accountable for decisions over the use of force in order to ensure compliance with International Law, in particular International Humanitarian Law and International Human Rights Law.
Nordic Countries delivered by Iceland
(9 October) The Nordic countries support the work of the GGE on Lethal Autonomous Weapons Systems (LAWS), in particular the 11 guiding principles adopted by consensus last year and consequently highlighted in the statement of the Alliance for Multilateralism. It will be important to advance work on these principles, especially regarding human-machine interaction, in the GGE’s work leading up to the CCW review conference next year.
Non-Aligned Movement delivered by Indonesia
(9 October) NAM is of the view that Lethal Autonomous Weapon Systems (LAWS) raise a number of ethical, legal, moral, technical, as well as international peace and security related questions, which should be thoroughly deliberated and examined in the context of conformity to international law, including international humanitarian law and international human rights law. In this regard, NAM States Parties to the Convention on Certain Conventional Weapons (CCW) take note of the adoption by consensus of the 2019 Report of the GGE on LAWS and have agreed that there is an urgent need to pursue a legally-binding instrument on LAWS.
International Committee of the Red Cross
(19 October) The developments in the past 40 years demonstrate that the CCW is a dynamic instrument that can respond to advancements in weapons technology and the evolution of armed conflict, as shown in particular by the Protocols on blinding laser weapons and explosive remnants of war, and the amendment extending the CCW’s scope of application to non-international armed conflicts. Today, it is critical that the CCW lives up to its potential, by responding to new advancements in weapons technology.
With regard to new technologies, the ICRC is concerned about developments towards increasingly autonomous weapon systems, understood as weapon systems that select and apply force to targets without human intervention. The associated erosion of human control over the use of force creates clear risks for civilians and combatants who are no longer fighting, challenges related to compliance with IHL, and fundamental ethical concerns about leaving life-and-death decisions to sensors and software.
The ICRC is convinced that internationally agreed limits on autonomous weapons must be established with some urgency – whether in the form of new, legally binding rules, policy standards or best practices. Rapid military technology developments indicate that this is not a question for the future, but a concern for the present.
It is encouraging, therefore, to see the agreement among High Contracting Parties to the CCW that human control or involvement in the use of force must be retained, and a growing convergence of views on the measures that are needed to ensure it. In practice, strict constraints will be needed on the types of autonomous weapon systems used, and the situations in which they are used. Measures aimed at ensuring human control, as proposed by the ICRC – such as limits on the types of targets, constraints on the environment of use, and requirements for human supervision, intervention and deactivation – can inform these internationally agreed limits.
We are witnessing today a rapid uptake of artificial intelligence (AI) and machine learning for a variety of military applications, especially in “decision-support” or “automated decision making” applications that “recommend” whom, or what, to attack and when. The ICRC remains convinced of the need for a human-centered approach that allows sufficient time for human control and judgement to apply the law and ensure human agency is retained in decisions with serious consequences for people’s lives. AI and machine learning systems are tools that should be used to augment and improve human decision-making, not to replace it.
Campaign to Stop Killer Robots