High-level concerns on killer robots at UN

October 30, 2019

Yet again, the killer robots challenge has received high-level attention at the annual session of the United Nations General Assembly.

To re-cap, in 2018 UN Secretary-General Antonio Guterres called the prospect of machines with the power and discretion to kill “morally repugnant and politically unacceptable,” while Germany’s foreign minister Heiko Maas urged nations to support his government’s proposal to ban fully autonomous weapons.

This year during the high-level opening of the UN General Assembly (UNGA), Maas and foreign minister Jean-Yves Le Drian of France appealed to states to prioritise urgent multilateral action to tackle the killer robots threat, climate change, girls’ education and three other “politically relevant” issues of international concern. At least 16 foreign ministers followed Maas and Le Drian’s invitation to co-sign a political declaration endorsing the objective of “developing a normative framework” that would address autonomous weapons. Yet there is little agreement about what that means in practice: more informal guidelines, or a new international treaty to prohibit or restrict lethal autonomous weapons systems?

UNGA First Committee

(c) Ari Beser for the Campaign to Stop Killer Robots, October 2019

At the opening of the First Committee on Disarmament and International Security, the UN High Representative for Disarmament Affairs, Izumi Nakamitsu, welcomed a set of “guiding principles” agreed last year as part of a Group of Governmental Experts (GGE) established by the Convention on Conventional Weapons (CCW). She reminded states that “more work remains to be done to ensure humans remain at all times in control over the use of force.”

At least 41 states raised killer robots in their statements to the 74th session of the UNGA this year—as shown below—while dozens more aligned themselves with statements on killer robots delivered by the European Union, the Non-Aligned Movement, and the Nordic group.

Libya, Namibia, and San Marino spoke on killer robots for the first time at the 2019 UNGA session, bringing the number of countries that have commented on this topic since 2013 to a total of 93. Namibia became the 30th state to call for a treaty to ban killer robots. It said it favors “a protocol prohibiting lethal autonomous weapons” as such weapons “are totally incompatible with international humanitarian law.” Libya acknowledged, “these weapons are being developed in a very fast manner and this can cause threats to peace and security, especially if those technologies are available to illegal groups and organizations.” San Marino commented that “deep ethical and legal doubts … need to be addressed” and said “meaningful human control is required over life and death decisions.”  

Several states reiterated their previous calls to prohibit killer robots by creating a new treaty, including Austria, China, Cuba, Ecuador, and Guatemala. China stated it “believes it necessary to reach an international legally-binding instrument on fully autonomous lethal weapons systems in order to prevent automated killing by machines,” encouraging states to “first reach an agreement on the issues such as the definition and scope of LAWS.”

Many states stressed the need to retain some form of human control over weapons systems and the use of force, particularly Austria, Brazil, Ireland, Italy, Liechtenstein, Mexico, and the Netherlands. The European Union statement emphasized that “human beings should make the decisions with regard to the use of lethal force, exert control over lethal weapons systems they use, and remain accountable for decisions over life and death in order to ensure compliance with international law, in particular international humanitarian law and international human rights law.” Most countries supported continued CCW talks on killer robots, including Russia, which in 2018 called the deliberations “extremely premature and speculative.”

Campaigning activities at UNGA

(c) Ari Beser for the Campaign to Stop Killer Robots, October 2019

The Campaign to Stop Killer Robots presence at UNGA was strongly felt this year throughout UN headquarters. The Campaign’s Silicon Valley Lead, Marta Kosmyna, delivered a UNGA statement on 18 October that called on states to launch negotiations on a ban treaty to preserve meaningful human control over the use of force. The Campaign argued that “commitments to vague ‘normative frameworks’ and additional ‘guiding principles’ are a form of diplomatic treading water” and that “an effective legal instrument is both achievable and necessary.”

More than 100 UNGA delegates attended a Campaign to Stop Killer Robots side event on 21 October that included a speech by the UN disarmament chief. Nakamitsu repeated the call by UN Secretary-General Antonio Guterres to ban killer robots as a matter of urgency and praised the Campaign, stating, “[C]ivil society has played a leading role in providing impetus to the diplomatic process and must continue to do so. I commend the Campaign to Stop Killer Robots and other civil society partners for your tireless efforts, and also for your imagination and inclusiveness in this work.”

Nobel Peace Laureate Jody Williams, tech worker Liz O’Sullivan of the International Committee for Robot Arms Control (ICRAC), and Colombian scout and youth campaigner Mariana Sanz Posse addressed the side event, which also featured the Campaign’s robot mascot “David Wreckham.” The Campaign’s first press conference at UN headquarters attracted significant interest, resulting in coverage by media outlets such as Associated Press, Japan’s NHK, The Guardian, and Xinhua. The UN Office for Disarmament Affairs also published a summary overview of the Campaign side event on its website.

The weekend before, the Campaign took its robot mascot around Manhattan, including Times Square, and challenged tech companies that have yet to pledge not to develop fully autonomous weapons to do so without delay. Campaign members also participated in the eighth annual Humanitarian Disarmament Forum convened by the Colombian Campaign to Ban Landmines and held a “watch party” to see a “killer robots” episode of the US CBS show Madam Secretary.

During UNGA First Committee, campaigners met with diplomatic representatives from dozens of states. The Campaign contributed a chapter to WILPF Reaching Critical Will’s annual First Committee Briefing Book and several articles for the weekly First Committee Monitor.

UNGA Statements

(c) Ari Beser for the Campaign to Stop Killer Robots, October 2019

The following extracts show how at least 42 countries raised killer robots in their remarks to the 74th annual session of the UN General Assembly and its First Committee on Disarmament and International Security: Argentina, Australia, Austria, Brazil, Bulgaria, Burkina Faso, China, Cuba, Czech Republic, Denmark, Ecuador, Egypt, Estonia, Finland, France, Germany, Guatemala, India, Ireland, Israel, Italy, Japan, Kazakhstan, Libya, Liechtenstein, Lithuania, Mexico, Myanmar, Namibia, Netherlands, Poland, Portugal, Republic of Korea, Russia, San Marino, South Africa, Spain, Sri Lanka, Switzerland, Turkey, United Kingdom, and United States.* Many of these states repeatedly raised concerns over killer robots and called for new law. Three groups of states included killer robots in their remarks: the European Union, the Non-Aligned Movement (delivered by Indonesia), and the Nordic states (delivered by Finland).

*According to the Campaign’s count, 49 states raised killer robots at UNGA in 2018, 34 did so in 2017, 36 in 2016, 32 in 2015, 23 in 2014, and 16 in 2013.

Argentina

(25 October) In this context, Argentina notes with concern the development of capabilities in lethal autonomous weapons systems (LAWS) and considers it important to rely on prior consensus, analyzing the issue from a preventative perspective, with a view to maintaining responsibilities for human agents and ensuring respect for international humanitarian law and human rights.

Australia

(11 October) New or emerging technologies with implications for global security present contemporary challenges in relation to regulating their development and use. That is why Australia values the CCW as the most appropriate forum to address these questions, in particular to elaborate additional guiding principles on lethal autonomous weapons systems.

(25 October) Australia is actively participating in the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) in Geneva. We consider the Convention on Certain Conventional Weapons the most appropriate forum for taking discussions on this issue forward.

Austria

(14 October) The weaponization of artificial intelligence poses fundamental challenges to international law, international humanitarian law in particular. Against the backdrop of rapid technological progress, we need to urgently draw the line between the acceptable and the unacceptable. It is an ethical and legal imperative that humans must remain in control of selecting and engaging targets. In the area of LAWS, we have a unique opportunity and a moral obligation to act, and to act swiftly. Inaction would undermine our current legal framework, which is based on humans, not machines. Therefore, Austria supports the immediate commencement of negotiations on a legally binding commitment to ensure human control over decisions of life and death.

(24 October) We believe it is not only a responsibility or obligation, but also in our shared security interest to regulate the issue of lethal autonomous weapons systems before we are overtaken by facts on the ground. Therefore, Austria fully supports the immediate start of negotiations of a legally binding instrument to ensure meaningful human control over selecting and engaging targets. As more and more political leaders voice their commitment that humans must remain in control over life and death, the GGE on LAWS should step up its efforts in order to allow the international community to implement this political will. Austria believes that it will be key for the success of the GGE LAWS to focus and advance on the issue of human control.

Brazil

(25 October) Brazil is Party to all its protocols to the CCW and actively participates in the deliberations therein, including on the emergence of new threats and challenges to the implementation of International Humanitarian Law, including in the recent discussions on Lethal Autonomous Weapons Systems. These systems are intrinsically problematic, posing profound ethical, legal and political challenges. Given the exponential technological advances in the fields of robotic weapon systems, miniaturization and artificial intelligence, the historical window for adopting appropriate legal and operational framework to regulate the issue is narrowing very quickly. Although the final report of the recently concluded GGE on LAWS in the CCW has fallen short of our expectations for a clear mandate for the negotiation of a legally-binding agreement on the issue, we hope that future discussions on a possible regulatory framework might bring us closer to a substantive consensus in this regard. This includes the establishment of meaningful human control in the interface between humans and machines, as well as the improvement of international law, including international humanitarian law (IHL), on the matter.

Bulgaria

(17 October) The Republic of Bulgaria actively participates in the Group of Governmental Experts on lethal autonomous weapons systems, with the view to developing and adapting a comprehensive and operational framework on their production, use, and transfer.

Burkina Faso

(25 October) With regard to lethal autonomous weapons, they constitute a serious source of concern, because the development and projected use of weapons not requiring human intervention raise many questions. It is therefore urgent to look very seriously into the threat posed by this new category of weapons.

China

(25 October) China attaches great importance to the humanitarian, legal and ethical concerns caused by Lethal Autonomous Weapon Systems (LAWS), and supports in-depth discussions on LAWS within the framework of CCW. Although LAWS is a concept of future weapons yet non-existent, China believes that it is necessary to reach an international legally-binding instrument on fully-autonomous lethal weapon systems in order to prevent automated killing by machines. All parties should first reach an agreement on the issues such as the definition and scope of LAWS. The GGE meeting on LAWS has agreed on a guideline by consensus this year when renewing the mandate of the GGE. China commends these efforts, especially the positive role of the GGE Chair (Mr. Jivan, Minister-Counselor of the Republic of North Macedonia) in this regard. This fully demonstrates that the Convention is the appropriate venue to discuss this issue. China will work with all parties to strengthen exchanges and actively explore effective ways to solve this problem.

Cuba

(14 October) The Convention on Certain Conventional Weapons is the ideal forum for negotiations of a legally binding instrument that prohibits lethal autonomous weapons and regulates the semiautonomous, including military attack drones.

(24 October) My delegation advocates for the adoption, as soon as possible, of a protocol that prohibits lethal autonomous weapons, before they begin to be produced on a large scale. In addition, regulations for the use of weapons with some autonomy should be established, in particular for military attack drones, which are causing a high number of civilian casualties. These types of weapons are totally incompatible with international humanitarian law.

(29 October) We have to make progress in legally-binding initiatives which are agreed multilaterally to prevent militarization of cyberspace, outer space, and lethal autonomous weapons, such as attack drones. We demand that the producing countries stop opposing the security and well-being of citizens around the world in the name of the interests of their military-industrial complex.

Czech Republic

(24 October) We welcome the work on Lethal Autonomous Weapons Systems (LAWS). In our view, it is indispensable for the CCW High Contracting Parties to have sufficient guidance on how to ensure that any new weapon, means, or methods of warfare are in compliance with the International Humanitarian Law, which is our main objective.

Ecuador

(17 October) We reject the increasing use and improvement of unmanned aerial artillery vehicles as well as lethal autonomous weapons. The international community in its various regional and universal forums must continue to work more in depth on the implications that this has for international humanitarian law, even providing for the prohibition of these types of weapons. We support, for this purpose, work within the framework of the Convention on Certain Conventional Weapons and we believe that regulation through international trade alone is insufficient.

(25 October) We also reject the increasing use and improvement of unmanned aerial artillery vehicles as well as lethal autonomous weapons. We support the work under the Convention on Certain Conventional Weapons and believe that the regulation only of international trade is insufficient.

(29 October) On unmanned aerial vehicles and artificial intelligence, Ecuador rejects the increasing use of unmanned aerial drones as well as autonomous weapons. The international community in its various regional and universal forums must continue to work more in depth on the implications that this has for humanitarian law and human rights law, even providing for the prohibition of these types of weapons. The militarization of AI also represents a challenge for international security, as well as for transparency, control, proportionality and responsibility. We cannot, for example, ignore reports that we have seen through the UN system, including the reports by the Special Rapporteur on Extrajudicial Executions, and my country would like to state that regulation alone is not enough. We must make every effort in this area to ensure that the United Nations can achieve satisfactory results.

Egypt

(29 October) Taking into consideration the rapid scientific and technological developments in several strategic fields, there are several domains which have a direct impact on international security that are left without any internationally agreed rules to prevent them from turning into scenes of arms races and armed conflicts and to ensure the reliable continuation of the contribution of the relevant technologies to development and welfare. Cyberspace, outer space, and the applications of artificial intelligence are prominent examples of such domains. The lack of progress in addressing the severe security threats that arise in such domains is clearly not due to the lack of technical expertise on the part of the international community, but is rather due to the continued misguided belief by some states that an absolute dominance in such domains can be maintained, and thereby resisting any efforts towards development of rules-based international regimes prohibiting the malicious uses or weaponization of such technologies. In a multipolar world, where the relevant technologies are available and accessible to many state and nonstate actors, this approach can only lead to an arms race that no one can win, while international security continues to severely deteriorate. … We believe it is time to move forward in the most inclusive and action-oriented manner to elaborate legally binding rules in all these strategic domains. Nonbinding norms and voluntary confidence-building measures are interim steps that cannot provide sufficient guarantees in the area of international security and arms control in the long term.

Estonia

(11 October) Regarding emerging technologies in the area of lethal autonomous weapons systems, we are convinced that the CCW is the most appropriate forum for such discussions, bringing together the right expertise. Estonia welcomes the outcome of the 2019 session, in particular agreement on the 11 Guiding Principles, and on the two-year timeline until the 2021 CCW Review Conference.

Finland

(11 October) Finland has engaged actively in the work of the Group of Governmental Experts on Lethal Autonomous Weapon Systems. Our aim is an effective normative framework for LAWS, adopted by consensus by all parties to the process. It is an ambitious aim, but one that Finland will fully strive for. The 11 Guiding Principles agreed by the GGE are an excellent basis on which States can now begin building a practical outcome. For negotiations on this extremely complex topic, the Geneva GGE is the appropriate forum – the only game in town. We support a new mandate for the GGE as proposed in August, with the clear aim of achieving concrete results by 2021. With patience and flexibility on all sides, we will be able to reach an outcome all parties can commit to. We should strive for nothing less.

France

(14 October) The implementation of the Convention on Certain Conventional Weapons should continue, including the prospective question of lethal autonomous weapons systems, or improvised explosive devices. My country will continue to provide the necessary impetus and expertise to progress on these topics. 

(24 October) The GGE’s work on emerging technologies in the field of lethal autonomous weapons is the perfect example [of gathering political, legal, military, and diplomatic expertise]. France actively participated in the work of the GGE, which allowed the establishment of consensus principles to guide the development and use of autonomous weapons systems. France welcomes the agreement of all government experts recommending the continuation of work as part of a structured process that will allow substantial progress in the coming years.

Germany

(10 October) The world seems to be at the beginning of a new arms race, fueled to a considerable extent by new technologies. If left unchanged, our current arms control architecture risks being eroded by future weapons systems featuring autonomous functions, cyber instruments, or new missile technologies. In order to provide solutions to these challenges, German Foreign Minister Maas has initiated a dialogue aimed at capturing new technologies and rethinking arms control.

(24 October) The CCW’s work on LAWS has been constructive this year. For the first time, the GGE recommends to High Contracting Parties to take action: to endorse, in November of this year, the eleven guiding principles agreed upon by the GGE. These principles will, thus, fulfill an important guiding function. Germany looks forward to taking discussions forward in this GGE in a results-oriented way to ensure progress towards strong normative and operational frameworks for the RevCon 2020.

Guatemala

(11 October) The new technologies, artificial intelligence, are other issues that we cannot leave behind, and that is why we believe that the Conference of States Parties to the Convention on Certain Conventional Weapons is an appropriate forum to continue working on the subject, especially in the creation of an instrument that prohibits the so-called killer robots or lethal autonomous weapons.

(25 October) The scientific and commercial advances in the field of artificial intelligence and the use of technology for the development of new weapons must be prohibited by means of a legally binding instrument. Weapons commonly called killer robots, or lethal autonomous weapons, pose a serious danger to humanity if they are not controlled by a human.

India

(14 October) We look forward to further discussions in the GGE on LAWS to explore and agree on possible recommendations based on the 11 principles which have enjoyed consensus support in the GGE.

(24 October) We welcome the progress made in the GGE on LAWS over the last three years since its establishment in 2017. We remain convinced that the CCW is the relevant forum to address this issue. We support continued substantive technical discussions in the GGE on LAWS within the CCW context with the participation of all relevant stakeholders.

(29 October) As mandated by the 2018 resolution, the UN Secretary-General has submitted an updated report in 2019 on the recent developments in science and technology and their potential impact on international security and disarmament efforts. This report, as contained in document A/74/122, touches upon a range of topics such as autonomous technologies, uncrewed aerial vehicles, biology and chemistry, advanced missile technologies, space-based technologies, materials technologies and ICT technologies, providing an update on various activities that have taken place over the year across these domains.

Ireland

(16 October) Ireland is also heavily engaged in work within the Convention on Conventional Weapons on addressing the considerable ethical, moral and legal dilemmas posed by the development of lethal autonomous weapon systems. Our perspective is informed by our commitment to multilateralism and a deep desire to ensure that technological developments do not outpace our collective ability to ensure full compliance with international humanitarian law. It remains our firm belief that such weapons must always remain under meaningful human control, and that only human accountability can ensure full compliance with international humanitarian law.

(24 October) We are encouraged that the GGE on LAWS agreed to recommend an extended list of Guiding Principles to the 2019 Meeting of High Contracting Parties. It is our firm belief that such weapons must always remain under human control, and that only human accountability can ensure full compliance with IHL. The fast pace of technological developments presents a compelling incentive for us to accelerate our efforts and agree on tangible outcomes.

Israel

(24 October) In Israel’s view, the fact that the Convention on Certain Conventional Weapons strives to strike the necessary balance between military necessity and humanitarian considerations in the application of International Humanitarian Law makes it an important instrument in the conventional field. It is also an appropriate forum for discussing many challenges in this sphere. Israel values its principles very much and finds the in-depth discussions in the GGE on Lethal Autonomous Weapons Systems (LAWS) satisfying and beneficial.

Italy

(14 October) We welcome the agreement on the eleven guiding principles as a substantive outcome of the Group of Governmental Experts on Lethal Autonomous Weapons Systems, and we believe it is important to further discuss this issue, bearing in mind the importance that International Humanitarian Law applies to all weapons systems and any existing or future weapon system must be subject to human control, particularly in relation to the ultimate decision to use lethal force.

(24 October) Italy especially welcomes the in-depth work carried out by the GGE on LAWS Group of Governmental Experts on Lethal Autonomous Weapons Systems. We are of the view that human control is fundamental to ensure that all weapons systems are developed, deployed, and used in compliance with International Humanitarian Law. In particular, we deem it necessary for the decisions to use lethal force and to produce lethal effects to remain in the hands of human beings. We believe that the continuation of the work of the GGE in the next two years will provide the opportunity to further discuss aspects relating to the characteristics and implications of LAWS with a view to paving the way towards a consensual political declaration.

Japan

(14 October) Given the potential impact of accelerating scientific and technological developments on conventional weapons, we welcome the active discussions on LAWS held at the GGE within the CCW framework, and will continue our contribution to the thorough and constructive discussions.

(24 October) Japan welcomes the adoption by consensus of the 11 guiding principles at the LAWS GGE and the decision to continue the discussion and to work on recommendations towards the 2021 Review Conference. Future efforts must build on existing achievements. In this context, Japan attaches importance to the development and elaboration of guiding principles, in particular, the principle related to human-machine interaction. We need to identify and examine a range of factors in determining the quality and extent of human-machine interaction.

Kazakhstan

(11 October) The development of lethal autonomous weapons systems becomes a particular source of concern, as well. LAWS have the potential to challenge the most basic principles of both international law and international humanitarian law. It remains to be proven that an autonomous weapon system would be able to comply with the three fundamental principles of IHL namely those of proportionality, distinction and precautions in attack.

(24 October) Possible development of Lethal Autonomous Weapon Systems (LAWS) remains a particular source of concern in our modern world. We supported the creation of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS). LAWS have the potential to challenge the most basic principles of international law, in particular the IHL. It remains to be proven that an autonomous weapon system would be able to comply with three fundamental IHL principles. We are all aware of the influence of Artificial Intelligence on the future of our countries and the entire world. We are thus equally vigilant of the dangerous consequences arising from the development of new technologies. Further work is required to deepen our understanding of LAWS, and so Kazakhstan believes that it is important to continue our discussions into 2020, as well.

Libya

(14 October) What causes new concern is also development of new weapons and technologies like drones and lethal autonomous weapons and cyber weapons. These weapons are being developed in a very fast manner and this can cause threats to peace and security, especially if those technologies are available to illegal groups and organizations. Thus it is incumbent on us to give more attention to these issues.

Liechtenstein

(17 October) Technical developments clearly point to a need for new regulation in the area of lethal autonomous weapon systems, in the form of binding standards to legally ensure a human component in the decision-making processes of such systems. An element of meaningful human control across the entire life cycle of lethal autonomous weapons systems is essential and helps to ensure compliance with applicable law, including international humanitarian law. Liechtenstein supports the Declaration on Lethal Autonomous Weapons Systems (LAWS) as it defines important common ground to take this agenda forward. Bringing the discussions conducted by the Group of Governmental Experts to the General Assembly might be commensurate to the urgency the Secretary-General and many States attach to this issue.

Lithuania

(25 October) We are convinced that discussions on new technologies and conventional weapons shall remain within the CCW framework.

Mexico

(24 October) We also reiterate the need for the international community to determine the future course of lethal autonomous weapons systems and the risk that these weapons represent if they are not subject to any substantive human control.

Myanmar (Burma)

(25 October) In today’s world of technology advancement, there are growing concerns on new types of weapons such as lethal autonomous weapon systems and their destructive power. Myanmar participated as an observer in the meetings of GGE in 2019. We thank the members of GGE for their hard work.

Namibia

(24 October) We favor a protocol prohibiting lethal autonomous weapons. Likewise, regulations are required for the use of weapons with certain autonomous capabilities, in particular, military drones. These kinds of weapons are totally incompatible with international humanitarian law. Finally, we hope that the First Committee will give the necessary impetus to the negotiations in Geneva of the open-ended Group of Governmental Experts on emerging technologies in the area of lethal autonomous weapons systems.

Netherlands

(14 October) With regard to Lethal Autonomous Weapons Systems: again multilateralism is key here. We commend the important work of the GGE on this issue; the Netherlands welcomes the outcomes of this year’s sessions. However, there is much work still to be done, and therefore we trust to see the mandate renewed in November.

(25 October) While recognizing the many potential benefits of the increased autonomy of weapons systems, we cannot turn a blind eye to the potential risks. We will continue to insist that humans exercise meaningful control and remain responsible at all times. We welcome the principles agreed by the GGE LAWS in August and look forward to continuing our work in this regard, with a focus on the application of existing international law.

Pakistan

(29 October) The development of Lethal Autonomous Weapons Systems or LAWS, has emerged as a major concern. LAWS are rightly being described as the next revolution in military affairs that would fundamentally change the nature of war. Their introduction will lower the threshold of armed conflicts and would affect progress on disarmament and non-proliferation. Any weapon system that delegates life and death decisions to machines is by nature unethical and cannot fully comply with IHL. The issue of LAWS does not only have legal ethical and technical dimensions, but also carries serious implication for regional and global security. The developments in the field of AI need to be appropriately regulated in all its dimensions. They should not outpace the evolution of regulations governing them. Pakistan, therefore, supports the development of an international legally binding instrument stipulating appropriate prohibitions and regulations on LAWS. The process launched six years ago within the CCW framework can only be sustained if it yields concrete results, heeding the concerns of all States. Besides the CCW, the international security dimensions of LAWS should be comprehensively addressed by the UN Disarmament Machinery, including by the CD.

Poland

(11 October) Emerging and disruptive technologies enable new methods and means of warfare, raising fundamental questions that cut across traditional concepts of international relations and international law. The acceleration of technological development and the proliferation of new systems are challenging multilateral regulatory frameworks and inter-governmental processes. From a peace and security perspective, there are concerns about the ability of new weapons to destabilize security relations and increase unpredictability. This might be the case, for example, with hypersonic weapons or anti-satellite systems. There are concerns over the potential of new technologies to be used to conduct malicious activities that fall short of traditional thresholds for the use of armed force, as illustrated by recent examples of hybrid warfare. Due to the rapidly evolving nature of technology-related challenges, legally binding instruments might not provide us with adequate solutions. We should rather look for more pragmatic solutions, starting with increased transparency and confidence-building measures.

(24 October) The development of the military capabilities of tomorrow requires special attention. This applies also to lethal autonomous weapons systems, as the universal application of artificial intelligence still lies in the future. We should ensure that its development, implementation, and possible use will remain in compliance with international law, in particular with international humanitarian law.

Portugal

(14 October) On new and emerging threats relating to lethal autonomous weapons systems, cyberspace, and militarization of outer space, we should encourage the implementation of norms of responsible state behavior, transparency, and respect for international law, human rights, and international humanitarian law. Portugal welcomes the outcome of the 2019 session of the GGE on LAWS.

(24 October) Portugal also welcomes the outcome of the 2019 session of the GGE LAWS.

Republic of Korea

(24 October) We welcome the progress made this year by the Group of Governmental Experts on lethal autonomous weapons systems within the framework of the CCW. We hope that collective efforts through the GGE process will continue until consensus on normative and operative frameworks on emerging technologies in the area of LAWS is reached.

Russia

(25 October) We continue to take a skeptical approach to the prospects of an accelerated consideration of and especially the adoption of any decisions on lethal autonomous weapons systems. Our reasons for this are simple. There is a lack of any working models or samples of such systems. Due to this, there is an evident inability to come to a common understanding regarding the basic characteristics and concepts relating to lethal autonomous weapons systems. We also cannot ignore the significant disparities in views between the participants in discussions on the topic. However, we have demonstrated a constructive approach and we support the ongoing discussion of this topic within the relevant Group of Governmental Experts.

San Marino

(16 October) We also need to reflect on the use of armed drones and fully autonomous weapons, fields in which international standards should be discussed and developed. There are deep ethical and legal doubts that need to be addressed. San Marino believes that meaningful human control is required over life and death decisions.

South Africa

(24 October) South Africa reaffirms its commitment to the Certain Conventional Weapons Convention (CCW) and the humanitarian principles enshrined in it. We attach great importance to this framework Convention, as evidenced in our ratification of all the Protocols annexed to the Convention. We also reaffirm our support for the work of the open-ended Group of Governmental Experts (GGE) to discuss emerging technologies in the area of lethal autonomous weapons systems (LAWS).

Spain

(15 October) The Group of Governmental Experts on lethal autonomous weapons systems has, in its 2019 session, created the basis for advancement in this area. We maintain our proposal to establish a code of conduct that includes measures of transparency, confidence building, information exchange, and best practices, including possible advances in the field of artificial intelligence.

(25 October) Technological development poses new challenges in the field of conventional weapons. This is the case for so-called lethal autonomous weapons. To the extent that they transcend current international humanitarian law, which rests on combat decisions taken by humans, we must strive to define an adequate legal framework that, without undermining the legitimate security interests of each country, guarantees the maintenance of the guiding principles of international humanitarian law. Spain is in favor of establishing a political declaration and a possible code of conduct that includes transparency and confidence-building measures, exchange of information, and best practices in this field.

Sri Lanka

(11 October) Sri Lanka supports the ongoing discussions within the framework of the CCW GGE on lethal autonomous weapons systems (LAWS), and encourages the continuation of the GGE process. While recognizing the positive benefits that could accrue from the dual-use nature of the technology, new technological developments including 3D printing, synthetic biology, Artificial Intelligence, and the development of LAWS devoid of any human control have created unprecedented risks and challenges to humanity. These are matters which, if not regulated, have the potential to threaten international peace and security. We encourage States Parties to the CCW to deepen and fast-track the discussion within the GGE to urgently address the issues of possible development and deployment of LAWS. Given the increasing ability of non-state actors and criminal elements to acquire ever more sophisticated weapons, we must be aware of the peril that these new technologies and weapons pose when falling into the wrong hands. It is in this context that we encourage States with capabilities to develop autonomous weapons to take immediate action in placing national moratoria, as a temporary regulatory measure, and engage fully within the GGE discussions. Sri Lanka, however, calls for the negotiation of a binding legal framework, which, inter alia, provides for regulatory norms with meaningful human control as its central thrust.

(25 October) While we recognize the positive benefits that could accrue from the dual-use nature of new technologies, new developments including 3D printing, synthetic biology, artificial intelligence (AI), and lethal autonomous weapons systems, also commonly known as killer robots, devoid of any human control have created unprecedented risks and challenges to humanity. These are factors which, if not regulated, have the potential to threaten international peace and security. We encourage States Parties to the CCW to deepen and fast-track the discussions within the GGE to address this issue of possible development and deployment of LAWS. There is a crucial need for negotiation of a binding legal instrument which provides for regulatory norms with meaningful human control as a central thrust.

Switzerland

(14 October) With regard to lethal autonomous weapons systems, the work undertaken within the framework of the Convention on Certain Conventional Weapons (CCW) has already clarified some key elements, in particular that international humanitarian law applies fully to these weapons and that human-machine interaction is an important dimension. However, significant efforts will still be needed to clarify aspects of the normative and operational framework for the use of such weapons.

(23 October) Whilst we welcome the progress made by the CCW Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS), we believe that it must intensify its efforts to achieve more substantive results. For the Group to make the most of the limited time available, it should focus its attention on developing concrete measures that clarify the operational and normative framework governing these weapons so as to more effectively circumscribe the challenges that these weapons pose.

Turkey

(25 October) Issues such as Improvised Explosive Devices (IEDs) and Lethal Autonomous Weapons Systems (LAWS) continue to be important.

United Kingdom

(23 October) We welcome the progress made this year by the CCW’s GGE on lethal autonomous weapons systems. We look forward to the formal endorsement of the guiding principles affirmed by the GGE at this year’s Meeting of High Contracting Parties, and welcome the continuation of the GGE’s mandate into the 2021 Review Conference.

United States

(23 October) Some States have expressed concerns regarding lethal autonomous weapons systems (LAWS). We have engaged in the Convention on Certain Conventional Weapons (CCW) GGE on Emerging Technologies in the Area of LAWS since its inception in 2017, and are prepared to continue. The GGE has made significant progress on this complex topic, and we urge the CCW High Contracting Parties to endorse the recommendations in the 2019 GGE report, in particular to continue the GGE’s work through 2021. As a community, we should work to understand better the potential risks and benefits that are presented by weapons with autonomous functions in terms of compliance with International Humanitarian Law (IHL).

European Union

(23 October) On the theme of emerging technologies in the area of lethal autonomous weapons systems, the EU welcomes the outcome of the 2019 session of the open-ended Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS) as a good basis for further progress, notably the agreement on the 11 guiding principles and the importance of human-machine interaction. We emphasize that human beings should make the decisions with regard to the use of lethal force, exert control over lethal weapons systems they use, and remain accountable for decisions over life and death in order to ensure compliance with international law, in particular international humanitarian law and international human rights law. We call upon all High Contracting Parties to constructively engage in order to agree on substantive recommendations on aspects of a normative and operational framework ahead of the 2021 CCW Review Conference. We recall that the CCW is the relevant international forum in this regard, combining legal and military expertise and involving the private sector and civil society. The CCW must remain responsive to fast-paced developments in the field of weapons technology, be able to adequately address them, and ensure that international legal frameworks remain appropriate.

Non-Aligned Movement (NAM)

(10 October) There is an urgent need to pursue a legally binding instrument on lethal autonomous weapons systems (LAWS). The issues surrounding LAWS should be deliberated thoroughly in conformity with international law, including international humanitarian law and international human rights law. NAM States Parties to the Convention on Certain Conventional Weapons (CCW) welcome the consensus adoption of the 2019 Report of the GGE on LAWS.

(24 October) Lethal autonomous weapons systems should be thoroughly deliberated and examined in the context of commitment to international law, including international humanitarian law and international human rights law. In this regard, NAM States Parties to the CCW have agreed that there is an urgent need to pursue a legally binding instrument on LAWS.

Nordic Countries

(10 October) We welcome the progress made on lethal autonomous weapons systems in the Group of Governmental Experts and are looking forward to continued work within the framework of the CCW, the appropriate framework for this issue.

(23 October) The Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) has been an extremely valuable venue for international work on this multifaceted and exceptionally complex arms control topic. Progress has indeed been made, including on the now 11 Guiding Principles. Strict adherence to International Law, and, in particular, International Humanitarian Law, is and must continue to be the cornerstone of all weapons use. High Contracting Parties should seize the opportunity to consider and clarify the normative and operational framework for LAWS. This should be done in the Geneva GGE, which we see as the appropriate forum for this topic.