
Minority of states delay effort to ban killer robots

Yet again, a small group of military powers have shown an appalling lack of ambition and zero sense of urgency for achieving a meaningful result from the diplomatic talks on lethal autonomous weapons systems.

Many of the 90 states participating in this week’s United Nations meeting on these weapons expressed their firm desire to move to negotiate a new treaty to prohibit or restrict these weapons systems. Such a treaty is widely seen as necessary to enshrine the principle that states should maintain meaningful human control over the use of force.

Calls to ban killer robots are multiplying rapidly, and more than 4,500 artificial intelligence experts have called for a new treaty to prohibit lethal autonomous weapons systems in various open letters since 2015. They include Yoshua Bengio, Yann LeCun, and Geoffrey Hinton, who were this week fittingly awarded the Turing Award, the most prestigious prize in the field of computer science.

It’s clear that a majority of states want to do the right thing, but the calls from some states for guiding principles, declarations, guidelines, codes of conduct, compendiums of military “best practices,” questionnaires, and more committees are not the answer. Such measures will not satisfy public concerns.

There is rising concern that the Convention on Conventional Weapons (CCW) talks on lethal autonomous weapons systems are a way for military powers to placate civil society, distract public attention, and manage media expectations rather than seriously address the challenges these weapons pose for humanity.

Russia, Australia, Israel, the United Kingdom, and the United States spoke against any move to create a new treaty. These states are investing significant funds and effort into developing weapons systems with decreasing human control over the critical functions of selecting and engaging targets.

The many fundamental moral, ethical, legal, operational, technical, proliferation, international stability and other concerns with fully autonomous weapons are going to multiply rather than disappear. Delegating life-and-death decisions to machines crosses a moral “red line” and a stigma is already becoming attached to the prospect of removing meaningful human control from weapons systems and the use of force.

It’s increasingly clear that killer robots must be prohibited via a new treaty. After six years of talks involving more than 80 countries, the CCW talks still have not agreed on the regulatory approach necessary to prevent a future of fully autonomous weapons.

The only credible option for addressing the humanitarian, ethical, and international security challenges posed by fully autonomous weapons is for states to negotiate a new treaty to prohibit weapons systems that can select and engage targets without meaningful human control.

There is precedent for a ban treaty, including ones negotiated outside of United Nations auspices. In the past, responsible states found it necessary to supplement existing legal frameworks for weapons that by their nature posed significant humanitarian threats, such as biological weapons, chemical weapons, antipersonnel mines and cluster munitions. There is also precedent for a preemptive ban in CCW Protocol IV prohibiting laser weapons designed to permanently blind human soldiers.

This week’s meeting on lethal autonomous weapons systems at the UN in Geneva was the seventh CCW meeting on this topic since 2014. The meeting opened with a strong appeal from UN Secretary-General António Guterres for a ban on lethal autonomous weapons systems, which he called “morally repugnant and politically unacceptable.” The Secretary-General’s statement reminded states present that “the world is watching, the clock is ticking.”

 
