
Missile Systems and Human Control

An article that appeared on the front page of The New York Times on 12 November detailing the trend towards autonomous warfare by the United States and other nations with high-tech militaries has attracted significant interest. The in-depth piece by science writer John Markoff is the outlet’s first on the “killer robots” challenge since Bill Keller’s reflections in March 2013 and Nick Cumming-Bruce’s coverage of the May 2013 report by UN special rapporteur Christof Heyns, including his call for a moratorium.

The Campaign to Stop Killer Robots distributed copies of The New York Times article to delegates attending the annual meeting of the Convention on Conventional Weapons (CCW) in Geneva, where 118 nations agreed by consensus on 14 November to proceed with deliberations that began earlier this year on the matter of ‘lethal autonomous weapons systems.’ The next CCW meeting on the matter will be held at the UN in Geneva on 13-17 April 2015.

The New York Times article cites several missile systems under development or in use with various degrees and forms of human control, namely:

  • The US long-range anti-ship missile (LRASM) prototype developed by Lockheed Martin for the Air Force and Navy, launched from a B-1 bomber;
  • Norway’s Joint Strike Missile, planned for use by its fleet of advanced jet fighters;
  • Israel’s “Harpy” anti-radar missile, which detects enemy radar signals and destroys targets with a high-explosive warhead;
  • The UK’s Brimstone “fire and forget” missile, which it is using in Iraq.

As the international debate over autonomous weapons systems emerges, the autonomous features in these systems are coming under scrutiny. Mark Gubrud of the International Committee for Robot Arms Control (ICRAC) described the LRASM to The New York Times as “pretty sophisticated stuff that I would call artificial intelligence [operating] outside human control.” The Washington DC-based Center for a New American Security (CNAS) views them as “precision-guided weapons” that remain under control because “a person selects the targets they are engaging.” Human Rights Watch and others have called the missile systems examples of “precursors” to fully autonomous weapons systems. The Pentagon views the LRASM as “semi-autonomous.”

The debate has raised the question of whether these are examples of “killer robots” that should be banned. Are activists “shifting the goalposts” and going after present-day weapons now that official discussions are underway?

The Campaign to Stop Killer Robots calls for a preemptive ban on fully autonomous weapons systems. Its members question developments in the world’s arsenals as they keep a close watch on the targeting procedures and use of force associated with such weapons.

“Only through objective scientific analysis of new versions of current systems on the cusp of autonomy and their potential extensions can we determine if such systems are under appropriate human control,” said Professor Noel Sharkey, chair of ICRAC, a group of scientific experts that independently assesses weapons technologies to determine where they lie on the spectrum of autonomy. “Such scientific scrutiny is important in building a comprehensive picture of developments in weapons technology.”

More transparency would aid common understandings over how these existing and emerging weapons systems function and the degree of human control involved in their targeting and attack decisions. These and other systems with high degrees of autonomy must be examined in order to understand how meaningful human control is exercised and how civilians can be afforded adequate and lasting protections.

Photo: Campaign representatives review a Lockheed Martin video of the long-range anti-ship missile (LRASM)
