Representatives from both sides of the House of Commons in the United Kingdom agree that fully autonomous weapons raise numerous concerns warranting further deliberation, including at the international level. In a parliamentary adjournment debate held late in the evening of 17 June 2013, however, Alistair Burt, Parliamentary Under Secretary of State at the Foreign and Commonwealth Office, emphasized that the government does not support the call for a moratorium on these future weapons, described in the debate as “lethal autonomous robotics,” which would select and attack targets without further human intervention.
Burt said the statement that ‘robots may never be able to meet the requirements of international humanitarian law’ is “absolutely correct; they will not. We cannot develop systems that would breach international humanitarian law, which is why we are not engaged in the development of such systems and why we believe that the existing systems of international law should prevent their development.” He emphasized that as a matter of policy, “Her Majesty’s Government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.”
Burt thanked Labour MP Nia Griffith for raising “this … important subject which will inevitably become ever more so as technology develops.” Griffith, who is Shadow Wales Minister and Vice-Chair for the All-Party Parliamentary Group on Weapons and Protection of Civilians, urged the government to support the call for a pre-emptive and comprehensive ban on fully autonomous weapons, achieved through an international treaty.
Griffith also called on the government to explain its position on the recommendations of a report on “lethal autonomous robotics” by the UN Special Rapporteur on extrajudicial, summary or arbitrary executions, including the report’s call for a “halt” or moratorium until an international framework can be established.
On the question of a moratorium, Burt said, “The UK has unilaterally decided to put in place a restrictive policy whereby we have no plans at present to develop lethal autonomous robotics, but we do not intend to formalise that in a national moratorium. We believe that any system, regardless of its level of autonomy, should only ever be developed or used in accordance with international humanitarian law. We think the Geneva conventions and additional protocols provide a sufficiently robust framework to regulate the development and use of these weapon systems.”
When the UN expert report was presented at the Human Rights Council on 30 May 2013, representatives of two dozen states acknowledged that further discussion is needed and many listed the concerns expressed in the report. At the time the UK said it did not support the call for a moratorium and commented that there are “more appropriate” places to consider the issue, outside the Human Rights Council.
During the parliamentary debate Burt proposed that the issue of lethal autonomous robotics be discussed at the international level by states parties to the 1980 Convention on Conventional Weapons, noting that it “seems the right place for this important issue.” This framework convention contains several protocols on weapons that are excessively injurious or have indiscriminate effects, such as the preemptive ban on blinding lasers achieved in 1995, the last time the CCW prohibited a weapon. It last adopted a new protocol a decade ago, in 2003, to address explosive remnants of war.
The UK government has elaborated its policy several times in recent months. In April, the UK Ministry of Defence said that, “There are no plans to replace skilled military personnel with fully autonomous systems. … Although the Royal Navy does have defensive systems, such as Phalanx, which can be used in an automatic mode to protect personnel and ships from enemy threats like missiles, a human operator oversees the entire engagement.”
In the parliamentary debate, Griffith said the UK should take “a leading role” and “use our considerable standing on the world stage” to address “lethal autonomous robotics.” These affirmative policy statements committing to ensure that robotic weapons are always under meaningful human control could form the basis for a strong policy, one that could see the UK contribute to an international treaty preventing the removal of humans from decisions to use lethal force.
For more information see: