The challenge of killer robots is firmly on the agenda in Norway, where the ethics council of the $830 billion Norwegian Government Pension Fund Global has announced it intends to begin monitoring companies investing in the potential development of fully autonomous weapons systems. Ethics council chair Johan H. Andresen said the aim is to see if such investments would be contrary to the fund’s investment policies and ethical guidelines. He described the initiative as “a statement of fair warning, a heads-up.”
This move by the world’s largest pension fund should spur similar funds and other investors to take a critical look at their investments amid rising concerns over weapons that would select and attack targets without meaningful human control. It also sends a strong signal in Oslo that it is time for Norway to get serious about dealing with fully autonomous weapons.
The measure is outlined in a four-page article in the ethics council’s annual report on “the weapon criterion and the development of autonomous weapons.” The report affirms that the council will monitor developments in the process underway at the Convention on Conventional Weapons (CCW) “or alternative processes outside the CCW framework” and “keep abreast with technological developments in the area through our contacts with various interested parties.” According to the report, if companies begin to develop autonomous weapons, the ethics council would “consider dealing with individual cases under the conduct-based criteria.”
According to the ethics council report, issues relating to autonomous weapons are relevant to section 2a of the fund’s ethical guidelines, which states that the fund shall not invest in companies that produce weapons that violate fundamental humanitarian principles through their normal use.
In an interview with Reuters, Andresen said, “If you think about developing technology for recognizing cancer, that is fine. But if you are adapting it to track down a certain type of individual in a certain environment, and cooperating with others to make an autonomous weapon out of it, don’t be surprised if we take a look at you.”
Following past recommendations by the council, the fund does not invest in companies involved in the manufacture of landmines, cluster munitions, nuclear weapons, or weapons covered by the CCW protocols on non-detectable fragments, incendiary weapons, and blinding laser weapons.
In the event that states adopt a new CCW protocol on lethal autonomous weapons systems, on which talks have been underway since 2014 with another round due in April, the report states that “it will be natural for autonomous weapons to be added to the list of weapon types that provide grounds for the exclusion of companies under the Fund’s ethical guidelines, in the same way as it has done” before. But the council finds “there is little likelihood of an agreement under the CCW that prohibits autonomous weapons being negotiated in the foreseeable future” and highlights the possible need to pursue an alternative process outside the CCW.
According to the council’s report, an alternative to considering autonomous weapons under the guidelines’ product-based exclusion criteria could be to consider them under the criteria for “conduct-based exclusions.” This means “if autonomous weapons used against humans entail an infringement of individuals’ rights in war and conflict situations, it can be considered whether the production of autonomous weapons may be covered by section 3b of the guidelines, or alternatively section 3f, i.e. the guidelines’ criteria for conduct-based exclusion.”
Andresen has emphasized that the council can only look at the companies involved and not the weapons technology itself. In an NRK interview, he said the main question for the council is whether such weapons would violate fundamental humanitarian principles as well as conventions that Norway has ratified. Andresen warned of a potential arms race that would be “completely out of control.” The council’s report finds that “One of the starting points for the assessment may be that the actual concept of autonomous weapons – that decisions of life and death are left up to machines – is in principle and intrinsically a problem. A more limited assessment is whether it is possible to envisage a use of autonomous weapons that does not contravene the … humanitarian principles for warfare.”
The Government Pension Fund Global is not a conventional pension fund, as it derives its financial backing from Norway’s oil profits, rather than pension contributions. The fund invests in 9,000 companies in 75 countries in accordance with its ethical guidelines.
Previously, in January 2015, Andresen spoke briefly to the Norwegian business daily DN (Dagens Næringsliv) after attending a World Economic Forum session in Davos where speakers raised concerns about fully autonomous weapons. When asked by DN if the council should propose excluding investments in companies involved in producing the weapons, Andresen said he hoped such a decision would not need to be taken if efforts to preemptively ban the weapons succeed.
Since 2013, Norwegian groups have held several events to discuss the challenges posed by fully autonomous weapons. Most recently, on 29 January 2016, Steve Goose, arms director of Human Rights Watch, which coordinates the Campaign to Stop Killer Robots, spoke about the coalition’s concerns and objectives at a public talk in Oslo organized by Agenda and campaign co-founder Norges Fredslag. The event featured a discussion with representatives from the Peace Research Institute of Oslo (PRIO) and the International Law and Policy Institute (ILPI) that saw strong reactions from audience members on the need for Norway to act on this issue.
Photo: An offshore gas platform operated by Statoil ASA in the Oseberg North Sea oil field 140 kilometers from Bergen, Norway. © Kristian Helgesen/Bloomberg, 2014