
Artificial intelligence and automated decisions: shared challenges in the civil and military spheres

This paper provides an initial sketch of responses to AI and automated decision-making in wider society while contextualising these responses in relation to autonomy in weapons systems.

For more than nine years, autonomous weapons systems have been the subject of international discussion in various fora, including the UN Human Rights Council, the UN General Assembly First Committee on Disarmament and International Security, and the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE on LAWS). In these discussions, states, United Nations agencies, international organisations, and non-governmental organisations have highlighted the serious ethical, moral, humanitarian, and legal implications of artificial intelligence (AI) and autonomous weapons systems. Despite a majority of states supporting the negotiation of a legal instrument, the Sixth Review Conference of the CCW in December 2021 failed to agree on a mandate to work towards any form of regulation.

In compiling the report, forty states were identified that have publicly released specific policy documents or other strategies on the domestic development and use of artificial intelligence. The report assesses these national AI strategies and positions, along with EU-level reports and regulations, international guidelines, and other documents, in order to draw out core themes and concerns regarding the adoption and use of AI and automated decision-making technologies in the civil sphere.

Automated Decision Research
