UN Human Rights Council Event: The Threat of Autonomous Weapons Systems to International Human Rights Law
The event provided a critical assessment of the dangers posed by autonomy in weapons systems to international human rights law, including the right to life, the prohibition of inhuman and degrading treatment, freedom from discrimination, equality before the law, the presumption of innocence, and the right to privacy.
Panelists evaluated the dangers of bias in the development and use of autonomous weapon systems and the impact on the human rights of marginalized groups. The discussion also highlighted the relevance of international human rights law to situations of armed conflict and considered pathways towards a legal framework to safeguard against the dangers posed by autonomous weapons systems.
The event was attended by UN office holders, diplomats working in human rights and disarmament, researchers, and civil society representatives from around the world.
As international momentum builds towards launching negotiations on a new international treaty on autonomous weapons systems, this event provided a space for considering the wide range of human rights concerns that need to be addressed in delivering a framework that adequately safeguards against the dangers posed by autonomy in weapons systems.
Stop Killer Robots will continue to engage with work at the UN Human Rights Council and other forums towards achieving a new international treaty that the world urgently needs. We look forward to collaborating with all states, researchers and technologists, civil society organizations, and other interested parties in achieving a successful outcome.
Watch the video of the event below.
|Richard Moyes is Managing Director of UK-based NGO Article 36 and coordinates the campaign to Stop Killer Robots. Article 36 was a founding member of the campaign and has been a leading source of policy thinking on the issue – including through the concept of ‘meaningful human control’. Richard has worked on the creation of a number of international legal and political instruments relating to weapons and violence – including the Convention on Cluster Munitions, the Safe Schools Declaration, and the Treaty on the Prohibition of Nuclear Weapons.
|Bonnie Docherty is a senior researcher in the Arms Division of Human Rights Watch and a lecturer on law and the associate director of armed conflict and civilian protection at the International Human Rights Clinic at Harvard Law School. Her many publications on autonomous weapons systems have helped build the case for a new treaty governing weapons that would select and engage targets without meaningful human control, and have helped define what the elements of that treaty should be. Her reports include “Shaking the Foundations”, which examines the human rights implications of killer robots.
|Katherine Chandler is an assistant professor in the Culture and Politics Program at the Walsh School of Foreign Service, Georgetown University. Her research studies intersections between gender, race, and technology in a global context. She authored the 2021 UNIDIR report, “Does Military AI Have Gender? Understanding Bias and Promoting Ethical Approaches in Military Applications of AI.” Her 2020 book, Unmanning: How Humans, Machines, and Media Perform Drone Warfare, analyzes how wartime experiments with technology align racial and gender stereotypes with global inequalities. Her current research studies how critical and anti-racist approaches to technologies can be linked to the gender, peace, and security agenda. www.katherinechandler.net/
|Taylor Woodcock is a Ph.D. researcher in public international law at the Asser Institute, University of Amsterdam. Her research examines the implications of the development and use of military applications of artificial intelligence (AI) for the international legal frameworks currently governing armed conflict, international humanitarian law, and international human rights law. This Ph.D. project is part of the research project Designing International Law and Ethics into Military Artificial Intelligence (DILEMA), funded by the NWO–MVI Programme on Responsible Innovation (2020–2024). Taylor can be reached via email at [email protected] or on Twitter @TaylorKWoodcock.
|Dr. Matt Mahmoudi is a Researcher/Adviser at Amnesty International, working with Amnesty Tech’s AI & Big Data team on developing research, policy, and advocacy on AI and human rights, with a particular focus on facial recognition technologies. Matt is a graduate of the Jo Cox Ph.D. Studentship at Pembroke College, Cambridge; he holds an MPhil from Cambridge and a BA in Politics with Business Management from Queen Mary University of London. Matt has contributed to and advised on several projects, including Africa’s Voices Foundation, the Rift Valley Institute, the UN OHCHR, and Global Rights Nigeria.