
Statement on Ethical Considerations to UNGA meeting on autonomous weapons systems

Delivered by Salena Barry, Digital Communications Lead, Stop Killer Robots, 12 May 2025

 

Thank you, chair.

I speak on behalf of the Stop Killer Robots campaign. Stop Killer Robots represents over 270 civil society organisations in over 70 countries. Since our establishment in 2013, we have consistently called for states to negotiate a legally binding instrument that rejects the automation of killing and ensures meaningful human control over the use of force.

When we started out more than a decade ago, autonomous weapons were seen as a future problem. Yet today that is no longer the case: weapons systems with concerning levels of autonomy are reportedly being used in both Gaza and Ukraine.

Around the world, people of all backgrounds speak to our campaigners about their deep concern and fear about what the development and use of autonomous weapons will do to our world. On social media, members of the public express, time and time again, their concern about what it would mean for humanity and the value of human life if we delegate life and death decisions to machines. Dozens of Nobel Laureates, hundreds of faith groups and leaders, thousands of tech workers and AI experts, as well as young people, military veterans, and parliamentarians from around the world have all expressed their support for an international treaty to prohibit and regulate autonomous weapons systems.

The ethical concerns raised by autonomous weapons systems are wide-reaching and profound, and they are a major impetus behind the need for new international law to prohibit and regulate these systems.

Chair, we cannot allow autonomous weapons to target people. This would be an extreme form of digital dehumanisation: the reduction of people to data, which is then used to make decisions and take actions that negatively affect their lives. Digital dehumanisation in armed conflict is already contributing to serious harm and raises significant ethical and moral issues for civilian and military victims alike. Autonomous weapons do not ‘see’ you as a human being. Instead, you are ‘sensed’ by the machine as a collection of data points. No machine, computer, or algorithm is capable of recognising a human as a human being, nor can it respect humans as inherent bearers of rights and dignity, much less grasp what it means to have, or to end, a human life.

Anti-personnel systems would also inevitably have disproportionate impacts on already marginalised communities by encoding and reproducing in data sets the biases of our societies, such as sexism, racism, ableism, and other forms of discrimination. This is not only a legal and technological consideration, but an ethical one. We cannot allow people to be killed or injured as a result of data bias.

The dictates of the public conscience are clear: We need an international treaty that will ensure meaningful human control over all weapons systems, ban systems that can’t be properly controlled, and ban systems that target people.

It is time to start drawing clear red lines to protect humanity now.  

Thank you.

