
Gender and killer robots

Gender norms and patriarchal power structures affect the way we view, use, and engage with weapons, war, and violence. Autonomous weapons could be used to commit acts of gender-based violence and could increase inequality as a result of algorithmic bias or target profiling.

New international law regulating autonomous weapons and banning systems that target people or operate without meaningful human control would support feminist foreign policy goals by focusing on human security and preventing the militarisation of emerging technologies.

What’s gender got to do with it?

Gender-based violence could easily be enacted by autonomous weapons that select and engage targets on the basis of target profiles. We have already seen target profiling based on gender in the use of semi-autonomous weapons such as armed drones, which have been used to target militants (or count them as legitimate targets in casualty recording) based on their appearance as “military-aged males”. In this case, assumptions about men as potential or active combatants reinforce gender norms regarding male violence, which in turn legitimises them as targets – feeding into the cycle of gender-based violence.


Proponents of killer robots argue that autonomous weapons wouldn’t get hungry or tired, wouldn’t feel pain, fear, or anger, and wouldn’t act in self-defence or make rash decisions in the heat of the moment. But as inanimate objects, such weapons systems would also lack empathy, conscience, emotion, and understanding of human rights and human dignity.

These elements of human judgment are crucial for making the complex ethical and moral decisions required of soldiers in combat.

The development and use of autonomous weapons would only further dehumanise warfare and killing, and perpetuate patriarchal structures of military violence.

Autonomous weapons would select and engage targets on the basis of sensor processing rather than an immediate human command. In essence, this would reduce humans to patterns of data or lines of code. This becomes even more dangerous when considering the bias that could be programmed into autonomous weapons.

Emerging technologies like facial and vocal recognition have been shown to have high failure rates in recognising women, people of colour, and persons with disabilities. The use of autonomous weapons that rely on these technologies would likely put these groups, and anyone who does not fit within the ‘norm’ determined by the programmer, at greater risk.

Killer robots would not end sexual violence in conflict, but would likely perpetuate it. Autonomous weapons, void of human compassion or doubt, would not question an order to rape if programmed to do so. Rape and sexual violence are used as weapons in conflict, and are already ordered by states and armed groups as a matter of strategic policy and as a means of inflicting terror. Autonomous weapons would be even less likely than human soldiers to disobey orders to commit rape, given their lack of conscience, empathy, or understanding of the act or consequences of sexual violence.


Securing a Feminist Future

In recent years, a small but growing number of governments, including Canada, France, Mexico, and Sweden, have adopted feminist foreign policies. While these policies are being implemented to varying degrees and in various ways, ensuring meaningful human control over the use of force would support the feminist foreign policy approach and strengthen global peace and security.

A Feminist Foreign Policy is a framework which elevates the everyday lived experience of marginalised communities to the forefront and provides a broader and deeper analysis of global issues. It takes a step outside the black box approach of traditional foreign policy thinking and its focus on military force, violence, and domination by offering an alternate and intersectional rethinking of security from the viewpoint of the most marginalised. It is a multidimensional policy framework that aims to elevate women’s and marginalised groups’ experiences and agency to scrutinise the destructive forces of patriarchy, capitalism, racism, and militarism.
Centre for Feminist Foreign Policy

Securing a feminist future

Killer robots or fully autonomous weapons systems will exacerbate existing systems of inequality and oppression and may be used to enact gender-based violence. This video was made for International Women's Day 2020.

Human rights implications of killer robots

Rasha Abdul Rahim, now Director of Amnesty Tech, speaks to us about the human rights implications of killer robots.


Intersectionality, racism and killer robots

What is intersectionality? And why is it important when we are discussing killer robots and racism? With historical and theoretical roots in Black feminism and women of colour activism, intersectionality is a concept that acknowledges all forms of oppression, such as ableism, classism, misogyny, and racism, and examines how these oppressions operate in combination.


Gender and bias

Ray Acheson of the Women's International League for Peace and Freedom on 'What does gender have to do with killer robots?'


Profiling and killer robots

Professor Lucy Suchman of the International Committee for Robot Arms Control speaks about profiling and autonomous weapons. Photo: Charlotte Perhammar


Join us

Keep up with the latest developments in the movement to Stop Killer Robots.
