Gender norms and patriarchal power structures affect the way we view, use, and engage with weapons, war, and violence. Fully autonomous weapons could contribute to or exacerbate notions of militarized masculinities, be used to commit acts of gender-based violence, and increase inequality as a result of algorithmic bias or target profiling.
Gender-based violence could easily be enacted by fully autonomous weapons that select and engage targets on the basis of target profiles. We have already seen target profiling based on gender in the use of semi-autonomous weapons such as armed drones, which have been used to target militants (or count them as legitimate targets in casualty recording) based on their appearance as “military-aged males”. In this case, assumptions about men as potential or active combatants reinforce gender norms regarding male violence, which in turn legitimize men as targets, feeding into the cycle of gender-based violence.
Proponents of killer robots argue that fully autonomous weapons wouldn’t get hungry or tired, feel pain, fear, or anger, and wouldn’t act in self-defence or make rash decisions in the heat of the moment. But as inanimate objects, such weapons systems would also lack empathy, conscience, emotion, and an understanding of human rights and human dignity. These faculties of human judgment are crucial for making the complex ethical and moral decisions required of soldiers in combat. The development and use of fully autonomous weapons would further dehumanise warfare and killing, and perpetuate patriarchal structures of military violence.
Fully autonomous weapons would select and engage targets determined by sensor processing, using sensor-identifiable characteristics without meaningful human control. In essence, this would reduce humans to patterns of data or lines of code. This becomes even more dangerous when considering the possibility that bias could be programmed into the algorithms contained in fully autonomous weapons. Emerging technologies like facial and vocal recognition have been shown to have high failure rates in recognizing women, people of colour, and persons with disabilities. The use of fully autonomous weapons would likely result in higher risk for these groups, and for anyone who does not fit within “the norm” as determined by the programmer.
Killer robots would not end sexual violence in conflict, but would likely perpetuate it, as fully autonomous weapons would not question an order to rape if programmed to do so. Rape and sexual violence are used as weapons in conflict, and are already ordered by states and armed groups as a matter of strategic policy and as a means of inflicting terror. Fully autonomous weapons would be even less likely than human soldiers to disobey orders to commit rape, as a result of their lack of conscience, empathy, or understanding of the act or consequences of sexual violence.
More than 90 countries, including the United States, Israel, China, South Korea, Russia, and the United Kingdom, are participating in UN talks on the concerns raised by fully autonomous weapons, also known as lethal autonomous weapons systems. But after eight meetings since 2014 with no credible outcome, the pace of diplomacy is struggling to keep up with technological advances. Bold political leadership is urgently needed to launch negotiations on an international treaty to prohibit fully autonomous weapons and retain meaningful human control over the use of force. Take a stand by endorsing the call to ban killer robots.