Neon sign reading ‘Game on’

Techno-optimism, human rights and killer robots…

Farah Bogani is the Project Officer for The Campaign to Stop Killer Robots, based in Ottawa, Canada. She previously worked at Amnesty International’s Secretariat office in London, UK.

It’s hard to be taken seriously when you’re a young woman actively working against the development of new technologies. Most of the time when I talk about killer robots, I get raised eyebrows: sometimes of interest, other times of scepticism.

Robots have been part of sci-fi lore for so long that they don’t register as an immediate threat in reality. But we’re not talking about the humanoid robots you’ve seen in movies; we’re talking about real weapons systems that are programmed to autonomously select targets and kill. Watching the recent explosion of surveillance and artificially intelligent ‘solutions’ to COVID-19, it seems to me that the urge toward techno-optimism does not take into account the longer-term impact these technologies will ultimately have. This is, and should be, concerning to other young people because of what it implies for our human rights and our futures.

The reality is that autonomous weapons systems would not be limited to use in conflict zones, and our concerns about these weapons should not stop there: we could be seeing them in police forces in the near future. Their development and use would not only constitute gross violations of human rights, but would also amount to an excessive, arbitrary, and unnecessary use of force.

While fully autonomous weapons have been discussed in the context of conflict and humanitarian disarmament at the UN (under the term lethal autonomous weapons systems, or LAWS), it is increasingly important to consider their development and use in non-conflict situations. As a young campaigner and human rights activist, I find the killer robots issue even more urgent when I consider the huge risks these weapons pose to those who attend protests: less ‘killer’, perhaps, but no less dangerous. As civil and political rights come under fire and surveillance powers expand, it’s essential that governments act now to prevent these weapons from becoming a reality. “Non-lethal” or “less lethal” autonomous weapons do not yet exist, but they are not far off, and it is the normalisation of these technologies that presents the biggest threat to the future of our human rights.

Colourful peaceful protest march passing under a bridge as police officers on the bridge look down

Photo by Alex Radelich.

Surveillance drone technology is not new, but its application to pandemic policing means it’s not hard to imagine such drones being repurposed into “less lethal” autonomous weapons. After all, the US has flown military-grade Predator drones over cities during protests, while Israel has used remotely piloted drones to disperse tear gas over Palestinian protesters. Would it be out of the realm of possibility, then, to imagine these weapons operating autonomously, hovering over a crowd of protesters ready to disperse chemical agents? Indeed, the same surveillance technology that monitors physical distancing may be able to identify or predict certain types of activity, alerting an autonomous weapon to fire on crowds with non-lethal ammunition or tear gas. Surveillance drones that use sound could be equipped to emit noises that disorient and confuse in an effort to disperse protesters. And while everyone has the right to peaceful protest under the Universal Declaration of Human Rights, the growth of surveillance powers has led protesters to become more vigilant about protecting their identities from persecution by law enforcement.

Protest sign reads ‘What lessens one of us lessens all of us’.

Photo by Michiele Henderson.

Yet thanks to COVID-19, the increased masking of civilians has prompted developers of facial recognition technology to adjust their approach so that systems can recognise faces even when they are covered by masks. As a result, many protesters who would ordinarily wear masks to obscure their identity from authorities may soon lose that ability to “hide”. If “less lethal” autonomous weapons were outfitted with facial recognition, they would not only pose risks to prominent protesters and other civilians with a history of protesting, but would also entrench bias, worsening and codifying racial profiling. Finally, geotagging, whether through phones using certain apps, photos, videos, or other media, can give authorities valuable location data that allows them to track individuals’ whereabouts and the people they come into contact with. This kind of technology can be co-opted to force people to stay in certain areas, restricting their right to move freely, and may even prevent them from gathering in crowds to protest. This is particularly concerning for those who live in highly policed, occupied, or controlled territories, where authorities may force people to remain within their designated areas.

The normalisation of this trend towards technological solutions is concerning for the future: while people may accept the technological innovations of today, we are not prepared for their application in the fully autonomous weapons systems of tomorrow. The human rights risks posed by the development and use of this technology are what make the issue so urgent, and autonomous weapons should not be treated as a problem isolated to conflict situations. As law enforcement becomes increasingly militarised, these weapons could well become part of your own “backyard”. As a young activist, these are not the kind of weapons I want to face when I protest, and nobody else should have to face them either.

Fully autonomous weapons would delegate life-and-death decisions to machines, programs, and algorithms, crossing an ethical red line, contravening laws designed to protect civilians, threatening human rights, and destabilising global security. Want to hear more about the human rights risks posed by fully autonomous weapons, and why it is not too late to stop them? Register before 20 July to join us at RightsCon on 30 July 2020.

Picture of Farah Bogani in front of a UN sculpture and a crowd shot with ‘Justice Now’ on a sign



Original article posted on Medium.com.
