I’m Richard Moyes – a middle-aged, middle-class, white guy who works for Article 36, an NGO that specialises in developing strategies to protect civilians from weapons.
Targeted advertising, bank loan approvals, predictive policing, autonomy in weapons… automation is increasingly becoming part of our lives. What types of digital dehumanisation are you most worried about, and why?
I feel this issue in terms of bureaucratisation. In all of these areas, automation and AI can extend and accelerate bureaucracy. I don’t mean more ‘red tape’. I mean more fitting the world into little boxes. Through those boxes we can lose control of our relationship with the world and with one another. Controlling the labelling and categorisation structures of society is a way of exerting power.
Tell us about a time when you saw or experienced discrimination, risk, or oppression resulting from the use of a technology, program, or algorithm.
My background is working on the impact of weapons. A child in a refugee camp in Eritrea once walked up to us holding an unexploded cluster bomb that had been made in the UK. I had been to the offices of the company that made them, and it makes you think of the distance that can exist between the people designing, making and profiting from technologies, and the people left holding the risk.
Technology isn’t intrinsically good or bad. How do we stand up for technology that benefits humanity whilst preventing harm or unintended consequences?
Hmmm, but technology isn’t neutral either – it is always embedded within social and economic conditions. Often we are just going to drift into certain relationships with technology.
So maybe we all need to find ways to become more empowered, and to ask questions about who technology is benefiting, who it is excluding and how it is recalibrating our relationships.
What do you think our relationship with technology will look like in 5/10/20 years and how much power do we have to influence this?
I think we do have the power to influence those relationships – but we have lots of different relationships in that space, right? Different people in society have probably always had different levels of access and benefit from technology – so I imagine those relationships will continue to be contested. But contestation is good – society is probably never going to sit still long enough to ‘solve’ problems like this. And maybe we don’t want it to?
Do you believe that individuals can make a difference?
100%. But I recognise it as a privilege that I know that this is true.
When did you first learn about killer robots, and what did you think?
I was quoted on this issue in the New Scientist, back in 2008 – saying that humans not sensors should be making targeting decisions. That quote has stood up pretty well – but it makes me feel a bit old! I had been working on new, sensor-based weapons and it felt like lines needed to be drawn to prevent human control from slipping away.
The 2020s have been a little rough so far. What gives you hope?
The answer to this question is always people isn’t it?! I have confidence in people.
Why are you part of the Stop Killer Robots movement?
All of the above!