
Youth and killer robots

When young people come together, we can make the world listen.

Killer robots might seem like science fiction, but unfortunately they are the next big step in digital dehumanisation, cementing the biases and inequalities of previous generations. Killer robots change the relationship between people and technology by handing over life and death decision-making to machines. They challenge human control over the use of force, and where they target people, they dehumanise us – reducing us to data points.

Unless we act, future generations will have to live with the consequences of decisions that are being made right now – the development of new technologies with algorithmic bias, climate change, unjust conflicts, social and political inequity and unrest… The world we live in can be complex, but we have more power to influence events than we think.

 


There are many global and social issues that we will have to face in our lifetimes.

In the future, young people will have to confront the reality and consequences of decisions that are made today. Youth around the world are already working as agents of change, using their power and influence to bring about change and new hope.

We’ve grown up in a digital age, and we are the future: the coders, programmers, engineers, soldiers, diplomats, politicians, activists, organisers, and creatives who will have to deal with the realities of killer robots.

We have the power to shape the future of technology, of industry, and of politics – and to invest in our communities and in humanity more broadly. Each and every one of us has the opportunity to carry the message that killing shouldn’t be delegated to machines, and that each human life has value.


 

FAQs

What’s the problem with autonomous weapons?

There are ethical, moral, technical, legal, and security problems with autonomous weapons. Whether on the battlefield or at a protest, machines cannot make complex ethical choices – they cannot comprehend the value of human life. In the case of a mistake or an unlawful act, who would be accountable – the programmer, the manufacturer, the military commander, or the machine itself? This accountability gap would make it difficult to ensure justice, especially for victims. Other problems with killer robots include increasing digital dehumanisation, algorithmic biases, loss of meaningful human control, the lowering of the threshold to war, a potentially destabilising arms race, and a fundamental shift in our relationship with technology.

How would killer robots affect me?

If used, autonomous weapons would fundamentally shift the nature of how wars are fought. They would lead to more asymmetric war, and destabilise international peace and security by sparking a new arms race. They would also shift the burden of conflict further onto civilians. But the risks of killer robots don’t only threaten people in conflict. The use of these weapons within our societies more broadly could also have serious consequences. Think about future protests, border control, policing, and surveillance, or even about other types of technologies we use. What would it say about our society – and what impact would it have on the fight for ethical tech – if we let ultimate life and death decisions be made by machines? The emergence and consequences of autonomous weapons affect us all.

Don’t killer robots have advantages?

Some people say killer robots would be more accurate – that they would be quicker and more efficient than human soldiers, could go into places that are difficult for soldiers to operate in, could be more precise in targeting, save lives by reducing “boots on the ground”, and act as a deterrent. But similar things were said about landmines, cluster munitions, and nuclear weapons – indiscriminate weapons that killed and injured hundreds of thousands of people before being banned. Technologies that change their own behaviour or adapt their own programming independently can’t be used with real control. Other technologies can present a ‘black box’, where it is not possible to know why or how decisions are made. Under the law, military commanders must be able to judge the necessity and proportionality of an attack and to distinguish between civilians and legitimate military targets. This means not just understanding a weapon system, but also understanding the context in which it might be used. Machines don’t understand context or consequences: understanding is a human capability – and without that understanding, we lose moral responsibility and we undermine existing legal rules. The threats and risks of killer robots far outweigh any potential advantages.

Who wants to stop killer robots?

Around the world, momentum continues to build behind the call for limits on autonomy in weapon systems through a new international treaty. Killer robots are regarded as a major threat to humanity that requires a swift and strong multilateral response.

Over 230 NGOs support the movement to stop killer robots. Our calls for a treaty are shared by technical experts, world leaders, international institutions, parliamentary bodies, and political champions. Ninety states are now calling for new legal rules and limits on autonomous weapons systems. Hundreds of tech companies have pledged never to participate in or support the development, production, or use of autonomous weapon systems. Thousands of artificial intelligence and robotics experts have warned against these weapons and called on the United Nations to take action. There is also clear public concern: in IPSOS surveys released in 2019 and 2020, more than three in every five people stated their opposition to the development of weapons systems that would select and attack targets without human intervention.

UN Secretary-General Guterres has called autonomous weapons “morally repugnant and politically unacceptable”, and has made multiple statements since 2018 urging states to negotiate a treaty. The International Committee of the Red Cross has said that new law is needed to address autonomy in weapons, and has called for a treaty combining prohibitions and regulations. The European Parliament, Human Rights Council rapporteurs, and 26 Nobel Peace Laureates have also endorsed calls to prohibit and regulate autonomous weapons.

What are tech companies doing about killer robots?

There are concerns that tech companies, especially those working on military contracts, don’t have policies to make sure their work isn’t contributing to the development of autonomous weapons. A 2019 report from PAX named Microsoft and Amazon among the world’s ‘highest risk’ tech companies for involvement in the development of killer robots. In 2018, thousands of employees protested Google’s contract with the Pentagon on an initiative called Project Maven. Tech worker action resulted in Google not renewing the Project Maven contract and releasing a set of principles to guide its work in relation to artificial intelligence. In those ethical AI principles, Google committed not to “design or deploy artificial intelligence for use in weapons”.

Tech should be used to make the world a better place, and tech companies like Amazon, Google, Microsoft, Facebook, and others should commit publicly not to contribute to the development of autonomous weapons. Tech workers, roboticists, engineers, and researchers know this – which is why thousands of them have signed open letters and pledges calling for new international law to address autonomy in weapons and ensure meaningful human control over the use of force.

Could my university be contributing to the development of killer robots?

It’s possible. Many universities have research institutions working on artificial intelligence and machine learning. If you want to be certain, you can check whether your university has a clear ethical position or statement on killer robots, and whether it holds contracts with defence ministries or private companies to develop specific technologies. It is crucial for universities to be aware of how the technology they develop could be used in the future. The PAX report “Conflicted Intelligence” warns of the dangers of university AI research and partnerships, and outlines how universities can help prevent the development of autonomous weapons.

What can I do if my university is developing killer robot technology?

If it looks like your university is developing technologies related to killer robots, don’t panic! There is a way to act. In 2018, the Korean Advanced Institute of Science and Technology (KAIST) announced a collaboration with arms producer Hanwha Systems. The goal was to “co-develop artificial intelligence technologies to be applied to military weapons, joining the global competition to develop autonomous arms”. The announcement led to a boycott by professors and students worldwide, which eventually pushed the university to give public reassurances that it would not develop killer robots, and to implement a policy stating that “AI in any events should not injure people”. Hope comes from action. For more ideas on how to keep your university from developing autonomous weapons, check out the brilliant PAX universities Action Booklet.

How can I contribute to the movement?

If you’re reading this, then you are already contributing to the movement. Follow us on social media. Join our Youth Network. Teach the people around you – your friends, family, school – about killer robots. Share your thoughts and opinions online. Awareness is the first step towards change, and the further the message spreads, the more momentum we build.


Youth Network

Youth have a right and responsibility to participate in decisions that impact them. The Youth Network is a place for young leaders to bring their skills and strengths to the movement to stop killer robots. The Network connects young people from around the world, facilitates community and collaboration, and provides accessible resources and opportunities for all youth to secure a future without automated killing.
Join the Youth Network

Universities and killer robots

Universities play a big role in shaping society: they drive innovation and help train the next generation of leaders. Many important inventions used in everyday life, from seatbelts to touchscreens, come from university research – research that can have many positive impacts and applications.

However, not all innovations stemming from universities are beneficial. In many countries, the defence industry sponsors university research facilities, and students and professors could be working on a project that will be used to advance the development of autonomous weapons. This is why universities should publish clear policies clarifying where they draw the line between the technologies they will and will not develop.

Inform yourself, raise awareness, and take action.
Save your Uni from Killer Robots

Scouts and killer robots

In the summer of 2019, over 40,000 scouts from 169 nations attended the 24th World Scout Jamboree at the Summit Bechtel Scout Reserve in West Virginia, USA. For over 10 days, Scouts were brought together from the far reaches of the globe to promote peace, mutual understanding, leadership and life skills. Our team spoke to thousands of Scouts, who told us why they want to #KeepCtrl over the use of force.

In 2020, the World Organization of the Scout Movement – Interamerican Region and the Campaign to Stop Killer Robots formed a partnership to strengthen the capacity of children, adolescents, and adult Scout volunteers to apply the Sustainable Development Goals – particularly Goal 16, “Peace, Justice and Strong Institutions” – in order to mitigate the impact and threats of killer robots to the security and peace of the civilian population. The joint endeavour aims to improve Scouts’ knowledge of killer robots and of the global benefits of banning them, in support of global peace.
Scouts, welcome to #TeamHuman

Ypres Peace Prize

Every three years since 2002, school students across the Belgian city of Ypres have awarded the city’s International Peace Prize to an organisation, person, or initiative that has recently contributed to peace. The 2020 prize was awarded to the Campaign to Stop Killer Robots, and ninety percent of those who voted were aged 18 or younger.

This tells us that interest in killer robots, and in the future we are trying to secure, is growing across the world, especially among youth.

More about the Ypres Peace Prize


International Youth Day 2020

For International Youth Day 2020, our Hungarian Youth Activist Coordinator Illés Katona explains why youth should take action to stop killer robots.


Good Robots Gone Bad

Marta Kosmyna explores how great inventions can have unintended consequences. So how do we protect our robots from going over to the dark side? As students, you are the changemakers of today and the leaders of tomorrow. Let’s talk about the role of robotics in law enforcement and warfare, and how to build robots we can be proud of.


Stop Killer Robots


Keep up with the latest developments in the movement to Stop Killer Robots.

Join us