The Campaign To Stop Killer Robots
The use of machines to take human lives on the battlefield, in policing, at borders, and in other circumstances risks reinforcing and exacerbating violence and discrimination, because these machines rely on technologies that are biased. Weapons are tools of colonial and imperial power, fueling war and conflict that disproportionately affect visible minorities and marginalized communities.
A race-sensitive, intersectional approach to fully autonomous weapons considers the disproportionate impacts such weapons would have on marginalized and vulnerable groups across race, gender, ability, socioeconomic status, and other identities.

What makes killer robots racist?

Growing evidence shows that artificial intelligence (A.I.) and the other emerging technologies that could someday be used in killer robots are not neutral, and that racism operates at every stage of design, production, implementation, distribution, and regulation.
Historical racial and ethnic biases can be perpetuated by technologies like facial recognition, which draws on already biased training datasets that favour light-skinned and outwardly masculine faces over darker-skinned and outwardly feminine faces. As a result, the cycle of structural and institutional violence against those who lack power and privilege continues.
Predictive policing is a current example of racist technology that perpetuates structural and institutional violence. Crime statistics used to identify or predict likely “high crime” neighborhoods often target low-income areas, traditionally where Black and other communities of color and vulnerable populations live. This also feeds biased databases that rely on racial and ethnic profiling rather than objective evidence and individual behavior. Consequently, marginalized communities suffer a heavier police presence, more aggressive policing, and over-criminalization.
Combined with A.I., ingrained racial and ethnic biases will continue to reproduce these inequalities, and risk replicating and amplifying discriminatory practices and impacts at an unprecedented scale.
This further excludes and marginalizes groups, particularly Black people, who have historically suffered discrimination and oppression, and subjects them to additional physical and structural violence.
Long-standing biases built into algorithms, technologies, and A.I. pose an added ethical and human rights threat that disproportionately affects marginalized social groups. Automating violence risks an increase in biased killings and can amplify power disparities based on racial and other hierarchies, causing irreparable harm to targeted communities.

Why an intersectional approach?

Intersectionality is a framework for understanding how intersecting identities – gender, race, ethnicity, sexuality, religion, socioeconomic status, age, and ability – connect and reinforce distinct forms of oppression suffered by those from traditionally marginalized groups. For example, Black women face a distinct form of discrimination, combining racism and sexism, that denies them access to certain spaces or opportunities, or causes them to face more violence.
This is compounded when considering how other aspects of one’s identity, including ability or religion, affects the nature of oppression and discrimination that an individual faces.
By taking an intersectional approach to preventing killer robots – and working more widely for peaceful and inclusive security – we see how the combined experiences of discrimination by racialized and marginalized groups of people lead to compounded violence in war and conflict at the hands of groups that hold power and privilege. Historically, communities that are Black, Indigenous, non-white, and/or otherwise marginalized have suffered most from indiscriminate weapons, from landmines to cluster bombs to nuclear weapons.
Fully autonomous weapons would be no different. Identifying, acknowledging, and addressing the challenges that killer robots present to marginalized communities is fundamental to preventing cycles of violence.

Securing a future free from automated bias and killer robots

A future of peace and security cannot be realized without acknowledging the role that colonialism, white supremacy, systemic racism, and structures of oppression have played in the development of weapons. Historically, former colonial powers from North America and Western Europe have been the largest weapons producers, testing and deploying weapons on a wide scale in or on the territories of colonized countries and small island states across Africa, Asia-Pacific, Latin America, and the Middle East and North Africa. Fully autonomous weapons would only further entrench these structures and systems, disproportionately affecting marginalized and vulnerable groups of people based on their identities.
Without acknowledging and addressing systemic racism and the role of white supremacy in our institutions and movements, we cannot begin the important work of dismantling structures of oppression and racial and social injustice. Building a movement that champions inclusivity requires an intersectional perspective on race and other marginalized identities.
It is essential that the future treaty banning fully autonomous weapons take an intersectional approach, proactively addressing systemic racism and structures of oppression, and highlighting the importance of inclusion, visibility, and ownership.

The Campaign to Stop Killer Robots is committed to working for racial equity and acknowledging our own failures to address systemic racism. If you have questions, feedback, or suggestions about this work, please get in touch with our team at accountability@stopkillerrobots.org.


A comprehensive ban is possible

Over 90 countries are participating in UN talks on the concerns raised by fully autonomous weapons, also known as lethal autonomous weapons systems. While countries from North America and Europe have been slow to move forward on negotiations, a growing number of countries from Africa, Latin America, and the Middle East and North Africa are calling for a ban. Bold political leadership is urgently needed to launch negotiations on an international treaty to prohibit fully autonomous weapons and retain meaningful human control over the use of force.