
Use of Lavender data processing system in Gaza

Photo by Mohammed Ibrahim 

Stop Killer Robots continues to find reports of Israeli use of target recommendation systems in the Gaza Strip deeply concerning from a legal, moral and humanitarian perspective. Although the Lavender system, like the Habsora/Gospel system, is not an autonomous weapon, both raise serious concerns over the increasing use of artificial intelligence in conflict, automation bias, digital dehumanisation, and loss of human control in the use of force.

Reports that Lavender has been used by the IDF to generate human targets are deeply troubling. The system reportedly makes targeting recommendations based on behavioural “features”, including communications patterns, social media connections, and frequent changes of address. According to a +972 source, “An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.”

Lavender is a data processing system, not an autonomous weapon, and the decision whether or not to strike a recommended target is made and carried out separately by a human, not by a machine. However, the use of this system demonstrates key issues with autonomous weapons, namely digital dehumanisation and loss of meaningful human control.

Protection of civilians is a core tenet of International Humanitarian Law (IHL), and in cases of doubt concerning an individual’s status, they must be considered a civilian. The ranking of the general population in Gaza as potential targets based on behavioural profiles, which reportedly have an error margin of 10%, reduces human beings to data points and raises grave concerns around compliance with IHL, the violation of human dignity, and digital dehumanisation.

Also concerning are reports of “sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based”. This lack of meaningful human control and engagement in decisions to attack Lavender’s recommended targets demonstrates a worrying trend of increasing reliance on automated systems.

Stop Killer Robots believes that technology should be developed and used to promote peace, justice, human rights, equality and respect for law – not for autonomous killing or for the further entrenchment or reproduction of inequality and oppression. 

The UN Secretary-General, the International Committee of the Red Cross, and more than 100 countries have called for a legal instrument on autonomous weapons systems. The increasing use of systems with concerning levels of autonomy demonstrates the urgent need for clear prohibitions and regulations on autonomy in weapons systems in international law.

Requests for comment or interviews should be directed to [email protected] 

