How Israeli military’s AI is marking Gaza children, civilians for death

A new investigation says Israeli forces are using “undisclosed methods” to spy on Palestinian families in their homes, marking them for assassination and bombing them in real time.

Israeli commanders using Lavender in Gaza deemed families of individuals labeled as 'Hamas members' acceptable collateral damage. / Photo: AP

Israel’s invasive use of artificial intelligence in its military offensives in Palestine's Gaza has long been documented, with various reports shedding light on the array of software tools employed by its forces.

Among these, “Lavender” stands out as a key system for generating targeting information.

A recent investigation by The Washington Post has provided new insight into how the targeting system has operated, leading to tens of thousands of Gaza civilians being marked as Hamas fighters.

The report has revealed that the Israeli military captured “real-time photos” of individuals in their homes through “undisclosed methods” in order to expedite the identification of those flagged by Lavender.

Custom-built facial recognition tools were then employed to cross-reference these images with existing data on suspected Hamas members in the Lavender database.
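
In general face-recognition pipelines, “cross-referencing” of this kind means converting a captured face into a numerical embedding and comparing it against embeddings stored in a watchlist database. The Python sketch below illustrates only that generic pattern; every name, the database structure and the similarity threshold are hypothetical, since nothing about the actual tools has been disclosed:

```python
# A minimal, illustrative sketch of embedding-based face matching:
# a captured face is converted to a numerical vector (embedding) and
# compared against stored embeddings with cosine similarity. Names,
# structure and threshold are hypothetical; the actual system used
# in Gaza has not been disclosed.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_database(query_embedding: np.ndarray,
                           database: dict[str, np.ndarray],
                           threshold: float = 0.7) -> str | None:
    """Return the ID of the most similar stored embedding, or None
    if no similarity clears the (hypothetical) threshold."""
    best_id, best_score = None, threshold
    for person_id, stored_embedding in database.items():
        score = cosine_similarity(query_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```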

While the Israeli soldiers interviewed kept secret the specific technical details of how their forces spied into Palestinian homes to obtain these real-time images, the report indicates that the Israeli military typically utilises a combination of surveillance technologies, including drones and other sensors, to capture live imagery.

For instance, Israel has reportedly deployed quadcopter drones, small unmanned aerial vehicles, not only for surveillance but also for direct attacks on individuals.

Reports indicate that these drones have also been equipped to drop explosive devices or fire upon Palestinians at close range, leading to civilian casualties.

Read More: Quadcopter strikes: 1000 Palestinians killed by Israeli drones in one year

A soldier interviewed by the media outlet said that some of the captured images included family members of suspected fighters, who were nonetheless deemed acceptable collateral damage by Israeli commanders.

The investigation also details one instance of the software being used in airstrike decision-making, when Israeli forces deployed it to assess whether to proceed with the bombing of 50 buildings in northern Gaza.

The software worked by counting how many cellphones were active in the area and comparing that number to the total population estimate, the report says. If less than 25 percent of the population seemed to be there, the system gave the green light to go ahead with the bombing.
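
In effect, the rule described in the report reduces to a single ratio check. Below is a minimal sketch of that arithmetic in Python; the function name and example figures are hypothetical, a reconstruction of the reported logic rather than the actual software:

```python
# Illustrative reconstruction of the reported heuristic, not actual
# military code: compare the number of active cellphones against a
# population estimate and approve the strike if occupancy appears
# to be below 25 percent.
def appears_mostly_evacuated(active_phones: int,
                             estimated_population: int,
                             threshold: float = 0.25) -> bool:
    """Green-light rule as described in the report: proceed if the
    active-phone count suggests under 25 percent of residents remain."""
    if estimated_population <= 0:
        return False  # no usable estimate, no green light
    return (active_phones / estimated_population) < threshold

# Example: 60 active phones against an estimate of 300 residents
# gives a ratio of 0.2, below the threshold, so the rule approves.
print(appears_mostly_evacuated(60, 300))  # True
```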

Critical variables such as powered-off phones, children without phones, or areas with poor signal coverage were completely overlooked, the report quoted anonymous Israeli soldiers as saying.

Read More: Licence to abuse? The problem with Israel’s signing of global treaty on AI

Lavender’s scoring system, which estimates the likelihood that an individual is affiliated with Hamas, has drawn heavy backlash for its reliance on criteria as trivial as frequent address changes, switching cellphones, or sharing a WhatsApp group with a suspected Hamas member.
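
Based on the criteria reported, such a scoring system amounts to summing weak behavioural signals into a suspicion score. The sketch below is a hypothetical illustration of that logic; the feature names follow the reporting, but the weights, scale and structure are invented and do not represent Lavender’s actual model:

```python
# Hypothetical illustration of summing weak behavioural signals into
# an "affiliation" score. The features mirror the criteria cited in
# the reporting; the weights and the 0-100 scale are invented here
# purely to show how ordinary behaviour can inflate such a score.
WEIGHTS = {
    "frequent_address_changes": 30,
    "switched_cellphones": 30,
    "whatsapp_group_with_suspect": 40,
}

def suspicion_score(features: dict[str, bool]) -> int:
    """Sum the weights of whichever signals are flagged as present."""
    return sum(w for name, w in WEIGHTS.items() if features.get(name))

# Someone who moved house and happens to share a WhatsApp group with
# a suspect already scores 70 out of 100 from mundane behaviour alone.
print(suspicion_score({"frequent_address_changes": True,
                       "whatsapp_group_with_suspect": True}))  # 70
```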

According to earlier reports from +972 Magazine and Local Call, Lavender played a significant role in the unprecedented scale of airstrikes on Gaza, especially during the early phases of the conflict. The system generated a list of approximately 37,000 bombing targets, including numerous low-level alleged Hamas operatives who might not ordinarily be prioritised in military attacks.

