Palestinians search for missing people and victims under the rubble of a home destroyed after an Israeli airstrike in al-Maghazi refugee camp, southern Gaza. Photograph: Mohammed Saber/EPA

‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets


Israeli intelligence sources reveal use of ‘Lavender’ system in Gaza war and claim permission given to kill civilians in pursuit of low-ranking militants

The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.

Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines.

“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”

Palestinian children salvage items amid the destruction caused by Israeli bombing in Bureij, central Gaza, on 14 March. Photograph: AFP/Getty Images

The testimony from the six intelligence officers, all of whom have been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham for a report published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call.

Their accounts were shared exclusively with the Guardian in advance of publication. All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential “junior” operatives to target. Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or PIJ.

Lavender was developed by the Israel Defense Forces’ elite intelligence division, Unit 8200, which is comparable to the US’s National Security Agency or GCHQ in the UK.

Several of the sources described how, for certain categories of targets, the IDF applied pre-authorised allowances for the estimated number of civilians who could be killed before a strike was authorised.

Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.


“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs],” one intelligence officer said. Another said the principal question they were faced with was whether the “collateral damage” to civilians allowed for an attack.

“Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

According to conflict experts, if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked, with the assistance of AI, to militant groups in Gaza, that could help explain the shockingly high death toll in the war.

The health ministry in the Hamas-run territory says 33,000 Palestinians have been killed in the conflict in the past six months. UN data shows that in the first month of the war alone, 1,340 families suffered multiple losses, with 312 families losing more than 10 members.

Israeli soldiers stand on the Israeli side of the Israel-Gaza border surveying the Palestinian territory on 30 March. Photograph: Amir Cohen/Reuters

Responding to the publication of the testimonies in +972 and Local Call, the IDF said in a statement that its operations were carried out in accordance with the rules of proportionality under international law. It said dumb bombs are “standard weaponry” that are used by IDF pilots in a manner that ensures “a high level of precision”.

The statement described Lavender as a database used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations. This is not a list of confirmed military operatives eligible to attack.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”

Lavender created a database of tens of thousands of individuals

In earlier military operations conducted by the IDF, producing human targets was often a more labour-intensive process. Multiple sources who described target development in previous wars to the Guardian said the decision to “incriminate” an individual, or identify them as a legitimate target, would be discussed and then signed off by a legal adviser.

In the weeks and months after 7 October, this model for approving strikes on human targets was dramatically accelerated, according to the sources. As the IDF’s bombardment of Gaza intensified, they said, commanders demanded a continuous pipeline of targets.

“We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us,” said one intelligence officer. “We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb.”

To meet this demand, the IDF came to rely heavily on Lavender to generate a database of individuals judged to have the characteristics of a PIJ or Hamas militant.

Details about the specific kinds of data used to train Lavender’s algorithm, or how the programme reached its conclusions, are not included in the accounts published by +972 or Local Call. However, the sources said that during the first few weeks of the war, Unit 8200 refined Lavender’s algorithm and tweaked its search parameters.

After randomly sampling and cross-checking its predictions, the unit concluded Lavender had achieved a 90% accuracy rate, the sources said, leading the IDF to approve its sweeping use as a target recommendation tool.
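The report gives no detail of how this spot-check was carried out. Purely as an illustrative sketch, the procedure the sources describe, drawing a random sample of a system’s outputs and having humans verify each one, amounts to something like the following; the function names and the verification callback are hypothetical, and the list of flagged IDs is assumed to be non-empty.

```python
import random

def estimate_accuracy(flagged_ids, human_verify, sample_size=100):
    """Spot-check a system's output: draw a random sample of the
    IDs it flagged and count how many a human reviewer confirms."""
    sample = random.sample(list(flagged_ids), min(sample_size, len(flagged_ids)))
    confirmed = sum(1 for person_id in sample if human_verify(person_id))
    return confirmed / len(sample)

# Hypothetical use: `analyst_review` stands in for a manual check.
# rate = estimate_accuracy(model_outputs, analyst_review)
```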

Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas’s military wing, they added. This was used alongside another AI-based decision support system, called the Gospel, which recommended buildings and structures as targets rather than individuals.

Two Israeli air force F15 fighter jets near the city of Gedera, southern Israel, on 27 March. Photograph: Abir Sultan/EPA

The accounts include first-hand testimony of how intelligence officers worked with Lavender and how the reach of its dragnet could be adjusted. “At its peak, the system managed to generate 37,000 people as potential human targets,” one of the sources said. “But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is.”

They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”
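Nothing about Lavender’s internals is public, but the effect the source describes is a generic property of score-based classifiers: moving the cut-off score down widens the flagged list, and moving it up narrows it. A minimal sketch with invented identifiers and scores:

```python
def flag_targets(scores, threshold):
    """Return every ID whose score clears the threshold; lowering
    the threshold ('the bar') widens the list, raising it narrows it."""
    return [pid for pid, score in scores.items() if score >= threshold]

# Invented scores between 0 and 1 for four hypothetical records:
scores = {"id_1": 0.95, "id_2": 0.72, "id_3": 0.55, "id_4": 0.31}
print(flag_targets(scores, 0.9))   # ['id_1']
print(flag_targets(scores, 0.5))   # ['id_1', 'id_2', 'id_3']
```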

Before the war, the US and Israel estimated the membership of Hamas’s military wing at approximately 25,000-30,000 people.

In the weeks after the Hamas-led 7 October assault on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men linked to Hamas’s military wing as potential targets, regardless of their rank or importance.

The IDF’s targeting processes in the most intensive phase of the bombardment were also relaxed, they said. “There was a completely permissive policy regarding the casualties of [bombing] operations,” one source said. “A policy so permissive that in my opinion it had an element of revenge.”

Another source, who justified the use of Lavender to help identify low-ranking targets, said that “when it comes to a junior militant, you don’t want to invest manpower and time in it”. They said that in wartime there was insufficient time to carefully “incriminate every target”.

“So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it,” they added.

‘It’s much easier to bomb a family’s home’

The testimonies published by +972 and Local Call may explain how a western military with such advanced capabilities, with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.

When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Relatives outside the morgue of the al-Najjar hospital in Rafah mourn Palestinians killed in Israeli bombings on 1 February. Photograph: Mohammed Abed/AFP/Getty Images

Such a strategy risked higher numbers of civilian casualties, and the sources said the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant. The ratio was said to have changed over time, and varied according to the seniority of the target.

According to +972 and Local Call, the IDF judged it permissible to kill more than 100 civilians in attacks on top-ranking Hamas officials. “We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on,” one source said.

“There were regulations, but they were just very lenient,” another added. “We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before.” There appear to have been significant fluctuations in the figure that military commanders would tolerate at different stages of the war.

One source said that the limit on permitted civilian casualties “went up and down” over time, and at one point was as low as five. During the first week of the conflict, the source said, permission was given to kill 15 non-combatants to take out junior militants in Gaza. However, they said estimates of civilian casualties were imprecise, as it was not possible to know definitively how many people were in a building.

Another intelligence officer said that more recently in the conflict, the rate of permitted collateral damage was brought down again. But at one stage earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.

“It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” they said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’ … In practice, the proportionality criterion did not exist.”
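What the sources describe is a fixed cap keyed to a category of target, rather than a judgment made strike by strike. As a purely hypothetical sketch of that described logic (every category name and figure below is invented for illustration; the article reports caps of 15-20 for junior operatives and more than 100 for senior commanders, varying over time):

```python
# Hypothetical caps illustrating the pre-authorised-ratio logic the
# sources describe; these are not documented IDF parameters.
PREAUTHORISED_CAPS = {
    "junior_operative": 15,
    "battalion_commander": 50,
    "brigade_commander": 100,
}

def strike_cleared(target_rank, estimated_civilian_deaths):
    """The described logic: compare a civilian-casualty estimate
    against a fixed, pre-set cap for the target's category."""
    return estimated_civilian_deaths <= PREAUTHORISED_CAPS[target_rank]
```

A fixed table of this kind is precisely what the legal experts quoted below object to: a pre-set per-category number is not the strike-by-strike proportionality assessment international law requires.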

The IDF statement said its procedures “require conducting an individual assessment of the anticipated military advantage and collateral damage expected … The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.” It added: “The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”

Experts in international humanitarian law who spoke to the Guardian expressed alarm at accounts of the IDF accepting and pre-authorising collateral damage ratios as high as 20 civilians, particularly for lower-ranking militants. They said militaries must assess proportionality for each individual strike.

Smoke rises over the Gaza Strip, as seen from the Israeli side of the border on 21 January. Photograph: Amir Levy/Getty Images

An international law expert at the US state department said they had “never remotely heard of a one-to-15 ratio being deemed acceptable, especially for lower-level combatants. There’s a lot of leeway, but that strikes me as extreme”.

Sarah Harrison, a former lawyer at the US Department of Defense, now an analyst at Crisis Group, said: “While there may be certain occasions where 15 collateral civilian deaths could be proportionate, there are other times where it definitely wouldn’t be. You can’t just set a tolerable number for a category of targets and say that it’ll be lawfully proportionate in each case.”

Whatever the legal or moral justification for Israel’s bombing strategy, some of its intelligence officers appear now to be questioning the approach set by their commanders. “No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza,” one said.

Another said that after the 7 October attacks by Hamas, the atmosphere in the IDF was “painful and vindictive”. “There was a dissonance: on the one hand, people here were frustrated that we were not attacking enough. On the other hand, you see at the end of the day that another thousand Gazans have died, most of them civilians.”

