‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham, April 3, 2024, +972 Magazine
In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man whom we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that they would normally devote only about “20 seconds” to each target before authorizing a bombing — just to confirm that the Lavender-marked target was male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?”, also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their families’ residences.
The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The Lavender machine joins another AI system, “The Gospel,” about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”
The following investigation is organized according to the six chronological stages of the Israeli army’s highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the “Where’s Daddy?” system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how “dumb” bombs were chosen to strike these homes.
Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.
STEP 1: GENERATING TARGETS
‘Once you go automatic, target generation goes crazy’
[…] Once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. […]
STEP 2: LINKING TARGETS TO FAMILY HOMES
‘Most of the people you killed were women and children’
[…] In the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants while they were inside civilian households from which no military activity took place.
[…] These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several such tracking programs, revealed here for the first time, is called “Where’s Daddy?”
[…] Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities — 6,120 people — belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures.
[…] “In the end it was everyone [marked by Lavender],” one source explained. “Tens of thousands.” […] Lavender and systems like Where’s Daddy? were thus combined with deadly effect, killing entire families, sources said. […]
STEP 3: CHOOSING A WEAPON
‘We usually carried out the attacks with “dumb bombs”’
[…] In December 2023, CNN reported that according to U.S. intelligence estimates, about 45 percent of the munitions used by the Israeli air force in Gaza were “dumb” bombs, which are known to cause more collateral damage than guided bombs.
[…] “We usually carried out the attacks with ‘dumb bombs,’ and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don’t care — you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”
STEP 4: AUTHORIZING CIVILIAN CASUALTIES
‘We attacked almost without considering collateral damage’
[…] Sources told +972 and Local Call that now, partly due to American pressure, the Israeli army is no longer mass-generating junior human targets for bombing in civilian homes. The fact that most homes in the Gaza Strip had already been destroyed or damaged, and that almost the entire population had been displaced, also impaired the army’s ability to rely on intelligence databases and automated house-locating programs.
‘Entire families were killed’
Intelligence sources told +972 and Local Call they took part in even deadlier strikes. In order to assassinate Ayman Nofal, the commander of Hamas’ Central Gaza Brigade, a source said the army authorized the killing of approximately 300 civilians, destroying several buildings in airstrikes on Al-Bureij refugee camp on Oct. 17, based on an imprecise pinpointing of Nofal.
[…] Such a high rate of “collateral damage” is exceptional not only compared to what the Israeli army previously deemed acceptable, but also compared to the wars waged by the United States in Iraq, Syria, and Afghanistan. […]
STEP 5: CALCULATING COLLATERAL DAMAGE
‘The model was not connected to reality’
[…] According to the sources, “the collateral damage calculation was completely automatic and statistical” — even producing figures that were not whole numbers.
STEP 6: BOMBING A FAMILY HOME
‘You killed a family for no reason’
The sources who spoke to +972 and Local Call explained that there was sometimes a substantial gap between the moment that tracking systems like Where’s Daddy? alerted an officer that a target had entered their house, and the bombing itself — leading to the killing of whole families even without hitting the army’s target. “It happened to me many times that we attacked a house, but the person wasn’t even home,” one source said. “The result is that you killed a family for no reason.” […]

https://www.972mag.com/lavender-ai-israeli-army-gaza/