The report states that as many as 37,000 Palestinians were designated as suspected militants and marked as potential targets. Lavender’s kill lists were prepared in advance of the invasion, which was launched in response to the Hamas attack of October 7, 2023, in which about 1,200 people in Israel were killed and about 250 taken hostage. A related AI program, which tracked the movements of individuals on the Lavender list, was called “Where’s Daddy?” Sources for the +972 Magazine report said that initially there was “no requirement to thoroughly check why the machine made those choices [of targets] or to examine the raw intelligence data on which they were based.” The officials in charge, these sources said, acted as a “rubber stamp” for the machine’s decisions before authorizing a bombing. One intelligence officer who spoke to +972 admitted as much: “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”
It was already known that the Lavender program made errors in about 10 percent of cases, meaning that roughly one in ten of the individuals it selected as targets might have had no connection to Hamas or any other militant group. The strikes generally occurred at night, when the targeted individuals were more likely to be at home, which posed the risk of killing or wounding their families as well.
Lavender assigned each individual a score from 1 to 100 based on how closely he was linked to the armed wing of Hamas or Islamic Jihad. Those with a high score were killed along with their families and neighbors, even though officers reportedly did little to verify the targets Lavender identified, citing “efficiency.” “This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that his colleagues had more faith in a “statistical mechanism” than in a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
The IDF had previously used another AI system, “The Gospel,” described in an earlier investigation by the magazine as well as in the Israeli military’s own publications, to target buildings and structures suspected of harboring militants. “The Gospel” draws on millions of items of data to produce target lists more than 50 times faster than a team of human intelligence officers could. It was used to strike 100 targets a day in the first two months of the Gaza fighting, roughly five times as many as in a similar conflict there a decade ago. Structures of political or military significance to Hamas are known as “power targets.”