Lavender: the First AI Genocide Device

“The machine did it coldly. And that made it easier.”

The technology being used in Gaza reveals a terrifying new convergence of artificial intelligence and drone warfare. It has the capacity for precision, but also for indiscriminate and horrific violence operated in the abstract. It conjures something of our worst dystopian imaginings, but this is grim reality, not sci-fi. The testimony revealing the use of the technology came from six intelligence officers who have been involved in using AI systems to identify Hamas targets in the killing spree. It was given to the journalist Yuval Abraham for a report published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call.

The operations of the World Central Kitchen in Gaza remain suspended after the killing of its workers, as do those of many other aid agencies, which cannot possibly work under such conditions. In a statement on Friday the Israeli military said a retired general’s investigation into the killings found the officers “mishandled critical information” and “violated the army’s rules of engagement.”

“The strike on the aid vehicles is a grave mistake stemming from a serious failure due to a mistaken identification, errors in decision-making, and an attack contrary to the standard operating procedures,” the statement said.

“The deployment of the so-called ‘Lavender System’, reminiscent of dystopian narratives like ‘The Terminator’, employs drone technology equipped with facial recognition to track individuals to their residences. Following identification, it orchestrates targeted bombings on buildings where suspected individuals, their families and extended kin are known to be residing. This systematic, indiscriminate extermination breaches international law regarding genocide and war crimes, as the system was preauthorised with the killing of civilians known to be inside homes, hospitals and places of worship: in fact, any building or shelter. Once the bombs fall on a building, those inside are killed not by the bomb itself but by the destruction of the building and the falling concrete, steel and debris, which become the graves of the hapless victims. This is therefore doubly indiscriminate, and a war crime: first the authorisation to kill innocent civilians, known from a database to be residents alongside the suspect, and then the destruction of dwellings and public places containing human life.”

“It is akin to multiple 9/11-scale strikes on all the buildings in Gaza. This explains why Gaza’s landscape has been reduced to nothingness, and why the Israelis have not permitted independent teams on the ground. This explains why journalists’ whole families were being wiped out. One thing that had perplexed me was why families were being wiped off the face of the earth: think of the famous Wael Dahdouh, who lost his entire bloodline in one incident, and the recent killing of the seven aid workers of World Central Kitchen (WCK). The inherent capability of these drones to pinpoint civilian locations with precision implies a chilling reality: if the military has the means to identify and locate civilians, it thereby possesses the capacity to apprehend them directly, should suspicions warrant. The choice to bypass such direct interventions in favor of aerial bombings, resulting in widespread destruction of infrastructure and significant civilian deaths, is a deliberate strategy of indiscriminate violence. The database underpinning this system, cataloging detailed demographics of each household, implicates the military command and its top leadership in targeted killings. The explicit knowledge that women, children and the elderly would be among those targeted reveals an intentional disregard for civilian life. Such actions unequivocally constitute war crimes, and it is incumbent upon the community of international legal experts to interpret the law that criminalises this dystopian behaviour and the ‘Where’s Daddy?’ killing program. The instigators of this program, including any commanding officials up to and including Netanyahu, must be held accountable before the International Criminal Court. The principles of justice and international humanitarian law demand no less.”

Unimportant People

This is not about ‘precision’ bombing though. In fact, as Yuval Abraham explains: “When it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers.”

What we are witnessing here, and really beginning to understand, is the dystopian nature of this warfare, in which human decisions (formerly known as morality) are essentially removed from the ‘process’ of killing. Yuval Abraham again:

“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.”

“The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

There has been, it’s true, drone warfare for years. Think of the horrific bombings exposed by WikiLeaks. The difference here is the selection criteria, and the consequences that flow from them. Essentially, after October 7, the Israeli army simply became unhinged. The concept of proportionality was discarded and carnage ensued.

Yuval Abraham writes: “In the Israeli army, the term “human target” referred in the past to a senior military operative who, according to the rules of the military’s International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel’s previous wars, since this was an “especially brutal” way to kill someone — often by killing an entire family alongside the target — such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.”

But after October 7 the Israeli army, according to the intelligence sources, took a dramatically different approach. Under “Operation Iron Swords,” the army designated ALL operatives of Hamas’ military wing as human targets, regardless of their rank or military importance:

“The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy “incrimination” process: cross-check evidence that the person was indeed a senior member of Hamas’ military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.”

“However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender — which was developed to create human targets in the current war — has marked some 37,000 Palestinians as suspected “Hamas militants.”

This is then a self-fulfilling prophecy.

We thought we would have to wait years to see what awful turn our use of AI might reveal, but here it is right before us.


You can read the full report by Yuval Abraham here.


Comments (6)


  1. Cathie Lloyd says:

    Words fail. Can the international community make anything from the profound shock?

  2. SleepingDog says:

Real hate crimes, unpeople, the long historical trend of perpetrators of genocide working out ways of increasing efficiency (smallpox-infected charity blankets are slow compared to recent developments). The vast number of criminals worldwide complicit in these atrocities.

    Israel played their first game (against Slovakia) apparently as if none of this was happening. Sports boycott seems the least of what we should be doing.

I’ve been reading philosopher Susan Neiman on how to reckon with one’s own nation’s history of evildoing (the British have hardly started down that road). Is it going to take decades of hard work and older generations dying off before modern Israelis learn to take such responsibility for this genocide?

    1. SleepingDog says:

As Media Lens points out, some of those complicit criminals enabling genocide are in the corporate media, who have systematically failed to address the Israeli intent behind such systemised slaughter, maiming, terror, persecution, torture, incarceration and starvation, or to expose their lies.

  3. Dougie Blackwood says:

Gut-wrenching stuff. I found this horrifying not only because of what’s happening in Gaza but because I’m sure the arms industry will already be hard at work making this casual killing even more effective. The world of “Big Brother” is with us here and now.

  4. Alasdair Macdonald says:

My father was in the army in North Africa for almost the entire Second World War period. He was an anti-aircraft gunner. In the course of the war he actually met German and Italian soldiers face-to-face, as well as seeing bodies of soldiers of many nations and civilians. Like most old soldiers I knew, he did not talk about combat. He talked about the camaraderie and the things he saw, like The Pyramids. When I asked him if he had ever shot anyone, he said, “I hope I didn’t”.

    A number of studies during World War 1 indicated that most soldiers in the trenches fired to miss.

    After the war, he, like others who had served as ground troops, bore no hatred towards Germans and Italians – “they are people like us, I have spoken to them”. Hostility towards Germans and others was stronger in people like my mother who had spent the war at home and experienced the Blitz. Ex-servicemen of the RAF and Navy tended to display more anti-German feelings. I think this was largely because, they did not encounter Germans and Italians face-to-face and see their common humanity.

I think that this is what the increasing mechanisation of warfare with AI-controlled drones does. Those deploying them are not actually experiencing the humanity of those being attacked.

    1. John says:

It’s beginning to resemble the Star Trek episode ‘A Taste of Armageddon’, where wars were not actually fought but civilians were still sent to their deaths.
