
‘AI-assisted genocide’: Israel reportedly used database for Gaza kill lists | Israel-Palestine conflict News | Al Jazeera


‘AI-assisted genocide’: Israel reportedly used database for Gaza kill lists
Two Israeli media outlets report the Israeli military’s use of an AI-assisted system called Lavender to identify Gaza targets.

[Video 00:54 — WCK aid worker killings are a ‘message to the world’: Palestinians carry the body of a World Central Kitchen worker at Al Aqsa hospital in Deir al-Balah, Gaza Strip]

Published On 4 Apr 2024
The Israeli military’s reported use of an untested and undisclosed artificial intelligence-powered database to identify targets for its bombing campaign in Gaza has alarmed human rights and technology experts who said it could amount to “war crimes”.

The Israeli-Palestinian publication +972 Magazine and Hebrew-language media outlet Local Call reported recently that the Israeli army was isolating and identifying thousands of Palestinians as potential bombing targets using an AI-assisted targeting system called Lavender.



“That database is responsible for drawing up kill lists of as many as 37,000 targets,” Al Jazeera’s Rory Challands, reporting from occupied East Jerusalem, said on Thursday.

The unnamed Israeli intelligence officials who spoke to the media outlets said Lavender had an error rate of about 10 percent. “But that didn’t stop the Israelis from using it to fast-track the identification of often low-level Hamas operatives in Gaza and bombing them,” Challands said.



It is becoming clear the Israeli army is “deploying untested AI systems … to help make decisions about the life and death of civilians”, Marc Owen Jones, an assistant professor in Middle East Studies and digital humanities at Hamid Bin Khalifa University, told Al Jazeera.

“Let’s be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war,” he added.

The Israeli publications reported that this method led to many of the thousands of civilian deaths in Gaza.


On Thursday, Gaza’s Ministry of Health said at least 33,037 Palestinians have been killed and 75,668 wounded in Israeli attacks since October 7.


Play Video
6:08
Now Playing
06:08
Israel’s AI tactics, resulting in high civilian casualties, being exported abroad: Analysis
Israel’s AI tactics, resulting in high civilian casualties, being exported abroad: Analysis
Next
01:46
Trump posts Iran bridge strike video, calls for deal and warns of escalation
Trump posts Iran bridge strike video, calls for deal and warns of escalation
Show more videos

AI use ‘violates’ humanitarian law
“The humans that were interacting with the AI database were often just a rubber stamp. They would scrutinise this kill list for perhaps 20 seconds before deciding whether or not to give the go-ahead for an air strike,” Challands reported.

In response to widening criticism, the Israeli military said its analysts must conduct “independent examinations” to verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated by its forces.

It disputed the characterisation of the technology as a “system”, describing it instead as “simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations”.

But the fact that there were “five to 10 acceptable civilian deaths” for every single Palestinian fighter who was an intended target shows why there are so many civilian deaths in Gaza, according to Challands.



Professor Toby Walsh, an AI expert at the University of New South Wales in Sydney, said legal scholars will likely argue that the use of AI targeting violates international humanitarian law.

“From a technical perspective, this latest news shows how hard it is to keep a human in the loop, providing meaningful oversight to AI systems that scale warfare terribly and tragically,” he told Al Jazeera.

‘War crimes’
The media outlets cited sources who said the Israeli army made decisions during the first weeks of the current conflict that “for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians”.



The sources also said that if senior Hamas officials were the target then “the army on several occasions authorised the killing of more than 100 civilians in the assassination of a single commander”.

Ben Saul, the United Nations special rapporteur on human rights and counterterrorism, said if details in the report prove to be true, “many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks”.


“Israel is currently trying to sell these tools to foreign entities, to governments that are looking to what Israel’s doing in Gaza, not with disgust, but actually with admiration,” said Antony Loewenstein, an Australian journalist and author of The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World.


“We’ll find out in the coming months and years who they may be … my sense is it’s gonna be countries that are currently saying they’re opposed to what Israel is doing.”


Play Video
28:15
Is attacking aid convoys an Israeli tactic in its genocidal war?
