UN chief ‘deeply troubled’ by reports Israel using AI to identify Gaza targets


Large swathes of Gaza have been ravaged by the latest war  © MOHAMMED ABED / AFP

United Nations (United States) (AFP) – UN Secretary-General Antonio Guterres on Friday expressed serious concern over reports that Israel was using artificial intelligence to identify targets in Gaza, resulting in many civilian deaths. 

According to a report in independent Israeli-Palestinian magazine +972, Israel has used AI to identify targets in Gaza — in some cases with as little as 20 seconds of human oversight.

Guterres said that he was “deeply troubled by reports that the Israeli military’s bombing campaign includes Artificial Intelligence as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties.” 

“No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms,” he said.

The +972 report claims that “the Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties.”

The report said that, according to “six Israeli intelligence officers”, a system called Lavender had “played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.”


“According to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine ‘as if it were a human decision’,” +972 reported. 

Two sources said “the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians”.

If “the target was a senior Hamas official… the army on several occasions authorized the killing of more than 100 civilians,” it added.

‘Collateral damage’

The Israeli army, known as the IDF, on Friday rejected the claims.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it said. 

Instead it has a “database whose purpose is to cross-reference intelligence sources… on the military operatives of terrorist organizations” to be used as a tool for analysts, it added. 

“The IDF does not carry out strikes when the expected collateral damage from the strike is excessive,” it said, using a term that includes civilian casualties. 

The deadliest-ever Gaza war erupted when Palestinian militant group Hamas carried out an unprecedented attack against Israel on October 7, resulting in the deaths of 1,170 Israelis and foreigners, most of them civilians, according to an AFP tally of official Israeli figures.

Palestinian militants took more than 250 hostages on October 7, of whom 130 remain in Gaza, including 34 who the army says are dead.

Israel’s retaliatory campaign in the Gaza Strip has killed at least 33,091 people, mostly women and children, according to the health ministry in the Hamas-run Palestinian territory.

The United Nations has warned of imminent famine in the besieged territory.

‘Mass assassination factory’

Israel began promoting its AI-powered targeting after an 11-day conflict in Gaza in May 2021, which commanders branded the world’s “first AI war”.

The military chief during the 2021 war, Aviv Kochavi, told Israeli news website Ynet last year that the force had used AI systems to identify “100 new targets every day”, compared with the 50 targets a year it had previously produced.

Weeks into the latest Gaza war, a blog entry on the Israeli military’s website said its AI-enhanced “targeting directorate” had identified more than 12,000 targets in just 27 days. 

An unnamed Israeli official was quoted as saying the AI system, called Gospel, produced targets “for precise attacks on infrastructure associated with Hamas, inflicting great damage on the enemy and minimal harm to those not involved”. 

But an anonymous former Israeli intelligence officer, quoted in November by +972, described Gospel’s work as creating a “mass assassination factory”. 

In a rare admission of wrongdoing, Israel on Friday acknowledged a series of errors and violations of its rules in the killing of seven aid workers in Gaza, saying it had mistakenly believed it was “targeting armed Hamas operatives”.

‘War crimes’

Alessandro Accorsi, a senior analyst at Crisis Group, said the +972 report was “very concerning”.

“It feels very apocalyptic. It’s clear… the degree of human control is very low,” he told AFP.

“There are a thousand questions around this obviously — how moral it is to use it — but it is hardly surprising it is used,” he said.

Johann Soufi, a human rights lawyer and former director of the UN Palestinian refugee agency UNRWA’s legal office in Gaza, said the +972 article described methods that were “undeniably war crimes”.

They were “likely crimes against humanity” in view of the high civilian casualties, he added on X, formerly Twitter.


FRANCE24/AFP

Issued on: 05/04/2024 – 19:57
