The Fate of Hundreds of Thousands of Civilians in Gaza depends on Artificial Intelligence – Sarajevo Times

Israeli sources say Israel was prepared to risk up to 20 civilian casualties for each of the roughly 37,000 suspects identified as human targets by an artificial intelligence program called Lavender, used in attacks on the blockaded Gaza Strip.

Sources from Tel Aviv told the media outlets +972 Magazine and Local Call that Lavender analyzed data on about 2.3 million people in Gaza according to unclear criteria and assessed whether each person had ties to Hamas.

A total of six sources stated that the Israeli army relied fully on the program, especially in the early stages of the war, and that the names identified by Lavender were approved as targets without oversight and without applying any particular criteria beyond the fact that they were men.

37,000 Palestinians marked as suspects

Sources who spoke to +972 said that the concept of a military target, which permits killing a person on private property even if there are civilians in the building and its surroundings, previously covered only high-level military figures, and that after October 7 the concept was extended to all members of Hamas.

The enormous increase in the number of targets created the need for artificial intelligence, since examining and checking each target individually by humans was no longer possible; the sources also state that the system marked close to 37,000 Palestinians as suspects.

The sources said that Lavender was very effective at classifying Palestinians and that the process was fully automated.

"We killed thousands of people. We automated everything and did not check each target separately. We bombed the targets as soon as they entered their homes," the source said, confirming that human oversight of targeting had been eliminated.

One source's comment that he found it very surprising to be asked to bomb a house in order to kill an unimportant person amounts to an acknowledgment of the Israeli massacre of civilians in Gaza.

Green light for high-level targets with up to 100 civilian casualties

The sources stated that up to 20 civilian casualties were permitted in actions carried out against lower-ranking targets, that this number changed frequently during the campaign, and that the principle of proportionality was not applied.

On the other hand, it was stated that the number of possible collateral civilian casualties increased to 100 for high-level targets.

While the sources said they were ordered to bomb everything they could, one of them said that hysteria gripped senior officials, whose only response was to "bomb like crazy" to limit Hamas's capabilities.

A senior soldier identified by the initial B., who used the Lavender program, said that its margin of error is about ten percent and that there was no need for humans to review targets and waste time on it.

B. stated that there were fewer marked targets at the beginning, but that as the definition of a Hamas member was broadened, the practice expanded and the number of targets grew. He added that members of the police and civil defense who may have helped Hamas, but who posed no threat to the Israeli army, were also targeted.

"The system has many shortcomings. If a target handed their phone to another person, that person was bombed at home with their entire family. This happened very often. It was one of the most common mistakes Lavender made," said B.

Most of those killed are women and children

The same sources said that software called Where's Daddy? tracks thousands of people at a time and notifies Israeli forces when they enter their homes. Attacks are also carried out on the basis of this program's data.

"Let's say you calculate that there is one Hamas member and ten civilians in the house; usually those ten people are women and children. So, absurdly, most of the people you kill are women and children," said one of the sources.

Unguided bombs are used to save money

Sources also said that many civilians were killed because less important targets were hit with ordinary, cheaper unguided munitions instead of precision-guided "smart" munitions.

"We usually carried out the attacks with unguided munitions, which meant literally destroying the entire house along with everything in it. The system kept adding new targets," one of the sources said.

Artificial intelligence is not used to reduce civilian casualties, but to find more targets

Speaking to Al Jazeera on the subject, Marc Owen Jones, a professor of Middle East Studies and Digital Humanities at Hamad Bin Khalifa University in Qatar, said it was increasingly clear that Israel was using unproven artificial intelligence systems, which had not undergone transparent evaluation, to help make decisions about the lives of civilians.

Jones believes that Israeli officials activated an artificial intelligence system to select targets to avoid moral responsibility.

He emphasized that the purpose of using the program is not to reduce civilian casualties, but to find more targets.

"Even the officials who run the system see AI as a killing machine. It is unlikely that Israel will stop using artificial intelligence in its attacks unless its allies put pressure on it. The situation in Gaza is genocide supported by artificial intelligence. A call for a moratorium on the use of artificial intelligence in warfare is needed," Jones concluded.

Habsora

Another investigation, published on December 1, 2023, revealed that an artificial intelligence application called Habsora (the Gospel), which the Israeli military also used to identify targets in its attacks on the Gaza Strip, was used to deliberately strike civilian infrastructure and in attacks on automatically generated targets. In those cases, the number of civilians expected to die along with the target was known in advance.

Habsora is an artificial intelligence technology used by Israel to target buildings and infrastructure, while Lavender is used to target people, Anadolu Agency (AA) writes.

