Israeli Military AI: An Ethical Dilemma in Gaza Operations

In Gaza, Israeli military operations have integrated advanced AI systems, fundamentally altering the conflict landscape. Technologies like Gospel and Lavender enhance target identification and streamline military decision-making processes. These advancements aim to improve efficiency and precision in military actions.

However, the use of these AI systems raises significant ethical concerns, particularly about civilian casualties and reduced human oversight. This article explores their impact on military strategy, civilian populations, and international perceptions. It delves into the complex intersection of technology, warfare, and ethics, highlighting tactical shifts, policy adjustments, and the human toll of modern conflicts.

Screenshot of Israeli military AI systems. Credit: IDF

1. Utilizing Gospel for Target Identification 

During its operations in Gaza, the Israeli military has utilized an AI system named the Gospel to identify targets. These have reportedly included schools, medical facilities, places of worship, and the offices of aid organizations.

Hamas officials have reported over 30,000 Palestinian casualties, many of whom were women and children. The Gospel system employs machine learning technologies to sift through extensive data to pinpoint possible targets. 

Israeli forces claim that the Gospel not only enhances target accuracy but also accelerates the target generation process through automation. According to an Israeli military statement, over 12,000 targets were struck in the initial 27 days of the conflict.

Israel’s investment in military technology extends to advanced anti-rocket systems and surveillance capabilities. However, it is unclear if these funds are specifically allocated to the Gospel system. 

Additionally, Israel has developed AI-enhanced precision assault rifle sights, like the SMASH from Smart Shooter. These sights feature sophisticated image-processing algorithms to accurately identify targets. These rifle sights have been deployed in both Gaza and the occupied West Bank. [source]

2. The Impact of Lavender: An Israeli Military AI System

The Lavender AI system has significantly influenced military operations against Palestinians, particularly during the initial phases of the conflict. It was used to generate extensive “kill lists” of individuals affiliated with the military wings of Hamas and Palestinian Islamic Jihad (PIJ).

During these early stages, military officers were permitted to use these lists with minimal oversight. They were often not required to scrutinize the rationale behind the selections or the underlying intelligence data. Consequently, human oversight was minimal, typically amounting to a brief confirmation, lasting about 20 seconds, that the target was male.

The system had an error rate of approximately 10%, sometimes misidentifying individuals with minor or no connections to militant groups as targets. Decision-makers often accepted these determinations without additional verification.

The Israeli military conducted systematic attacks on targeted individuals within their homes. These attacks occurred predominantly at night, when family members were likely present, rather than in military settings. Automated tracking systems, including one newly revealed called “Where’s Daddy?”, facilitated this approach by locating individuals within their family residences and marking those homes for bombing. [source]

3. Meta, SpaceX, and Israel

3.1 WhatsApp and Lavender

Paul Biggar, a respected software engineer and founder of Tech For Palestine, sheds light on Lavender’s methodologies. He asserts that Lavender likely draws its intelligence from the digital trails left within WhatsApp groups, and that these trails form a critical component of target identification in Gaza. This assessment underscores the profound impact of data mining and social media on contemporary military operations.

Reports unveil a disturbing dimension of Lavender’s functionality: an inclination towards “pre-crime” tactics, in which association with suspected militants through WhatsApp groups becomes grounds for targeting individuals. Metadata extracted from WhatsApp, including group memberships, serves as fodder for Lavender’s algorithmic decision-making apparatus.

Moreover, Meta, the parent company of WhatsApp, finds itself embroiled in this controversy. Indeed, Meta is accused of enabling the transfer of data to Lavender. Notably, members of Meta’s leadership, including Chief Information Security Officer Guy Rosen and CEO Mark Zuckerberg, maintain close ties with Israel, and Rosen’s background in Israel’s Unit 8200 raises questions about the extent of collaboration between tech giants and defense establishments.

Guy Rosen, Chief Information Security Officer of Meta [source]

The ramifications of Lavender’s deployment are profound, with critics condemning its high civilian casualty rate. Israeli officials’ admission of targeting ‘suspects’ within the confines of their homes, including innocent civilians and children, underscores the ethical quandaries posed by AI-enabled warfare. [source]

3.2 Elon Musk and Israeli Military AI Discussion

Elon Musk’s discussions with Israeli Prime Minister Benjamin Netanyahu highlight the critical intersection of technology and national security. A seemingly inconspicuous reference in a government readout reveals Musk’s deeper engagement with Israeli officials on AI’s security implications.

This meeting, attended by senior security establishment officials, underscores the strategic importance attributed to AI advancements in safeguarding national interests. The involvement of high-level security personnel hints at the significant role AI is expected to play in national defense strategies. [source]

Meanwhile, SpaceX’s collaboration with Israel in launching the EROS C3 reconnaissance satellite highlights Musk’s broader involvement in AI-driven endeavors. Grok reports that SpaceX uses AI to optimize trajectories and manage satellite networks across its operations.

The EROS C3 satellite, entrusted to SpaceX for deployment, is a cornerstone of Israel’s intelligence infrastructure. It offers unparalleled capabilities in Earth observation, reinforcing the strategic significance of AI in modern reconnaissance and intelligence gathering.

Developed by Israel Aerospace Industries (IAI) and operated by ImageSat International, the EROS C3 satellite represents cutting-edge reconnaissance technology. Its high-resolution imagery benefits governmental and business applications. Integrated with a sophisticated space camera from Elbit Systems, the satellite captures detailed imagery essential for diverse missions. [source]

Elon Musk, center, and Benjamin Netanyahu, right, in Kfar Aza on Nov. 27. [source]

4. Escalation of Military Response 

After the October 7 attack by Hamas-led militants, which resulted in approximately 1,200 deaths and 240 abductions in southern Israeli communities, the Israeli military adopted a markedly different strategy.

Under “Operation Iron Swords,” the army broadened its targeting criteria to include all members of Hamas’ military wing as potential targets, irrespective of their rank or direct involvement in combat activities. This shift represented a significant escalation in military response. [source]

This strategy led to a high casualty rate among Palestinians, predominantly women, children, and non-combatants, particularly in the initial weeks of the conflict, when targeting decisions were driven by the AI systems.

5. Challenges in Intelligence Operations 

This change also introduced a complex challenge for Israeli intelligence. Previously, the process to authorize the assassination of a high-value target involved a detailed “incrimination” procedure. This included verifying that the individual was a senior Hamas member, determining their residence, gathering contact information, and pinpointing their real-time location.

When the focus was on a limited number of senior operatives, intelligence personnel could manage the detailed work required for each target. However, expanding the list of targets increased the complexity and scope of intelligence operations.

As the targeting criteria broadened to include tens of thousands of lower-ranking operatives, the Israeli army increasingly relied on automated software and artificial intelligence to handle the expanded scope of potential targets. This shift significantly reduced the role of human personnel in the verification process, allowing AI to take over most of the decision-making in identifying military operatives. [source]

6. Authorization of Lavender Kill Lists 

Approximately two weeks into the conflict, after manually reviewing the accuracy of a randomly selected sample of several hundred targets identified by the AI system, authorities granted approval to fully implement Lavender’s kill lists. This sample demonstrated that Lavender achieved 90 percent accuracy in verifying an individual’s affiliation with Hamas.
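The spot check described above is, in general terms, a sample-based accuracy estimate: review a random sample of a system’s outputs and report the share that human reviewers confirm. The Python sketch below illustrates that general procedure only; the sample size, data, and function names are hypothetical, and nothing public describes how the actual review was implemented.

```python
import random

def sampled_accuracy(flagged_ids, is_confirmed, sample_size=300):
    """Review a random sample of flagged IDs and return the confirmed share."""
    sample = random.sample(list(flagged_ids), min(sample_size, len(flagged_ids)))
    return sum(map(is_confirmed, sample)) / len(sample)

# Synthetic example: 90% of flags are correct by construction.
population = [("id%04d" % i, i % 10 != 0) for i in range(5000)]
labels = dict(population)
print(round(sampled_accuracy(labels.keys(), labels.get, 300), 2))  # ~0.9
```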

Consequently, the army extensively used the system, treating Lavender’s identification of an individual as a Hamas militant as definitive. This eliminated the need for human personnel to independently verify the reasons behind the AI’s decisions or to scrutinize the underlying intelligence data.

The relaxation of targeting restrictions in the early stages of the war had severe consequences. According to figures from the Palestinian Health Ministry in Gaza, which has been the Israeli army’s primary data source since the war’s outset, about 15,000 Palestinians were killed in just the first six weeks of the conflict. This number represents nearly half of the total casualties reported up to that point, leading up to a ceasefire on November 24. [source]

6.1 Utilizing Advanced Tracking Systems 

The Lavender system extensively uses data collected through mass surveillance of the nearly 2.3 million residents of the Gaza Strip to assess and rank each person’s potential involvement with the military wings of Hamas or PIJ. Lavender assigns each individual a probability score from 1 to 100, estimating the likelihood of their militant involvement.

In essence, Lavender operates by learning to recognize characteristics associated with known operatives from Hamas and PIJ, using this information as training data. It then searches for these traits among the general population.

Individuals who exhibit a combination of these incriminating features receive a higher score, and thus, authorities are more likely to flag them as potential targets for assassination.

Features that might elevate a person’s score include membership in a WhatsApp group with a known militant, frequently changing cell phones, and frequently moving residences. Taken together, such features could indicate potential links to militant activities. [source]
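Lavender’s actual features, weights, and model architecture have not been published, so the Python sketch below is purely illustrative of the general concept the reporting describes: a weighted-feature model that maps binary traits to a 1-to-100 score. Every feature name and weight here is a hypothetical placeholder, not a claim about the real system.

```python
# Illustrative only: a generic weighted-feature scoring model. All names
# and numbers are invented placeholders; a real system would learn its
# weights from training data rather than use fixed values like these.
FEATURE_WEIGHTS = {
    "shared_group_with_known_operative": 0.35,
    "frequent_phone_changes": 0.25,
    "frequent_address_changes": 0.20,
}

def risk_score(features: dict[str, bool]) -> int:
    """Map a set of binary features to a 1-100 score."""
    raw = sum(w for name, w in FEATURE_WEIGHTS.items() if features.get(name))
    total = sum(FEATURE_WEIGHTS.values())
    return max(1, round(100 * raw / total))

# Example: two of the three hypothetical features present.
print(risk_score({
    "shared_group_with_known_operative": True,
    "frequent_phone_changes": True,
    "frequent_address_changes": False,
}))  # -> 75
```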

6.2 Impact on Civilian Populations 

In the current conflict, officers were instructed not to independently verify the assessments made by the AI system. This measure aimed to streamline the process and enable the rapid generation of human targets. Internal acknowledgments revealed that Lavender was only about 90% accurate.

The system sometimes erroneously targeted individuals whose communication patterns resembled those of known militants: police and civil defense workers, militants’ relatives, individuals sharing a name or nickname with a militant, and users of devices previously owned by militants. [source]

7. Shift in Israeli Military Strategy 

Moreover, the sole protocol for human supervision before executing strikes on homes of suspected “junior” militants identified by Lavender involved verifying the gender of the target. Indeed, the underlying assumption was that if the target was female, the machine had likely erred, as there are no female members in the military wings of Hamas. This protocol reflects a significant reliance on the AI’s decision-making, with minimal human oversight integrated into the process. 

The next phase in the Israeli army’s assassination protocol involves pinpointing locations for strikes on targets identified by the AI system, Lavender. Despite official statements, a significant reason for the high death toll from the ongoing bombardment is the army’s methodical choice to attack these targets at their private residences, often alongside their families. This approach is partly due to the ease with which automated systems can mark family homes for bombing. [source]

7.1 Consequences of Targeting Tactics 

Contrary to instances where Hamas operatives have conducted military activities from civilian locations, the systematic assassination strikes have predominantly targeted suspected militants within civilian households, where no military activity was occurring. This decision reflects the operational framework of Israel’s mass surveillance systems in Gaza, which facilitate easy linkage of individuals to their family homes.

Developers have created advanced software systems, including one named “Where’s Daddy?”, to monitor and automatically notify when individuals enter their homes, enabling precise timing for bombings. The proportion of families entirely decimated within their homes during this conflict significantly exceeds those in the 2014 conflict, indicating a notable escalation in the use of this tactic. [source]

7.2 “Where’s Daddy?” tracking system

As the pace of assassinations slowed, authorities integrated additional targets into tracking systems like “Where’s Daddy?” to identify individuals entering their homes, making them susceptible to airstrikes. Relatively low-ranking officers within the military hierarchy could make the decision on whom to include in these tracking systems.

Initially, during the first two weeks of the conflict, “several thousand” targets were entered into tracking programs such as Where’s Daddy? These included members of Hamas’ elite special forces unit, the Nukhba, anti-tank operatives, and those who entered Israel on October 7. However, this list quickly expanded to include a much broader range of targets. [source]

8. Israeli Military AI & Tactical Decision-Making  

The combination of Lavender and systems like Where’s Daddy? proved devastating, resulting in the deaths of entire families. Adding a name from Lavender’s lists to the Where’s Daddy? home-tracking system placed that individual under continuous surveillance, making them vulnerable to an airstrike as soon as they returned home, often resulting in the collapse of the entire structure and everyone inside.

After Lavender identified a target, and authorities confirmed that the target was male and tracking software located them in their home, the next step was selecting the munition for the airstrike.

Junior operatives marked by Lavender were typically targeted with “dumb bombs” to conserve more expensive armaments. This meant that if a junior target resided in a high-rise building, the army would refrain from using a more precise and costly “floor bomb” that could have minimized collateral damage. However, if the target was in a low-rise building, the army authorized the use of a dumb bomb, potentially killing all occupants of the building. [source]

9. Policy Changes and International Pressure 

During the initial weeks of the war, when attacking junior operatives, including those marked by AI systems like Lavender, authorities established a fixed allowance of civilians who could be killed alongside each target, reportedly set at up to 15 to 20.

Now, partly due to American pressure, the Israeli army is no longer mass-generating junior human targets for bombing in civilian homes. This change has also curtailed the army’s reliance on intelligence databases and automated house-locating programs. [source]

10. Humanitarian Consequences of Israeli military AI 

In the aftermath of one such strike on a camp, witnesses recalled the grim task of recovering bodies from the rubble, with around 50 dead and approximately 200 wounded, many critically, on the first day alone. Residents of the camp spent five days searching for and rescuing the dead and injured.

Nael Al-Bahisi, a paramedic, was among the first responders and counted between 50 and 70 casualties on the initial day.

Similarly, in mid-December, the army targeted a high-rise building in Rafah in an attempt to assassinate Mohammed Shabaneh, the commander of Hamas’ Rafah Brigade. This strike resulted in the deaths of “dozens of civilians,” although it remains unclear whether Shabaneh was killed in the attack. Often, senior commanders hide in tunnels beneath civilian buildings, making airstrikes targeting them inevitably lethal for civilians.

11. Israeli Military AI: Accuracy and Verification 

The Israeli army’s estimation of civilian casualties alongside each target relied on automatic and often inaccurate tools. In previous conflicts, intelligence personnel meticulously verified the number of individuals in a targeted house, with potential civilian casualties documented in a “target file.” However, after October 7, this thorough verification process was largely replaced by automation.

In October, The New York Times reported on a system operated from a special base in southern Israel, which gathered data from mobile phones in the Gaza Strip to provide real-time estimates of the number of Palestinians migrating from northern Gaza to the south. This system operates on a color-coded basis: red indicates areas with high population density, while green and yellow signify areas that are relatively depopulated. [source]
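As a rough illustration of such a color-coded readout, the sketch below buckets a population-density estimate into the red/yellow/green scheme the report describes. The thresholds are invented for illustration; the actual system’s inputs and cutoffs have not been published.

```python
# Illustrative only: bucketing a density estimate into the color scheme
# described in the report (red = dense, green = relatively depopulated).
# The numeric cutoffs below are hypothetical.
def density_color(estimated_people_per_km2: float) -> str:
    if estimated_people_per_km2 > 5000:   # hypothetical cutoff
        return "red"
    if estimated_people_per_km2 > 1000:   # hypothetical cutoff
        return "yellow"
    return "green"

print(density_color(8200))  # -> "red"
print(density_color(300))   # -> "green"
```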

11.1 Impact on Operational Decisions 

A similar system is used to calculate collateral damage, assisting in decisions regarding airstrikes on buildings in Gaza. This software initially calculated the number of civilians residing in each household before the onset of the conflict. For instance, if the army estimated that half of a neighborhood’s residents had evacuated, the program would adjust the count accordingly, considering a house with 10 residents as housing five people.
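The occupancy adjustment described here is simple arithmetic: scale a pre-war household count by the share of the area believed to have evacuated. The sketch below merely restates the paragraph’s worked example (10 residents, half the neighborhood evacuated, counted as 5) in code; the real software’s inputs and logic are not public.

```python
# Minimal sketch of the reported occupancy arithmetic: scale a pre-war
# household count by an area-level evacuation estimate.
def estimated_occupancy(prewar_residents: int, evacuated_fraction: float) -> float:
    return prewar_residents * (1.0 - evacuated_fraction)

# The article's example: 10 residents, half the neighborhood evacuated.
print(estimated_occupancy(10, 0.5))  # -> 5.0
```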

In some instances, there was a significant delay between the alert from tracking systems like Where’s Daddy? indicating that a target had entered their house and the subsequent airstrike. This delay often resulted in the tragic deaths of entire families, even if the intended target was not hit.

11.2 Changes in Post-Strike Procedures 

In previous conflicts in Gaza, Israeli intelligence typically conducted post-strike bomb damage assessments (BDAs) following the assassination of human targets. These assessments aimed to confirm whether the senior commander was killed and to determine the number of civilian casualties. However, in the current conflict, particularly concerning junior militants identified using AI, authorities abandoned this procedure to expedite operations.

While the Israeli military may quickly move on from each strike, the impact lingers for civilians like Amjad Al-Sheikh, a resident of Shuja’iya who lost 11 family members in a December 2 bombardment. Al-Sheikh and his neighbors continue to search for the remains of their loved ones amidst the devastation.

12. Conclusion

The integration of AI systems such as the Gospel and Lavender into Israeli military operations in Gaza represents a paradigm shift in modern warfare, with profound implications for civilian populations and international norms of conflict conduct. While these technologies promise to enhance precision and efficiency in targeting, their deployment has led to alarming levels of civilian casualties and raised serious questions about accountability and oversight.

As conflicts continue to evolve in complexity and intensity, it is imperative for policymakers, military leaders, and the international community to engage in robust dialogue and regulation to mitigate the humanitarian impact of AI-driven warfare. Only through responsible and ethical utilization of technology can we hope to navigate the complexities of contemporary conflicts while upholding the principles of human rights and international law.
