Artificial intelligence (AI) has advanced to the point where it is now a reliable tool for many everyday tasks. But while some find that prospect blissful, understanding its disturbing side is essential. The weaponization of AI to target specific populations has become a feature of modern warfare, nowhere more visibly than in Israel, where much of the destruction in Gaza since the October 7th attacks has been aided, and frequently directed, by AI systems. The promise of AI is often centred on two key advantages: speed and accuracy. The expectation is that AI enables pinpoint strikes; yet with over 34,000 Palestinians dead, compared with just over 1,400 during Israel’s 2008–09 war in Gaza, something different and far more lethal is evidently going on.
The best-known example of AI use in Israel is the Iron Dome, a defensive system designed to intercept incoming rockets and missiles. In April 2024, this technology effectively defended Israel against Iran’s drone and missile attack. Another notable application is Smart Shooter’s SMASH, an AI-assisted rifle sight that uses image-processing algorithms to lock onto targets, much like a video game’s auto-aim.
Israel also uses AI to monitor Palestinians in the occupied territories. Movements through hundreds of checkpoints are logged, with facial images and other biometrics matched against a database. Recent investigations by the Israeli publications +972 Magazine and Local Call have shed light on AI systems selecting bombing targets in Gaza, exposing a far more complex and frightening use of the technology.
How does the AI choose its targets?
One of the primary systems identified is Gospel, which generates bombing targets among Gaza’s buildings and structures. A platform called Alchemist discovers potential targets by fusing data from numerous sources, including surveillance feeds and historical records. Alchemist collects and uploads this data to another system, Fire Factory, which sorts it into four categories: tactical targets, underground targets, operatives’ family homes, and residential or high-rise buildings containing civilians, known as power targets (see the illustrative sketch after the list below). Within the first five days of the war, half of all recognised targets fell into the power target category.
Tactical targets: armed militant cells, weapons warehouses, rocket launchers, militant headquarters.
Underground targets: tunnels in which Hamas operatives live and operate.
Family homes: the homes of suspected Hamas operatives.
Power targets: residential or high-rise buildings housing civilians.
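None of these systems’ internals are public, but the reporting outlines a clear pipeline: fuse intelligence, sort it into the four categories above, and hand the result to a target-generation model. Below is a minimal, purely hypothetical Python sketch of what such a categorisation step could look like; every name, field, and rule is an assumption made for illustration, not a description of the actual software.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TargetCategory(Enum):
    """The four categories the +972/Local Call reporting attributes to Fire Factory."""
    TACTICAL = auto()      # militant cells, weapons warehouses, launchers, headquarters
    UNDERGROUND = auto()   # tunnels
    FAMILY_HOME = auto()   # operatives' family homes
    POWER = auto()         # residential/high-rise buildings with civilians

@dataclass
class IntelRecord:
    """Hypothetical fused record built from surveillance and historical data."""
    site_id: str
    is_tunnel: bool
    linked_operative: str | None  # name of a suspected operative, if any
    is_residential: bool

def categorise(record: IntelRecord) -> TargetCategory:
    # Hypothetical rule ordering; the real system's logic is not public.
    if record.is_tunnel:
        return TargetCategory.UNDERGROUND
    if record.is_residential and record.linked_operative:
        return TargetCategory.FAMILY_HOME
    if record.is_residential:
        return TargetCategory.POWER
    return TargetCategory.TACTICAL
```

The point of the sketch is how little separates the categories: under rules like these, a single boolean flag is all that moves a building from a military target to a family home.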
Once generated, the data passes through Gospel, which produces the final target recommendations. The Israeli army has systematically struck the people so identified while they were at home, usually at night with their entire families present, rather than during military activity. According to Yuval Abraham, the Israeli journalist who reported the story for +972 Magazine, this was because, from an intelligence standpoint, it was simply easier to locate individuals in their homes. Additional automated systems, including one called “Where’s Daddy?” revealed in the same investigation, were used to track the targeted persons and trigger strikes once they entered their family homes.
Who has been targeted so far?
According to the reports, the purpose of power targets is to exert civil pressure on Hamas. The Gospel system produces suggested targets, recommended munitions, and collateral damage estimates. More worrying still is the Lavender system, which marks specific individuals based on historical and surveillance data and has generated as many as 37,000 suspected Hamas and Islamic Jihad targets. Sources say approximately 10% of these designations are wrong; taken at face value, that is roughly 3,700 people misidentified. The definition of a Hamas operative, meanwhile, has been loosened to include civil society figures who are merely in contact with Hamas.
Lavender links targets to their individual family homes and recommends weaponry based on the operative’s rating. Strikes on low-ranking militants are assigned cheaper, unguided bombs, which increase collateral damage. Reports indicate that for each junior Hamas operative targeted, it was deemed acceptable to kill up to 15–20 civilians, and for some senior targets the permissible toll was as high as 300. In a recent operation to rescue Israeli hostages from Hamas, almost 300 Palestinians were killed.
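The reporting implies a starkly simple rating-driven policy: lower-rated targets get cheaper unguided munitions and a per-strike civilian-casualty ceiling of roughly 15–20, rising to about 300 for the most senior targets. Here is a purely hypothetical sketch of that decision logic; the thresholds come from the figures quoted above, and everything else, including the function and its parameters, is assumed for illustration rather than drawn from any real system.

```python
def select_strike_plan(rating: float, is_senior: bool) -> dict:
    """Hypothetical mapping from an operative's Lavender-style rating
    (0.0-1.0) to a munition choice and an allowed civilian-casualty
    ceiling, using the numbers reported by +972/Local Call. The real
    criteria are not public."""
    if not is_senior and rating < 0.5:
        # Junior operative: cheap unguided ("dumb") bomb, with a
        # reported ceiling of roughly 15-20 civilians per strike.
        return {"munition": "unguided", "max_civilian_casualties": 20}
    # Senior target: guided munition, with reported ceilings for some
    # targets reaching as high as 300 civilians.
    return {"munition": "guided", "max_civilian_casualties": 300}
```

Even in this toy form, the asymmetry is visible: the cheapest, least accurate weapons are reserved for the strikes with the least oversight.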
AI systems generate predictions, not facts, and their accuracy depends heavily on the quality and interpretation of the data fed into them. The IDF has denied allegations of a policy to kill tens of thousands of civilians, stressing that human analysts conduct independent assessments before targets are approved. Some sources, however, indicate that the human review before striking a junior operative’s home was often limited to confirming that the target was male.
Reducing Palestine to nothing but a graveyard
In November 2023, the US released an international framework for the responsible military use of AI. Around 50 countries have endorsed it; Israel is not among them, and hardly anyone seems bothered. This raises a critical question: is Gaza becoming a testing ground for future AI technologies, and will the world sit on its hands while the weaponization of technology reduces Palestine to nothing but a graveyard?