‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man whom we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

  • Milk_Sheikh@lemm.ee · 7 months ago

    “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

    … the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant… The IDF judged it permissible to kill more than 100 civilians in attacks on top-ranking Hamas officials… earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.

    So, to recap:

    1. They knew, or had a very good idea of, the death toll their actions would have.
    2. There was a conscious decision (born, it seems, from apathy and indifference) to target the homes of militants instead of the individuals directly.
    3. All this activity has operational oversight and sign-off: this is not a rogue unit or drone operator, but a policy course that is being pursued even now.

    Given the recent triple airstrike on the WCK aid convoy, quotes like this help us see behind the curtain and speak to their mindset:

    “…you don’t want to invest manpower and time in it”. They said that in wartime there was insufficient time to carefully “incriminate every target”. “So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it,” they added.

    Sure, you get to live with it.