A notable point in here, particularly given the recent WCK (World Central Kitchen) murders:

In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

  • Gaywallet (they/it)@beehaw.orgM · 24 points · 8 months ago

    They were going to kill these people whether an AI was involved or not, but the decision certainly becomes a lot easier when you’re just signing off on a call someone else already made. The level of abstraction made certain choices easier. After all, if the system is known to be occasionally wrong, and everyone seems to know it, yet you’re still using it, is that not some kind of implicit acceptance?

    One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

    It also doesn’t surprise me that when you’ve demonized the opposition, it becomes a lot easier to just be okay with “casualties” that have nothing to do with your war. How many problematic fathers out there are practically disowned by their children for their shitty beliefs? Even if there were none, it still wouldn’t justify killing someone at home because it’s ‘easier.’

    Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.

    All in all, this is great investigative reporting, and it’s absolutely tragic that this kind of shit is happening in the world. You don’t need this piece to recognize that a genocide is happening, and it shouldn’t detract from the genocide in any way.

    As an aside, I also hope it might get people to wake up and realize we need to regulate AI more. Not that regulation will probably ever stop the military from using AI, but this kind of use should really highlight the potential dangers.

    • luciole (he/him)@beehaw.org · 16 points · 8 months ago

      Step 6 is baffling. They bomb the Hamas operative’s family home, but they don’t bother checking whether their target is even there at the time of the strike - let alone try to minimize civilian deaths. Then, once the residential building is destroyed, they don’t even care to find out whether they actually killed their target. The declared objective and the methods employed barely align.

      • Gaywallet (they/it)@beehaw.orgM · 10 points · 8 months ago

        When you abstract out pieces of the puzzle, it’s easier to ignore whether all the parts are working, because you’ve eliminated the necessary interchange of information between the parties involved in the process. This is a problem we run into frequently in medicine, and even in a field that collaborative we still screw it up all the time.

        In the previous process, intelligence officers were involved at multiple steps to validate whether someone was a target, to validate information about the target, and so on. When you let a machine do it instead, and shift the burden from those officers to someone without the same skill set, whose only task is to review information from a source they’ve been told is competent and click yes or no, you lose the connection between this step and the next.

        The same could be said, for example, about someone who has the technical proficiency to create new records, sheets, displays, and so on in an electronic health record. A particular doctor might come along and request a new page to make their workflow easier. Without appropriate governance in place, and without people whose job is to observe the entire process, you can end up with every doctor creating their own custom sheet, so that all of their patient information is siloed to each doctor’s workflow. Downstream processes, such as the patient coming back to the same healthcare system, going to pick up a prescription, or being sent to imaging, pathology, or the lab, could then be compromised by this short-sighted approach.

        For fields like the military, which perhaps aren’t used to this kind of collaborative work, I can see how segmenting a workflow into individual units to increase the speed or efficiency of each step could seem like a way to make things much better, when in reality there is no focus on the quality of what comes out the other end. This kind of misstep is extremely common in applications of AI, because AI is usually dropped in wherever there are bottlenecks. As stated in the article:

        “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

        The goal here is purely to optimize for capacity (how many targets you can generate per day) rather than for a combination of quality and capacity. You want a lot of targets? I can spit out the name of every resident in your country in a very short period of time. The quality in this case (how likely each one is to be a member of Hamas) will unfortunately be very low.
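
        As a rough, purely illustrative sketch of that quality problem, here is the arithmetic using only the figures the article itself reports (a ~10 percent error rate, roughly 37,000 people marked, and a 15-20 civilian “collateral” allowance per junior target); it says nothing about how Lavender actually works internally:

          # Back-of-envelope arithmetic from the article's reported figures; illustrative only.
          marked = 37_000       # Palestinians marked by Lavender as suspected militants (per the article)
          error_rate = 0.10     # share of markings the sources regarded as "errors" (per the article)

          wrongly_marked = marked * error_rate
          print(f"People potentially marked in error: {wrongly_marked:,.0f}")  # ~3,700

          # The article also reports an allowance of 15-20 permissible civilian deaths per
          # junior target. Assuming (hypothetically) that every erroneous marking were struck
          # with the full allowance - something the article does not quantify - the range would be:
          low, high = 15, 20
          print(f"Collateral allowance attached to those errors alone: "
                f"{wrongly_marked * low:,.0f} to {wrongly_marked * high:,.0f}")

        Even taking the system’s claimed accuracy at face value, the absolute numbers are enormous, which is exactly what optimizing for capacity over quality produces.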

        The reason it’s so fucked up is that a lot of it is abstracted yet another level away from the decision makers. Ultimately it is the AI that’s making the decision; they are merely signing off on it. And they weren’t involved in signing off on the AI itself, so why should they question it? It’s a dangerous road - one where it becomes increasingly easy to let mistakes happen, except in this case the mistakes are counted in the innocent lives you killed.

    • t3rmit3@beehaw.org · 16 points · 8 months ago (edited)

      Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.

      This kind of flippant, humorous treatment of the murder of families (and, given the name, specifically of children) is literally Nazi shit.

      • qdJzXuisAndVQb2@lemm.ee · 7 points · 8 months ago

        I genuinely scrolled up to double-check that the post wasn’t about an Onion article or something. Unreal callousness.

      • derbis@beehaw.org · 2 points · 7 months ago

        There are reports that low-level Nazis involved in the Holocaust drank themselves stupid, wracked with guilt. Meanwhile, the IDF thinks they’re a bunch of comedians.

  • t3rmit3@beehaw.org · 14 points · 8 months ago (edited)

    the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

    Yes, we know this already. We know it because they leveled an entire block of apartment buildings and then, as justification, claimed that they were targeting one Hamas commander:

    At least 50 people were killed after six Israeli air raids hit a residential area of the [Jabalia] camp on Tuesday (October 2023). A Hamas statement said there were 400 dead and injured in the attack. Casualty figures for Wednesday’s attacks are still not known.

    An Israeli military statement said that the attacks on Jabalia had killed Hamas commander Ibrahim Biari. The military believes Biari played a pivotal role in the planning and execution of the Hamas attack on southern Israel on October 7.

    “There was a very senior Hamas commander in that area,” Israeli army spokesperson Lieutenant Colonel Richard Hecht told CNN.

  • esaru@beehaw.org · 2 points · 8 months ago

    Lavender has been fed information that targeted people often hide in healthcare facilities. The threshold for labeling anything related to medical or humanitarian support as a target is obviously passed easily. A food truck that we were informed about and gave clearance for? Doesn’t matter, the threshold is passed, Lavender says, so … approval stamp for the drone attack!

  • delirious_owl@discuss.online · 2 points · 7 months ago (edited)

    Looks like the article is not accessible on Tor. Here’s as much of the article as I can paste before reaching Lemmy’s max character limit:

    ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

    The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

    By Yuval Abraham | April 3, 2024

    In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

    Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

    Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

    During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

    Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.

    Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)

    The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.

    “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

    The Lavender machine joins another AI system, “The Gospel,” about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military’s own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.

    In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as “dumb” bombs (in contrast to “smart” precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”

    In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

    Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)

    The following investigation is organized according to the six chronological stages of the Israeli army’s highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the “Where’s Daddy?” system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how “dumb” bombs were chosen to strike these homes.

    Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the amount of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.

    STEP 1: GENERATING TARGETS

    ‘Once you go automatic, target generation goes crazy’

    In the Israeli army, the term “human target” referred in the past to a senior military operative who, according to the rules of the military’s International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel’s previous wars, since this was an “especially brutal” way to kill someone — often by killing an entire family alongside the target — such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.

    (max char reached). Read the entire article here (mirror)

    • DdCno1@beehaw.org · 5 points · 8 months ago

      It is not a war crime to kill commanding officers in a war. On the contrary, this is a common and widely accepted strategy that can reduce the number of overall casualties, since it often motivates lower ranks to surrender.

      • Mycatiskai@lemmy.ca · 3 points · 8 months ago

        It’s too bad that killing a commander along with 100 civilians probably produces a net increase in the number of fighters, because of the innocent people killed in the indiscriminate bombings and their surviving family members.

        • DdCno1@beehaw.org · 2 points · 8 months ago

          Independent Palestinian polls show that support for Hamas, and especially for violent struggle, is lower in Gaza than in the West Bank. This goes against your hypothesis. Also, as the war rages on, there has been a recent and rather dramatic uptick in the number of Gazans who support a two-state solution, a complete change from before. This indicates that the people directly affected by the war are becoming tired of it and want peaceful coexistence, not a continuation of the fighting.

          https://i.imgur.com/h5TlemJ.png

          https://i.imgur.com/jyP8jrT.png

          https://i.imgur.com/3JncEU4.png

          Source: https://www.pcpsr.org/en/node/969

          One might even go so far as to say that Israel is breaking the fighting spirit of Gazans. Israel is winning this war, slowly but surely, at a great cost - especially, but not only, the cost of Palestinian lives - but there will be no other outcome. Hamas could end it right now by admitting defeat, but they choose to let the people die until the bitter end while they hide in bunkers and in Qatar.

          One more thing: Striking a commander and hitting civilians with him is the opposite of an indiscriminate bombing. That’s still a targeted discriminate strike, even with additional civilian casualties. Indiscriminate would be carpet bombing - and Israel doesn’t even have aircraft that are capable of this.

          • Umbrias@beehaw.org · 3 points · 8 months ago

            “Civilians want to stop being bombed en masse, more open to demands of bombers” isn’t exactly the argument you think it is.

            Your comment also fully assumes that the goals of Israel are to reduce war sentiment in Palestine.

            Among other odd issues with this comment, by taking a hard-line ‘realism’ stance you’re ultimately not addressing the core issue people have with Israel’s choices.

            • DdCno1@beehaw.org · 1 point · 8 months ago

              “Civilians want to stop being bombed en masse, more open to demands of bombers” isn’t exactly the argument you think it is.

              It’s an observation. Not to mention, a two-state solution is the best thing Palestinians can ever hope for. At this point, it’s highly optimistic at best, but still more realistic than the genocidal “from the river to the sea” pipe dream.

              Your comment also fully assumes that the goals of Israel are to reduce war sentiment in Palestine.

              Not the goal, but it’s most certainly a goal. Before this war, I was under the impression that many Palestinians and their supporters failed to realize just how massive the difference in capabilities between the two sides is. Palestinian leadership decided to essentially poke the bear through horrendous massacres and rapes, hoping that the rest of the Arab world would join in before they were bombed back to the stone age. They were extra stupid doing this while a far-right coalition was in power. One can only call this a series of grave miscalculations that ordinary Palestinians will be paying for for decades to come.

              Among other odd issues in this comment, ultimately you’re not addressing the core issue people are taking with Israel’s choices by taking a hard-line ‘realism’ stance.

              So being realistic is a bad thing now?

              Here’s the deal: Israel had no other choice but to declare war over this. No other nation would have acted any differently. If you do not strike back at a pseudo-state that staged one of the worst terrorist attacks in history, you are inviting more attacks like these. If you make concessions in response, you are showing terrorists that terrorism works, also inviting more attacks. Even Denmark would have declared war in this kind of situation.

              The only valid point worth discussing is how they are choosing to fight this war. I feel like Israel is between a rock and a hard place and can only ever hope to choose the least terrible option - and since they are not infallible, they will not always manage that, and even when they do, it can still result in the suffering of civilians. War is awful, always has been, and people should get rid of the delusion that a clean war is even possible.

              I wish there were a different and far more moderate government in power in Israel, one far less callous about human lives, instead of one under the leadership of the Israeli equivalent of Donald Trump, but here’s the problem: after every single terrorist attack in the past, the Israeli public has moved further to the right. I’m sure Palestinian leadership knew this when they made their plans; I’m convinced they hoped for this, because it means the conflict will live on. Seemingly paradoxically, both the Israeli far right and the Palestinian leadership need this mess to remain unsolved, because they rely on it for power. Neither is capable of nor willing to actually solve problems, and both are largely in it for personal gain.

              • Umbrias@beehaw.org · 2 points · 7 months ago

                People often mistake convenient and unsympathetic rationalizations of their own views for realism.

                • DdCno1@beehaw.org · 1 point · 7 months ago

                  I would never have chosen this word myself to describe any of my stances, but since you did, even with quotation marks, it’s a bit odd that you’re now complaining about it.

                  Not that I was expecting much from your response, but that’s all you have to say?