‘The Gospel’: how Israel uses AI to select bombing targets in Gaza
  • Ultraviolet@lemmy.world · 11 months ago

    There should be some sort of law: if you want to offload decisions to an AI, the person who decides to let the AI make those decisions has to step up and take full civil and criminal liability for everything it does.

      • jonne@infosec.pub · 11 months ago

        Yes, one person we can pin all of humanity’s sins on, and then we just kill them. It’s almost like a religious ritual.

    • ForgotAboutDre@lemmy.world · 11 months ago

      No, every decision maker in the chain of command should be responsible. They should know what the intelligence is based on and whether the people sharing it are competent, and they should be validating the information themselves.

      Using AI to perform these tasks requires gross negligence at several stages. However, it does appear that killing civilians and children is the intended outcome, so negligence about the AI is likely just a cover.