Air Canada must pay damages after chatbot lies to grieving passenger about discount – Airline tried arguing virtual assistant was solely responsible for its own actions
  • Fushuan [he/him]@lemm.ee
    9 months ago

    If it’s integrated into their service, then unless there’s a disclaimer the customer has to accept before using the bot, the company is the one telling the customer that whatever the bot says is true.

    If I contract a company to do X and one of their employees fucks shit up, I will seek damages from the company, and they will have to deal with the worker internally. The bot is the worker in this instance.

    • lunar17@lemmy.world
      9 months ago

      So what you’re saying is that companies will start hiring LLMs as “independent contractors”?

      • Fushuan [he/him]@lemm.ee
        9 months ago

        No, the company contracted the service from another company, but that’s irrelevant. I’m saying that in any case, a company is responsible for any service it provides unless there’s a disclaimer — whether that service is a chatbot, a ticketing system, a store, or its workers.

        If an Accenture contractor fucks up, the one liable to the client is Accenture. Now, Accenture may sue the worker, but that’s beside the point. If a store mismanages its products and sells the wrong items or lists incorrect prices, you go after the store chain, not the individual store, nor the worker. If a ticketing system takes your money but sends you an invalid ticket, you complain to the company that operates it, not the one that programmed it.

        It’s pretty simple actually.