Google AI making up recalls that didn’t happen
  • micka190@lemmy.world · 6 months ago

    We had a case in Canada where Air Canada was forced to give a customer a refund after its chatbot told him he was eligible for one, because the tribunal ruled that Air Canada was responsible for what its AI said.

    So, maybe?

    I’ve seen some legal experts talk about how Google basically got away from misinformation lawsuits because it wasn’t creating the misinformation; it was just giving you search results that contained it, which wasn’t its fault, and it was making an effort to combat those kinds of results. They were saying the outcome of those lawsuits might be different if Google’s AI is the one creating the misinformation, since that’s on Google.

    • SpaceCowboy@lemmy.ca · 6 months ago

      Yeah, the Air Canada case probably isn’t a big indicator of where the legal system will end up on this. The guy was entitled to some money if he submitted the request on time, and the only reason he didn’t was that the chatbot gave him the wrong information. It’s the kind of case that shouldn’t have reached a courtroom: come on, you’re supposed to give him the money anyway; it was just a paperwork screwup caused by your chatbot that created this whole problem.

      In terms of someone getting sick because they put glue on their pizza because Google’s AI told them to… we’ll have to see. The courts may go with “a reasonable person should know that the things an AI says aren’t always fact,” which will probably hold water if Google keeps a disclaimer on its AI-generated results.