Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

      • dan@lemm.ee · 1 year ago · +13/-5

        No, but a lazy copy of someone else’s work might be copyright infringement.

        • Odusei@lemmy.world · 1 year ago · +7/-4

          So when does Kevin Costner get to sue James Cameron for his lazy copy of Dances With Wolves?

          • dan@lemm.ee · 1 year ago · +6/-3

            Idk, maybe. There are thousands of copyright infringement lawsuits, and sometimes the plaintiffs win.

            I don’t necessarily agree with how copyright law works, but that’s a different question. It doesn’t change the fact that you can sometimes successfully sue for copyright infringement when someone copies your stuff to make something new.

          • tenitchyfingers@lemmy.world · 1 year ago · +2

            Why not? Hollywood is full to the brim with people suing for copyright infringement. And sometimes they win. Why should it be different for AI companies?

    • lily33@lemmy.world · 1 year ago (edited) · +7/-3

      Language models actually do learn things, in the sense that the information encoded in the trained model isn’t usually* taken directly from the training data; rather, it’s new information that describes the training data. That’s why they can generate text that has never appeared in the data.

      * The bigger models do seem to remember some of the training data and can reproduce it verbatim, but that’s not really the goal.
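      A minimal sketch of that idea, using a toy character-level bigram model rather than anything resembling a production LLM (the corpus and numbers below are invented for illustration): the only thing the “model” stores is a table of statistics about the text, yet sampling from that table can produce strings that never occur in the training text.

      ```python
      # Toy character-level bigram "model": it stores counts of which character
      # follows which, i.e. statistics ABOUT the corpus, not the corpus itself.
      import random
      from collections import defaultdict

      corpus = "the cat sat on the mat. the dog sat on the rug."

      # "Training": tally next-character counts for every adjacent pair.
      counts = defaultdict(lambda: defaultdict(int))
      for prev, nxt in zip(corpus, corpus[1:]):
          counts[prev][nxt] += 1

      def sample(start="t", length=30):
          """Generate text by repeatedly sampling a next character from the counts."""
          out = start
          for _ in range(length):
              options = counts[out[-1]]
              if not options:
                  break
              chars, weights = zip(*options.items())
              out += random.choices(chars, weights=weights)[0]
          return out

      random.seed(0)
      text = sample()
      print(text)            # e.g. "the mat. the rug. the cat sat o" (varies by seed)
      print(text in corpus)  # usually False: generated from statistics, not copied
      ```

      Scaled up by many orders of magnitude, the same kind of statistical model can also end up reproducing rare training strings verbatim, which is the memorization caveat in the footnote above.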
    • Chailles@lemmy.world · 1 year ago · +3/-1

      What does inspiration have to do with anything? And to be honest, humans being inspired has led to far more blatant copyright infringement.

      As for learning, they do learn. No different from us, except we learn silly abstractions to make sense of things while AI learns from trial and error. Ask any artist whether they’ve ever looked at someone else’s work to figure out how to draw something: even if they’re not explicitly looking up a picture, if they’ve ever seen a depiction of it, they recall and use it. Why is it wrong if an AI does the same?