New AI systems collide with copyright law - eviltoast
  • DekkerNSFW@kbin.social
    1 year ago

    There’s usually nothing left of the original image. But sometimes a specific image shows up in the dataset many times over and gets overtrained, which is why you can get a pretty close copy of The Starry Night out of vanilla SD. But yeah, it’s not tracing.
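One common mitigation for the duplicate-driven memorization described above is deduplicating the training set before training. A minimal sketch of the idea (illustrative only, not from the thread — the grid size, hash scheme, and threshold are all assumptions): compute a tiny perceptual "average hash" per image and flag pairs whose hashes differ in only a few bits.

```python
# Illustrative sketch: average-hash ("aHash") deduplication over 8x8
# grayscale grids, a simplified stand-in for the perceptual hashing that
# dataset curators use to drop near-duplicate images before training.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int hash.

    Each bit records whether that pixel is brighter than the image mean,
    so uniform brightness shifts leave the hash unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def find_near_duplicates(hashes, max_dist=5):
    """Return index pairs whose hashes differ in at most max_dist bits."""
    dupes = []
    for i in range(len(hashes)):
        for j in range(i + 1, len(hashes)):
            if hamming(hashes[i], hashes[j]) <= max_dist:
                dupes.append((i, j))
    return dupes

# Toy data: two nearly identical "images" (one is a slightly brightened
# copy of the other) and one unrelated image.
img_a = [[10 * (r + c) % 256 for c in range(8)] for r in range(8)]
img_b = [[min(255, v + 3) for v in row] for row in img_a]
img_c = [[(r * c * 7) % 256 for c in range(8)] for r in range(8)]

hashes = [average_hash(img) for img in (img_a, img_b, img_c)]
print(find_near_duplicates(hashes))  # flags only the near-identical pair
```

Real pipelines use more robust schemes (e.g. DCT-based hashes or learned embeddings), but the principle is the same: if an image can only appear once, the model has far less opportunity to memorize it.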

    • FaceDeer@kbin.social
      1 year ago

      Those instances are considered a flaw, and trainers work hard to prevent them. When they do occur, you have to know they’re in there in order to dredge them back out.