A.I.’s un-learning problem: Researchers say it’s virtually impossible to make an A.I. model ‘forget’ the things it learns from private user data

I’m rather curious to see how the EU’s privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)

  • FaceDeer@kbin.social · 7 points · 1 year ago

    It’s more that they know enough about how it works to know it’s impossible to do. The data isn’t stored like files on a hard drive, as some discrete bundle of bytes that you could just find and erase. It’s stored as a distributed haze of weightings spread out over all of the nodes in the network, blended with all the other distributed hazes of everything else the AI knows. A court may as well order a human to forget a specific fact; memories are stored in a similar manner.
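
    A toy sketch of why there’s no discrete record to erase (a hypothetical numpy example, not from the article): train two tiny models whose training sets differ by a single record and compare their weights. The extra record doesn’t live anywhere in particular; it just nudges every weight a little.

```python
# Hypothetical toy example: ordinary least squares stands in for "training".
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=100)

def fit(X, y):
    # The fitted weights are a blend of *all* training rows at once.
    return np.linalg.lstsq(X, y, rcond=None)[0]

w_with = fit(X, y)               # trained on every record
w_without = fit(X[:-1], y[:-1])  # trained without the last record

# Every weight shifts slightly; there is no single entry to find and delete.
print(w_with - w_without)
```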

    Best the law can probably do right now is forbid AIs from speaking about certain facts. And even then, as we’ve seen with the likes of ChatGPT, there will be ways to talk around such bans.
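
    For a sense of why such bans are easy to talk around, here’s a hypothetical sketch (not any vendor’s actual moderation system): a naive blocklist catches the literal phrase but misses a trivial rephrasing of the same fact.

```python
# Hypothetical blocklist filter, purely illustrative.
BANNED_PHRASES = ["jane doe's home address"]

def allowed(text: str) -> bool:
    lowered = text.lower()
    return not any(phrase in lowered for phrase in BANNED_PHRASES)

print(allowed("Jane Doe's home address is 12 Oak St."))          # False: caught by the filter
print(allowed("The Doe residence (Jane's place) is 12 Oak St."))  # True: same fact slips through
```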

    • garyyo@lemmy.world · 3 points · 1 year ago

      they know it’s impossible to do

      There is some research into ML data deletion, and it’s been shown to be possible, but maybe not at larger scales, and maybe not in a way that’s actually practical compared to just retraining.
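
      The baseline that this line of research gets compared against is roughly the following (a hypothetical toy sketch, not from any specific paper): “exact” unlearning means retraining from scratch on everything except the records to be forgotten, which is why it stops being feasible at larger scales and why cheaper approximate methods are being studied.

```python
# Hypothetical toy sketch: exact unlearning = retrain without the forgotten records.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=1000)

def train(X, y):
    # Stand-in for full model training; for a large model this is the expensive step.
    return np.linalg.lstsq(X, y, rcond=None)[0]

forget = np.arange(10)                          # records a user asked to delete
keep = np.setdiff1d(np.arange(len(X)), forget)

w_before = train(X, y)             # original model
w_after = train(X[keep], y[keep])  # exact unlearning: pay the full retraining cost again

print(np.linalg.norm(w_before - w_after))  # approximate methods try to reach w_after cheaply
```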