A.I.’s un-learning problem: Researchers say it’s virtually impossible to make an A.I. model ‘forget’ the things it learns from private user data

I’m rather curious to see how the EU’s privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)

  • AWittyUsername@lemmy.world
    1 year ago

    Much like DLLs exist for compiled binary executables, could we not have modular AI training data? Then only a small chunk would need to be relearned at a time.

    Just throwing this into the void here.

    • SGforce@lemmy.ca
      1 year ago

      Nah, it’s too much like how a lobotomy works. Even taking a small chunk of your brain might have huge impacts.

    • Aceticon@lemmy.world
      1 year ago

      The difference between having or not having something in the training set of a neural network shows up as different values for floating-point weights spread all over the network and, worse, those differences are just as likely to be tiny as they are to be massive.

      Or, to give a decent metaphor: it would be like trying to remove a specific egg from a bowl of scrambled eggs (the sketch below illustrates the point).
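
      A minimal, purely illustrative sketch of that point (the dataset, architecture, and hyperparameters below are invented for demonstration and have nothing to do with any real model): train the same tiny network twice from identical initial weights, once with and once without a single record, then compare the resulting weights.

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 10))            # synthetic stand-in for "user data"
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

      def train(X, y, steps=500, lr=0.1, seed=42):
          """One-hidden-layer net, plain gradient descent, fixed initialisation."""
          r = np.random.default_rng(seed)       # identical starting weights for both runs
          W1 = r.normal(size=(10, 16)) * 0.1
          W2 = r.normal(size=(16, 1)) * 0.1
          for _ in range(steps):
              h = np.tanh(X @ W1)
              p = 1 / (1 + np.exp(-(h @ W2)))   # sigmoid output
              g = (p - y[:, None]) / len(X)     # gradient of cross-entropy wrt logits
              gW2 = h.T @ g
              gW1 = X.T @ ((g @ W2.T) * (1 - h ** 2))
              W1 -= lr * gW1
              W2 -= lr * gW2
          return W1, W2

      # Full training set vs. the same set with one record removed.
      W1_a, W2_a = train(X, y)
      W1_b, W2_b = train(X[1:], y[1:])

      diff = np.abs(np.concatenate([(W1_a - W1_b).ravel(), (W2_a - W2_b).ravel()]))
      print(f"weights that changed: {np.count_nonzero(diff > 0)} of {diff.size}")
      print(f"smallest / largest change: {diff.min():.2e} / {diff.max():.2e}")

      Essentially every weight ends up slightly (or not so slightly) different, and there is no single parameter you can zero out to “forget” that one record, which is the scrambled-egg problem in miniature.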