An Analysis of DeepMind's 'Language Modeling Is Compression' Paper
  • AbouBenAdhem@lemmy.world · 1 year ago

    First: maybe what we consider an “association” is actually an indicator that our brains are using the same internal tokens to store/compress the memories.
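
    A quick, purely illustrative way to see that intuition in Python, with zlib standing in for whatever code the brain actually uses: two memories that share structure compress better together than apart, because the shared tokens only need to be stored once.

    ```python
    import zlib

    # Two overlapping "memories"; zlib is just a stand-in codec.
    m1 = b"walked the dog along the river at sunset " * 20
    m2 = b"walked the cat along the canal at sunrise " * 20

    together = len(zlib.compress(m1 + m2))
    apart = len(zlib.compress(m1)) + len(zlib.compress(m2))
    print(together, apart)  # together < apart: shared structure is stored once
    ```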

    But what I was thinking of specifically is narrative memories: our brains don’t store them frame-by-frame like video; rather, they probably store only the key elements and use their predictive ability to extrapolate the omitted ones on demand.
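
    A minimal sketch of that keyframe-plus-prediction idea (the names and the predictor are hypothetical; this is the video-codec analogy, nothing from the paper itself):

    ```python
    # Toy "keyframe" compressor: store a frame only when the predictor
    # fails to reproduce it; everything else is re-derived on demand.
    def compress(frames, predict, tol=1e-9):
        kept = {0: frames[0]}              # always keep the first frame
        last = frames[0]
        for i, frame in enumerate(frames[1:], start=1):
            guess = predict(last)
            if abs(guess - frame) > tol:   # prediction failed: store it
                kept[i] = frame
                last = frame
            else:                          # prediction succeeded: omit it
                last = guess
        return kept

    def decompress(kept, n, predict):
        out, last = [], None
        for i in range(n):
            last = kept[i] if i in kept else predict(last)
            out.append(last)
        return out

    frames = [float(i) for i in range(10)]   # a perfectly predictable "memory"
    kept = compress(frames, predict=lambda x: x + 1.0)
    assert decompress(kept, len(frames), lambda x: x + 1.0) == frames
    assert len(kept) == 1                    # only one frame had to be stored
    ```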

    • InvertedParallax@lemm.ee · 1 year ago

      No, because our brains also use hierarchical activation for association, which is why, if we’re talking about bugs and I say “I got a B”, you assume it’s a stinging insect, not a passing grade.

      If it were simple word2vec, we wouldn’t have that additional means of noise suppression.
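
      To make the contrast concrete, here’s a toy in Python (vectors made up for illustration; a real model learns these): a word2vec-style lookup returns one fixed vector for “B” no matter the context, while even a crude context-gated lookup disambiguates it the way hierarchical activation would.

      ```python
      import numpy as np

      # Made-up 2-d "meaning" vectors: axis 0 ≈ school, axis 1 ≈ insects.
      EMB = {
          "B":     np.array([0.5, 0.5]),   # ambiguous between the two senses
          "bee":   np.array([0.0, 1.0]),
          "grade": np.array([1.0, 0.0]),
          "bugs":  np.array([0.1, 0.9]),
          "exam":  np.array([0.9, 0.1]),
      }

      def static_meaning(word, context):
          return EMB[word]                 # word2vec-style: context is ignored

      def contextual_meaning(word, context):
          # Crude stand-in for hierarchical activation: the surrounding
          # topic pulls the ambiguous vector toward one of its senses.
          topic = np.mean([EMB[w] for w in context], axis=0)
          return 0.5 * EMB[word] + 0.5 * topic

      def closest_sense(vec):
          senses = {"insect": EMB["bee"], "grade": EMB["grade"]}
          return max(senses, key=lambda s: vec @ senses[s])

      print(closest_sense(static_meaning("B", ["bugs"])))      # tie: picks arbitrarily
      print(closest_sense(contextual_meaning("B", ["bugs"])))  # -> insect
      print(closest_sense(contextual_meaning("B", ["exam"])))  # -> grade
      ```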