Reddit changes have blocked all search engines except Google amid AI 'misuse' [U] - eviltoast
  • Even_Adder@lemmy.dbzer0.com · 4 months ago

    It sounds a lot like this quote from Andrej Karpathy:

    Turns out that LLMs learn a lot better and faster from educational content as well. This is partly because the average Common Crawl article (internet pages) is not of very high value and distracts the training, packing in too much irrelevant information. The average webpage on the internet is so random and terrible it’s not even clear how prior LLMs learn anything at all.

    • vxx@lemmy.world · 4 months ago (edited)

      So it will end in a downward spiral: the AI starts learning from AI-written articles, which are used to write new articles, which the AI learns from again, and so on…

      • Even_Adder@lemmy.dbzer0.com · 4 months ago

        As long as there’s supervision during training, which there always will be, this isn’t really a problem. It just shows how bad things can get if you train purely on generated content.
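        The “downward spiral” described above is often called model collapse, and it can be sketched with a toy simulation. The setup below is an illustrative assumption, not anything from the thread: each “generation” fits a Gaussian to the previous generation’s samples and then produces the data the next generation trains on, standing in for a model trained on its own output.

        ```python
        import numpy as np

        # Toy model-collapse loop: fit a Gaussian to the previous generation's
        # output, then sample the next generation's training data from the fit.
        # (Hypothetical simplification -- real LLM training is far more complex.)
        rng = np.random.default_rng(0)
        data = rng.normal(0.0, 1.0, size=1000)  # generation 0: "human" data

        sigmas = []
        for generation in range(50):
            mu, sigma = data.mean(), data.std()      # fit the model to the data
            sigmas.append(sigma)
            data = rng.normal(mu, sigma, size=1000)  # next model trains on samples

        # Estimation error compounds generation over generation: the fitted
        # spread drifts away from the true value of 1.0, and the tails of the
        # original distribution are progressively lost.
        print(f"gen 0 sigma: {sigmas[0]:.3f}, gen 49 sigma: {sigmas[-1]:.3f}")
        ```

        Each refit only ever sees a finite sample of the previous model’s output, so rare tail events get dropped and estimation noise accumulates; that loss of diversity is the mechanism behind the spiral. Mixing fresh human-curated data back in at each step, as the supervision point above suggests, is what breaks the loop.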