wow. sensible - eviltoast
  • DumbAceDragon@sh.itjust.works · edited · 5 months ago

    What these people don’t realize is you’re never gonna get AGI by just feeding a machine an infinite amount of raw data.

      • David Gerard@awful.systems M · 5 months ago

        There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel “Don’t Create The Torment Nexus” was nonsense. We shouldn’t be making policy decisions based off of that.

        wild

      • Soyweiser@awful.systems · 5 months ago

        Yes, we know (there are papers about it) that for LLMs, every increase in capabilities requires exponentially more training data. But don’t worry, we’ve only consumed half the world’s data to train LLMs, still a lot of places to go ;).
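
        A minimal back-of-the-envelope sketch of that scaling point, assuming a power-law loss curve L(D) = A / D^alpha (the constants here are purely illustrative, not taken from any paper): each fixed drop in loss costs an ever-larger multiple of training data.

        ```python
        # Illustrative only: assume loss falls as a power law in training data,
        # L(D) = A / D**alpha, and see how much data each fixed loss drop costs.
        A, alpha = 10.0, 0.3  # made-up constants, chosen for readability

        def data_needed(loss: float) -> float:
            """Training data D required to reach `loss` under L(D) = A / D**alpha."""
            return (A / loss) ** (1 / alpha)

        prev = None
        for loss in [2.0, 1.8, 1.6, 1.4, 1.2]:
            d = data_needed(loss)
            note = "" if prev is None else f"  ({d / prev:.1f}x the previous step)"
            print(f"loss {loss:.1f} -> ~{d:,.0f} data units{note}")
            prev = d
        ```

        The multiplier per step keeps growing, which is the "exponentially more data" complaint in miniature.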