wow. sensible - eviltoast
  • HeckGazer@programming.dev · 50 points · 5 months ago

    Oh ez, that’s only 17 orders of magnitude!

    If we managed an optimistic pace of doubling every year, that’d only take… about 56 years (17 orders of magnitude is roughly 56 doublings). The last few survivors on desert world can ask it if it was worth it
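
    A quick back-of-the-envelope check of that number, assuming the 10^17 factor above and one doubling per year (illustrative only):

    ```python
    import math

    # 17 orders of magnitude = a factor of 10**17 more compute.
    factor = 10 ** 17

    # How many doublings cover that factor?
    doublings = math.log2(factor)  # ≈ 56.5

    # At one optimistic doubling per year, that's also the number of years.
    print(f"{doublings:.1f} doublings ≈ {doublings:.0f} years at one doubling per year")
    ```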

    • Eiim@lemmy.blahaj.zone · 7 points · 5 months ago

      Rather amusing prediction that despite the obscene amount of resources being spent on AI compute already, it’s apparently reasonable to expect to spend 1,000,000x that in the “near future”.

  • DumbAceDragon@sh.itjust.works · 46 points · 5 months ago

    What these people don’t realize is you’re never gonna get AGI by just feeding a machine an infinite amount of raw data.

      • David Gerard@awful.systems · 25 points · 5 months ago

        There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel “Don’t Create The Torment Nexus” was nonsense. We shouldn’t be making policy decisions based off of that.

        wild

      • Soyweiser@awful.systems · 20 points · 5 months ago

        Yes, we know (there are papers about it) that for LLMs, every increase in capability needs exponentially more training data. But don’t worry, we’ve only consumed half the world’s data to train LLMs, still a lot of places to go ;).
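
        For anyone curious about the shape of that claim: the scaling-law papers fit loss as a power law in dataset size, so each equal step of improvement multiplies the data bill. A toy sketch with made-up constants (not fitted values):

        ```python
        # Toy power-law data scaling: loss(D) ≈ A / D**alpha.
        # A and alpha are invented for illustration only.
        A, alpha = 1.0, 0.1

        def tokens_needed(target_loss: float) -> float:
            # Invert loss = A / D**alpha  →  D = (A / loss) ** (1 / alpha)
            return (A / target_loss) ** (1 / alpha)

        for loss in (0.5, 0.4, 0.3, 0.2):
            print(f"target loss {loss}: ~{tokens_needed(loss):.2e} tokens")
        ```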

    • Soyweiser@awful.systems · 17 points · 5 months ago

      You don’t understand, after we invent god AGI all our problems are solved. Now step into the computroniuminator, we need your atoms for more compute.

    • someacnt_@lemmy.world · 8 points · 5 months ago

      Yeah, I don’t see why people are so blind on this. Computation is energy-intensive, and we have yet to optimize it for energy. Yet, all the hopes…

      • DivineDev@kbin.run · 12 points · 5 months ago

        We do optimize, it’s just that when you cut the energy per computation in half, you do twice the computations to iterate faster instead of using half the energy.
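
        The rebound in a few lines (numbers made up):

        ```python
        # Illustrative rebound: efficiency doubles, workload doubles, bill unchanged.
        joules_per_op, ops = 2e-9, 1e18
        print("before:", joules_per_op * ops, "J")  # 2e9 J

        joules_per_op /= 2  # each computation now costs half the energy...
        ops *= 2            # ...so twice as many get scheduled to iterate faster
        print("after: ", joules_per_op * ops, "J")  # still 2e9 J
        ```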

  • froztbyte@awful.systems · 26 points · 5 months ago

    that looks like someone used win9x mspaint to make a flag, fucked it up, and then fucked it up even more on the saving throw

    • sinedpick@awful.systems · 10 points · 5 months ago

      did you even experience a single conscious thought while writing that? what fucking potential are you referring to? generating reams of scam messages and Internet spam? automating the only jobs that people actually enjoy doing? seriously, where is the thought?