From Stockholm Impact/Watch 2023 here's a must watch:
  • paradrenasite@lemmy.ca · 1 year ago

    I admit I’m on the verge of losing sight of the overall point of this thread. But thinking about it more, I will add that looking at the actual tech curve itself may not be that important, depending on where you draw the line between technology and capability. For example, it may not matter that the increase in transistor density is slowing down when total global compute keeps increasing exponentially. Further, how would quantum computing factor into this? (The movement in the cryptography space suggests that a post-quantum world is imminent.) On the topic of LLMs, would it matter if those stagnate while the ability of companies and states to manipulate us and drown us in misinformation keeps growing exponentially? And how would the advent of AGI factor into this? In some ways that would be the last invention we have to make ourselves. I guess the point is that some advancements, even incremental ones, seem to have an outsized effect on our ability to impact the world around us.
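    To make the density-versus-total-compute point concrete with numbers I am just making up: even if per-chip improvement slows to roughly 10% a year, total compute keeps compounding as long as the installed base of chips grows faster, say 30% a year. A quick Python sketch of that toy model, with all growth rates invented purely for illustration:

    ```python
    # Toy model: total compute = per-chip performance x number of chips deployed.
    # All growth rates below are invented for illustration, not real industry figures.

    per_chip = 1.0            # relative per-chip performance
    installed = 1.0           # relative size of the installed base of chips
    density_growth = 0.10     # assume per-chip improvement has slowed to ~10%/yr
    deployment_growth = 0.30  # assume the installed base still grows ~30%/yr

    for year in range(10):
        total = per_chip * installed
        print(f"year {year}: per-chip x{per_chip:.2f}, installed x{installed:.2f}, total x{total:.2f}")
        per_chip *= 1 + density_growth
        installed *= 1 + deployment_growth
    ```

    Per-chip performance only a bit more than doubles over that run, while total compute grows by more than an order of magnitude, which is the sense in which the headline curve can flatten without the aggregate one doing so.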

    Anyway, I read the article you linked and enjoyed it. It reminded me a bit of Meditations On Moloch, which also tries to explain the behavior of civilizations; a good read if you haven’t seen it yet.

    • jsdz@lemmy.ml · 1 year ago

      That there’s a difference between transistor density and total compute capacity does seem crucial. We can and often do have more microprocessors in the world, more automobiles, more coal mines, more injection-molded plastic, more batteries, more bombs, more of everything, without any improvement in technology whatsoever. Just more of the same, mass-produced by the ever-expanding machine. If there does exist some curve that measures overall tech progress in a useful way, the curve that measures its applications and their effects is necessarily offset from it in time and need not follow its shape. It seems natural to expect our thrust into ecological overshoot to continue for some time after the big impulse that pushed us into it has begun to fade. It’s much like the “macabre whale analogy” in that Scott Alexander piece, in which we’re enjoying the results of a whalefall.

      As long as resources aren’t scarce enough to lock us in a war of all against all, we can do silly non-optimal things – like art and music and philosophy and love – and not be outcompeted by merciless killing machines most of the time.

      On a more optimistic note, I think history demonstrates that art, music, and love are not so easily done away with, although I’m not so sure about philosophy. Scarcity is going to make a comeback, and the results will be as unpredictable as anything people imagine AGI might do if it ever actually arrives. I suspect we should ideally aim to make some changes to what is normally thought of as “human nature” in order to avoid the worst outcomes, but they need not be larger than changes that have happened in the past. But who the hell knows, really.