LessWrong classics: “A Bayesian superintelligence, hooked up to a webcam of a falling apple, would invent general relativity by the third frame”
  • self@awful.systemsM · 1 year ago

    But in this world there are careful thinkers, of great prestige as well, and they are not so sure. “There are easier ways to send a message,” they post to their blogs

    please destroy this gate address, it leads to Reply Guy Earth

    • self@awful.systemsM · 1 year ago

      also, sincerely, can anyone explain to me what’s good about Yud’s writing? this shit is structured exactly like a Goosebumps short, except instead of being written by a likeable author targeting grade schoolers it’s written by some asshole who loves using concepts he doesn’t understand, targeting other assholes who don’t understand fucking anything because all their knowledge got filtered through Yud

      • froztbyte@awful.systems · 1 year ago

        I don’t think there’s anything good about the writing, but a few things stand out in terms of the mechanics employed and the effect they appear to be aiming for:

        • (bad) storyteller style (nerds love 'em some stories as much as the next, even those who think they don’t)
        • touching on sufficiently many topics (“oh wow, he’s thought about this so hard”)
        • going just far enough into detail to convince you there’s some kind of deeper aspect to it (“wow, he knows so much about this”)

        even this horrible essay pulled the infomercial “but wait, there’s more!” move at least 5 times, each one a terrible Plot Twist because he can’t figure out how to layer his story devices any better

        • David Gerard@awful.systemsOPM · edited · 1 year ago

          feel free!

          edit: I read through the sequences three times: once on the site, once as an epub and once reading every post on LW main from 2007-2011 in order of posting. I can state that I have Done The Fucking Reading. The sequences finished in 2009, then you can see the site get weirder as people riff off them, up to the basilisk post in mid-2010. At that point everyone noticeably cools it on the weirdness and the site’s haunted by a post nobody will talk out loud about. Then HPMOR takes off and the site has a new recruiting point.

        • self@awful.systemsM · 1 year ago

          oh absolutely, and check out the ridiculous amount of ideological priming Yud does in this post. one example:

          (Oh, and every time someone in this world tries to build a really powerful AI, the computing hardware spontaneously melts. This isn’t really important to the story, but I need to postulate this in order to have human people sticking around, in the flesh, for seventy years.)

          (and it’s very funny to me that a number of comments are “oh I had no idea this was about AI until the end…!”, how young are these kids you’re programming, Yud?)

          in general, the ridiculous amount of slog required going in, combined with the regular priming, reminds me a lot of another sci-fi-flavored cult I know, if you get my meaning

        • froztbyte@awful.systems · 1 year ago

          oh yeah, the complexity and effort are almost certainly part of the point: people don’t like to admit they got swindled or wasted their time, and ostensibly clever people are just as capable of falling victim to this as anyone else