LessWrong classics: “A Bayesian superintelligence, hooked up to a webcam of a falling apple, would invent general relativity by the third frame”
    • David Gerard@awful.systems (OP) · 1 year ago

      feel free!

      edit: I read through the sequences three times: once on the site, once as an epub and once reading every post on LW main from 2007-2011 in order of posting. I can state that I have Done The Fucking Reading. The sequences finished in 2009, then you can see the site get weirder as people riff off them, up to the basilisk post in mid-2010. At that point everyone noticeably cools it on the weirdness and the site’s haunted by a post nobody will talk out loud about. Then HPMOR takes off and the site has a new recruiting point.

    • self@awful.systems · 1 year ago

      oh absolutely, and check out the ridiculous amount of ideological priming Yud does in this post. one example:

      (Oh, and every time someone in this world tries to build a really powerful AI, the computing hardware spontaneously melts. This isn’t really important to the story, but I need to postulate this in order to have human people sticking around, in the flesh, for seventy years.)

      (and it’s very funny to me that a number of comments are “oh I had no idea this was about AI until the end…!”, how young are these kids you’re programming, Yud?)

      in general, the ridiculous amount of slog going in combined with regular priming reminds me a lot of another sci-fi flavored cult I know, if you get my meaning

    • froztbyte@awful.systems · 1 year ago

      oh yeah, the complexity and effort are almost certainly part of the point - people don’t like to admit they got swindled or wasted their time, and ostensibly-clever people are just as capable of falling victim to this as anyone else