Does anyone here know what exactly happened to lesswrong to become so cult-y? - eviltoast

Does anyone here know what exactly happened to lesswrong to become so cult-y? I had never seen or heard anything about it for years; back in my day it was seen as that funny website full of strange people posting weird shit about utilitarianism, nothing cult-y, just weird. The article on TREACLES and this sub’s mentions of lesswrong made me very curious about how it went from people talking out of their ass for the sheer fun of “thought experiments” to a straight-up doomsday cult.
The one time I read lesswrong was probably in 2008 or so.

  • TerribleMachines@awful.systems · 1 year ago

    Only half joking: there was this one fanfic you see…

    Mainly I don’t think there was any one inciting incident beyond its creation: Yud was a one-man cult way before LW, and the sequences actively pushed all the cultish elements required to lose touch with reality. (Fortunately, my dyslexic ass only got as far as the earlier bits he mostly stole from other people rather than the really crazy stuff.)

    There was definitely a step-change around the time CFAR was created; it was basically a recruitment mechanism for the cult and part of the reason I got anywhere physically near those rubes myself. An organisation made to help people be more rational seemed like a great idea—except it literally became EY/MIRI’s personal sockpuppet. They would get people together in these fancy-ass mansions for their workshops and then tell them nothing other than AI research mattered. I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky. I don’t think it’s a coincidence that most of the really crazy cult stuff I’ve heard about happened after then.

    Not that bad stuff didn’t happen before either.

    • skillissuer@discuss.tchncs.de · 1 year ago

      I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky

      the real AI doom is Eliezer cloning facility