Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 23 June 2024 - eviltoast

Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post - there's no quota for posting and the bar really isn't that high.

The post Xitter web has spawned soo many "esoteric" right wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

  • BigMuffin69@awful.systems · 22 points · 6 months ago

    https://xcancel.com/AISafetyMemes/status/1802894899022533034#m

    The same pundits have been saying "deep learning is hitting a wall" for a DECADE. Why do they have ANY credibility left? Wrong, wrong, wrong. Year after year after year. Like all professional pundits, they pound their fist on the table and confidently declare AGI IS DEFINITELY FAR OFF and people breathe a sigh of relief. Because to admit that AGI might be soon is SCARY. Or it should be, because it represents MASSIVE uncertainty. AGI is our final invention. You have to acknowledge the world as we know it will end, for better or worse. Your 20 year plans up in smoke. Learning a language for no reason. Preparing for a career that won't exist. Raising kids who might just… suddenly die. Because we invited aliens with superior technology we couldn't control. Remember, many hopium addicts are just hoping that we become PETS. They point to Ian Banks' Culture series as a good outcome… where, again, HUMANS ARE PETS. THIS IS THEIR GOOD OUTCOME. What's funny, too, is that noted skeptics like Gary Marcus still think there's a 35% chance of AGI in the next 12 years - that is still HIGH! (Side note: many skeptics are butthurt they wasted their career on the wrong ML paradigm.) Nobody wants to stare in the face the fact that 1) the average AI scientist thinks there is a 1 in 6 chance we're all about to die, or that 2) most AGI company insiders now think AGI is 2-5 years away. It is insane that this isn't the only thing on the news right now. So… we stay in our hopium dens, nitpicking The Latest Thing AI Still Can't Do, missing forests from trees, underreacting to the clear-as-day exponential. Most insiders agree: the alien ships are now visible in the sky, and we don't know if they're going to cure cancer or exterminate us. Be brave. Stare AGI in the face.

    This post almost made me crash my self-driving car.

    • self@awful.systems · 20 points · 6 months ago

      Remember, many hopium addicts are just hoping that we become PETS. They point to Ian Banks' Culture series as a good outcome… where, again, HUMANS ARE PETS. THIS IS THEIR GOOD OUTCOME.

      I am once again begging these e/acc fucking idiots to actually read and engage with the sci-fi books they keep citing

      but who am I kidding? the only way you come up with a take as stupid as "humans are pets in the Culture" is if your only exposure to the books is having GPT summarize them

    • maol@awful.systems · 19 points · 6 months ago

      It's mad that we have an actual existential crisis in climate change (temperature records broken across the world this year) but these cunts are driving themselves into a frenzy over something that is nowhere near as pressing or dangerous. Oh, people dying of heatstroke isn't as glamorous? Fuck off

    • Mii@awful.systems · 16 points · 6 months ago

      Seriously, could someone gift this dude a subscription to spicyautocompletegirlfriends.ai so he can finally cum?

      One thing that's crazy: it's not just skeptics, virtually EVERYONE in AI has a terrible track record - and all in the same OPPOSITE direction from usual! In every other industry, due to the Planning Fallacy etc, people predict things will take 2 years, but they actually take 10 years. In AI, people predict 10 years, then it happens in 2!

      ai_quotes_from_1965.txt

    • Soyweiser@awful.systems · 14 points · 6 months ago

      humans are pets

      Actually not what is happening in the books. I get where they are coming from, but this requires redefining the word "pet" in such a way that it becomes a useless word.

      The Culture series really breaks the brains of people who can only think in hierarchies.

    • gerikson@awful.systems · 12 points · 6 months ago

      If you've been around the block like I have, you've seen reports about people joining cults to await spaceships, people preaching that the world is about to end &c. It's a staple trope in old New Yorker cartoons, where a bearded dude walks around with a billboard saying "The End is nigh".

      The tech world is growing up, and a new internet-native generation has taken over. But everyone is still human, and the same pattern-matching that leads a 19th century Christian to discern when the world is going to end by reading Revelation will lead a 25 year old tech bro steeped in "rationalism" to decide that spicy autocomplete is the first stage of The End of the Human Race. The only difference is the inputs.