Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 June 2024

Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up, and if I can’t escape them, I would love to sneer at them.

  • Soyweiser@awful.systems · 8 points · 7 months ago

    Same with when they added some features to the UI of GPT with the GPT-4o chatbot thing. Don’t get me wrong, the tech to do real-time audio processing etc. is impressive (but it has nothing to do with LLMs; it’s a different technique), but it certainly is very much smoke and mirrors.

    I recall when they taught developers to be careful about shipping small UI changes without backend changes, because to non-insiders a UI change feels like a massive change even while the backend still needs a lot of work (so the client thinks you’re 90% done when only 10% is done). Now half the tech people get tricked by the same problem.

    • ebu@awful.systems · 8 points · 7 months ago

      i suppose there is something more “magical” about having the computer respond in realtime, and maybe it’s that “magical” feeling that’s getting so many people to just kinda shut off their brains when creators/fans start wildly speculating on what it can/will be able to do.

      how that manages to override people’s perceptions of their own experiences happening right in front of them still boggles my mind. they’ll watch a person point out that it gets basic facts wrong or speaks incoherently, and assume the fault lies with the person for not having the true vision or what have you.

      (and if i were to channel my inner 2010s reddit atheist for just a moment, it feels distinctly like the way people talk about the Christian Rapture, where the flaws and issues you’re pointing out in the system get spun as personal flaws. you aren’t observing basic facts about the system making errors; you are actively in ego-preserving denial about the “inevitability of ai”)