LLMs can’t reason — they just crib reasoning-like steps from their training data
  • V0ldek@awful.systems · 1 month ago

    This has been said multiple times but I don’t think it’s possible to internalize because of how fucking bleak it is.

    The VC/MBA class thinks all communication can be distilled into saying the precise string of words that triggers the stochastically desired response in the consumer. Conveying ideas or information is not the point. This is why ChatGPT seems like the holy grail to them: it effortlessly¹ generates mountains of corporate slop that carry no actual meaning. It’s all form and no substance, because those people – their entire existence, the essence of their cursed dark souls – have no substance.

    ¹ batteries not included

    • Optional@lemmy.world · 1 month ago

      I think you’re right. But they’re wrong. And only the chowderheads who don’t interact with customers or service personnel would believe that crap. Now, that’s not to say they can’t raise a generation that does believe that crap.

      Hence the bleakness.

      • V0ldek@awful.systems · 1 month ago

        I am so cynical at this point that I am fully bought into the idea that these chowderheads don’t even interact with reality, just with the PowerPoint- and Jira-driven shadows on the wall.