LLMs can’t reason — they just crib reasoning-like steps from their training data - eviltoast
  • froztbyte@awful.systems · 1 month ago

    I know someone who was hired (around the turn of the century) because they knew how to xml with a certain kind of then-important big systems api

    the stories I’ve heard from there are hilarious

    but it is also incredibly easy to create proprietary semantics with

    christ the shit I’ve seen with network vendors… shibboleth NETCONF/YANG. advance warning; abyss grade 6+
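
    For anyone who hasn’t had the pleasure: XML namespaces are the mechanism that makes the “proprietary semantics” trick so easy. A minimal sketch of the pattern (the `acme` namespace and its `turbo-mode` leaf are made up for illustration; the `ietf-interfaces` namespace is the standard one from RFC 8343):

    ```python
    # Sketch: a vendor bolting a proprietary leaf onto an otherwise
    # standards-based NETCONF/YANG-style config via an XML namespace.
    # "urn:example:acme-extensions" and <acme:turbo-mode> are hypothetical.
    import xml.etree.ElementTree as ET

    config = """
    <interface xmlns="urn:ietf:params:xml:ns:yang:ietf-interfaces"
               xmlns:acme="urn:example:acme-extensions">
      <name>eth0</name>
      <enabled>true</enabled>
      <!-- vendor-only knob, invisible to standards-only tooling -->
      <acme:turbo-mode>maximum</acme:turbo-mode>
    </interface>
    """

    ns = {
        "if": "urn:ietf:params:xml:ns:yang:ietf-interfaces",
        "acme": "urn:example:acme-extensions",
    }
    root = ET.fromstring(config)
    print(root.find("if:name", ns).text)          # standard leaf
    print(root.find("acme:turbo-mode", ns).text)  # proprietary leaf
    ```

    Tooling that only knows the standard model silently ignores the vendor namespace, so every box grows its own dialect. Hence the abyss.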