LLMs can’t reason — they just crib reasoning-like steps from their training data
  • sc_griffith@awful.systems · 1 month ago

    guy who totally gets what these words mean: “an llm simply encodes the semantics into the vectors”

    • self@awful.systems · 1 month ago

      all you gotta do is, you know, ground the symbols, and as long as you’re writing enough Lisp that should be sufficient for GAI