ChatGPT would be so much more useful and trustworthy if it were able to admit that it doesn't know an answer. - eviltoast

Small rant: basically the title. If, instead of answering every question, it said when it doesn't know the answer, it would be trustworthy.

  • FaceDeer@fedia.io · 5 months ago

    And sometimes that’s exactly what I want, too. I use LLMs like ChatGPT when brainstorming and fleshing out fictional scenarios for tabletop roleplaying games, for example, and in those situations coming up with plausible nonsense is specifically the job at hand. I wouldn’t want to say “ChatGPT, I need a description of what the interior of a wizard’s tower is like” and get the response “I don’t know what the interior of a wizard’s tower is like.”

    • mozz@mbin.grits.dev · 5 months ago

      At one point I messed around with a lore generator that would chop up sections of “The Dungeon Alphabet” and “Fire on the Velvet Horizon,” along with some other stuff, feed random sections of them into the LLM for inspiration, and then ask it to lay out a little map. It pretty reliably came up with all kinds of badass stuff.
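
      For anyone curious, the chop-and-sample approach described above can be sketched in a few lines. This is a hypothetical reconstruction, not mozz's actual generator: the chunk size, prompt wording, and function names are all made up, and the sampled excerpts here stand in for scanned book text.

      ```python
      import random
      import textwrap

      def chop(text, chunk_size=400):
          """Split a source text into roughly chunk_size-character sections."""
          return textwrap.wrap(text, chunk_size)

      def build_prompt(sources, n_chunks=3, seed=None):
          """Sample random sections across the source books and wrap them
          in an inspiration prompt for the LLM (prompt wording is illustrative)."""
          rng = random.Random(seed)
          pool = [chunk for text in sources for chunk in chop(text)]
          picks = rng.sample(pool, min(n_chunks, len(pool)))
          excerpts = "\n---\n".join(picks)
          return (
              "Use the following excerpts as inspiration for a dungeon "
              "location, then lay out a little map of it:\n" + excerpts
          )

      # Stand-in lore text; a real run would load scanned sections of the books.
      sources = [
          "A is for Altar, slick with old offerings and older regrets. " * 10,
          "The velvet horizon burns where the crawling city last fed. " * 10,
      ]
      prompt = build_prompt(sources, n_chunks=2, seed=1)
      ```

      The resulting `prompt` string would then be sent to whatever LLM API you use; the random sampling is what keeps each generation surprising.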