ChatGPT would be so much more useful and trustworthy if it were able to admit that it doesn't know an answer.

Small rant: basically, the title. Instead of answering every question, if it said it doesn't know the answer when it doesn't, it would be much more trustworthy.

  • KevonLooney@lemm.ee · 5 months ago

    It’s hard to say where exactly the responsibility sits for various LLM problems

    Uhh… it’s the designers, or maybe the QA people. If there are no QA people, it’s whatever project manager let it out of its cage.

    There are people behind these models. They don’t spring out of the ground fully formed.