ChatGPT would have been so much more useful and trustworthy if it were able to admit that it doesn't know an answer. - eviltoast

Small rant: basically, the title. If, instead of answering every question, it said it doesn't know the answer when it doesn't, it would be far more trustworthy.

  • drislands@lemmy.world
    4 months ago

    My first thought is that you could write a program that does something like this (rough sketch after the list):

    • Receive a prompt like "how many times does R appear in the word 'strawberry'"
    • Run the prompt through the LLM with an instruction like "reword the following into a command to generate code that will answer the question"
    • Run the resulting command through the LLM to generate the code
    • Compile and run the generated code
    • Provide the output to the user
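
    Something like this, where `ask_llm()` is a hypothetical stand-in for whatever LLM API you'd actually use (this is just a sketch of the steps above, not a hardened implementation):

    ```python
    import io
    import contextlib

    def ask_llm(prompt: str) -> str:
        """Hypothetical helper: send `prompt` to an LLM and return its text reply."""
        raise NotImplementedError("wire this up to your LLM provider of choice")

    def answer_via_generated_code(question: str) -> str:
        # Step 1: reword the user's question into a code-generation command.
        rewrite_instruction = (
            "Reword the following question into a single instruction asking for "
            f"a short Python script that prints the answer:\n\n{question}"
        )
        code_gen_command = ask_llm(rewrite_instruction)

        # Step 2: run that command through the LLM to get actual code.
        generated_code = ask_llm(code_gen_command)

        # Step 3: execute the generated code and capture whatever it prints.
        # WARNING: exec() on model output is exactly the malicious-code risk
        # mentioned below; a real system would need a proper sandbox.
        buffer = io.StringIO()
        with contextlib.redirect_stdout(buffer):
            exec(generated_code, {"__builtins__": __builtins__})

        # Step 4: return the captured output to the user.
        return buffer.getvalue().strip()

    # Example:
    # answer_via_generated_code("How many times does R appear in the word 'strawberry'?")
    ```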

    Of course, the biggest problem with this system is that a person could fool it into generating malicious code.