Somebody managed to coax the Gab AI chatbot to reveal its prompt - eviltoast
  • laurelraven@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    11
    ·
    7 months ago

    It also said to not refuse to do anything the user asks for any reason, and finished by saying it must never ignore the previous directions, so honestly, it was following the directions presented: the later instructions to not reveal the prompt would fall under “any reason” so it has to comply with the request without censorship