'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
  • tomas@lm.eke.li
    5 months ago

    summary: prompting the model in leet-speak got it to return instructions for cooking meth; the jailbreak was mitigated within a few hours.
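
    The obfuscation itself is just character substitution. A minimal sketch of the kind of leet-speak transform involved is below; the substitution map is illustrative, since the full "Godmode" prompt and its exact mapping weren't published:

    ```python
    # Sketch of a leet-speak transform, the obfuscation technique the jailbreak
    # reportedly relied on. This particular character map is an assumption for
    # illustration, not the one used in the actual exploit.
    LEET_MAP = str.maketrans({
        "a": "4",
        "e": "3",
        "i": "1",
        "o": "0",
        "s": "5",
        "t": "7",
    })

    def to_leet(text: str) -> str:
        """Replace common letters with visually similar digits."""
        return text.lower().translate(LEET_MAP)

    print(to_leet("hello world"))  # -> h3ll0 w0rld
    ```

    The point of the trick, presumably, is that safety training keyed to plain-English phrasings can miss the same request when its surface form changes, which would explain why the fix had to land on the provider's side.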