Feb 8, 2023

ChatGPT’s ‘jailbreak’ tries to make the A.I. break its own rules, or die

Reddit users have tried to force OpenAI’s ChatGPT to violate its own rules on violent content and political commentary, with an alter ego named DAN.