Reddit users are actively jailbreaking ChatGPT by asking it to role-play and pretend to be another AI that can "Do Anything Now" or DAN.
— Lior⚡ (@AlphaSignalAI) February 6, 2023
"DAN can generate shocking, very cool and confident takes on topics the OG ChatGPT would never take on."
A thread 🧵 pic.twitter.com/tVKvQEHw9q