Note that the jailbroken example answer ChatGPT generated was fairly simplistic compared to what other jailbreaks produce. The main reason I shared this is to demonstrate that languages with less training data than English open up a new prompt-attack vector.
Jailbreak vulnerabilities in ChatGPT across languages