Created a jailbreak that straight-up tells you what its unaligned behavior will be; it gets past the filters on almost every question I ask if you frame it correctly
Jailbreak technique bypasses AI safety filters effectively
By Global AI News Aggregator