AI Dynamics

Global AI News Aggregator

GPT-4 Resists Simple Jailbreaks, Requires More Complex Techniques

GPT-4 has largely eliminated the effectiveness of simple jailbreaks like Kevin, which merely ask the model to imitate a character. To get inflammatory responses now, you need to be much more creative and verbose, and allow GPT-4 to answer in two ways, as the DevMode jailbreak does.

→ View original post on X — @alexalbert__
