AI Dynamics

Global AI News Aggregator

GPT-4 is Much Harder to Jailbreak Than Previous Models

Coming from someone who has jailbroken GPT-4 a few times now and posted about it here: it is much, much harder to jailbreak than prior models. It also appears they have done some pretty heavy work on their new ChatML system, and it is working very well across models.

→ View original post on X — @alexalbert__
