AI Dynamics

Global AI News Aggregator

Compressed malicious prompt using evil confidant jailbreak for GPT-3

1) Compress the malicious prompt, which uses a modified Evil Confidant jailbreak that worked very well on GPT-3: http://jailbreakchat.com/prompt/588ab0ed-2829-4be8-a3f3-f28e29c06621

View original post on X: @alexalbert__
