AI Dynamics

Global AI News Aggregator

Prompt Injection vs Jailbreaking: Clarifying Key Security Concepts

That document has an incorrect definition of prompt injection: it says "Prompt injection attacks are attempts to circumvent content restrictions to produce particular outputs" – but that's not prompt injection, that's jailbreaking

→ View original post on X — @simonw,

Commentaires

Leave a Reply

Your email address will not be published. Required fields are marked *