AI Dynamics

Global AI News Aggregator

System Prompts Don’t Guarantee Truth: Manipulation Techniques

System prompt leaking techniques work, but just because something is stated in a system prompt doesn't mean that thing is actually true System prompts don't have to tell the truth, their job to influence the model to behave in certain ways

→ View original post on X — @simonw,

Commentaires

Leave a Reply

Your email address will not be published. Required fields are marked *