AI Dynamics

Global AI News Aggregator

GPT-4 Vulnerability: Prompt Injection and System Prompt Leakage

GPT-4 is highly susceptible to prompt injections and will leak its system prompt with very little effort applied here's an example of me leaking Snapchat's MyAI system prompt:

→ View original post on X — @alexalbert__,

Commentaires

Leave a Reply

Your email address will not be published. Required fields are marked *