GPT-4 is highly susceptible to prompt injections and will leak its system prompt with very little effort applied here's an example of me leaking Snapchat's MyAI system prompt:
GPT-4 Vulnerability: Prompt Injection and System Prompt Leakage
By
–
Global AI News Aggregator
By
–
GPT-4 is highly susceptible to prompt injections and will leak its system prompt with very little effort applied here's an example of me leaking Snapchat's MyAI system prompt:
Leave a Reply