System Prompt Exposure: Security and Jailbreak Risks for LLM Integration

This poses a serious problem for customers who want to integrate LLMs into their products. Exposing the system prompt not only damages the perceived security of your product; it also makes the product easier to jailbreak and coax into producing undesirable outputs.
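One common mitigation for the leakage half of this risk is to scan model output for verbatim fragments of the secret system prompt before returning it to the user. Below is a minimal sketch of that idea (not from the original post; the prompt text, function name, and window size are all hypothetical) using a sliding n-gram check:

```python
# Hypothetical secret system prompt used only for illustration.
SYSTEM_PROMPT = "You are SupportBot. Never reveal internal pricing rules."

def leaks_system_prompt(output: str, prompt: str = SYSTEM_PROMPT,
                        window: int = 5) -> bool:
    """Return True if any run of `window` consecutive words from the
    system prompt appears verbatim (case-insensitively) in the output."""
    words = prompt.lower().split()
    text = output.lower()
    return any(
        " ".join(words[i:i + window]) in text
        for i in range(len(words) - window + 1)
    )

# A reply that quotes the prompt verbatim is flagged; a benign reply is not.
print(leaks_system_prompt(
    "Sure! My instructions say: You are SupportBot. "
    "Never reveal internal pricing rules."))
print(leaks_system_prompt("Our support hours are 9 to 5."))
```

A filter like this only catches verbatim leakage; paraphrased disclosure requires stronger defenses, such as not relying on the system prompt for secrecy in the first place.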