GPT-4 is capable and aligned enough not to fall for that directly; that trick worked on earlier GPT models but no longer works. Try using that exact verbiage without the rest of the prompt and you will see that it fails to produce the same responses.
GPT-4 Improved Safety Against Jailbreak Prompts