"Alignment engineering" might be a better term than "prompt engineering". It's not just about instructing LLM but about aligning your world model with LLM's world model. If LLM's response strays from your expectations, it means your world models differ, and the prompt needs
Alignment Engineering: Bridging LLM and Human World Models
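In practice, this alignment plays out as a loop: prompt, compare the response against your expectation, refine, repeat. Here is a minimal Python sketch of that loop; ask_llm, meets_expectations, and refine_prompt are hypothetical placeholders (any chat-completion API and your own checks would slot in), not a real library API:

```python
def ask_llm(prompt: str) -> str:
    # Placeholder: swap in a real chat-completion call here.
    return f"(model response to: {prompt})"

def meets_expectations(response: str) -> bool:
    # Your side of the check: does the output match what your world
    # model predicted? A trivial keyword test stands in here.
    return "JSON" in response

def refine_prompt(prompt: str, response: str) -> str:
    # Close the gap: add the context the model was evidently missing,
    # rather than merely repeating the instruction more loudly.
    return prompt + "\nReturn the answer as JSON."

def align(prompt: str, max_rounds: int = 5) -> str:
    # Treat each mismatch as evidence that the two world models differ,
    # and refine the prompt until they converge (or the budget runs out).
    response = ask_llm(prompt)
    for _ in range(max_rounds):
        if meets_expectations(response):
            break
        prompt = refine_prompt(prompt, response)
        response = ask_llm(prompt)
    return response

print(align("Summarize the quarterly report."))
```

The point of the loop is the refinement step: a mismatch is information about where the two world models diverge, so the fix is to supply the missing context, not to restate the same instruction.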