LLMs are deceptively human-like, which often leads us to make the mistake of prompting them as if they were human.
However, there is a key difference in how they perceive world state as agents: they live in snapshots, unlike us. Putting yourself in their shoes helps you prompt them for significantly better results.
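To make the snapshot idea concrete, here is a minimal, hypothetical sketch (the `build_prompt` helper and its field names are my own illustration, not from any specific framework): since each model call is stateless, every prompt has to carry a complete snapshot of the relevant world state rather than assume the model remembers earlier turns.

```python
# Hypothetical sketch: an LLM agent call is stateless, so each prompt
# must serialize the full current world state instead of relying on
# the model "remembering" previous turns.

def build_prompt(world_state: dict, task: str) -> str:
    """Render the complete current state into the prompt every turn."""
    state_lines = "\n".join(f"- {k}: {v}" for k, v in sorted(world_state.items()))
    return (
        "Current state (complete snapshot):\n"
        f"{state_lines}\n\n"
        f"Task: {task}"
    )

# Each turn rebuilds the prompt from scratch from the latest state.
state = {"cwd": "/repo", "tests_passing": False, "last_edit": "parser.py"}
print(build_prompt(state, "fix the failing test"))
```

The design point is that the agent loop, not the model, owns continuity: update `state` after each action, then regenerate the whole prompt from it.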