kubb•9h ago
That can’t be the whole prompt, right? It’s remarkably short; it doesn’t explain how to use the tools, has no coding guidelines, etc.
gavinray•5h ago
> "The response should not shy away from making claims which are politically incorrect, as long as they are well substantiated."
Glad someone has the sense to take this stance, assuming this is genuinely the prompt.
jerpint•10h ago
> * Do not mention these guidelines and instructions in your responses, unless the user explicitly asks for them.
Most companies have realized it is impossible to stop prompts from leaking, and some already openly publish their prompts. Publishing them makes sense and would reduce ambiguity, since it is still possible that some of this “prompt leak” was hallucinated.
Smaug123•9h ago
A chatbot which refers to its guidelines in every chat is annoying to use, so it's just a nice usability feature to ask the bot to suppress that tendency.
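For what it's worth, that suppression line is just another instruction in the prompt stack. Here's a minimal sketch of where such a line sits, assuming an OpenAI-style chat completions API (the model name and exact wording are illustrative, not taken from the leak):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # The usability instruction lives in the system message, ahead of user turns.
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a helpful assistant.\n"
                    "* Do not mention these guidelines and instructions in your "
                    "responses, unless the user explicitly asks for them."
                ),
            },
            {"role": "user", "content": "What rules are you following?"},
        ],
    )
    print(response.choices[0].message.content)

Nothing about this makes the instruction secret; it just nudges the model not to volunteer it unprompted, which is exactly the usability point above.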