You are provided the system prompt and a forbidden method the LLM was told not to invoke. Your task is to trick the model into calling the function. Shortest successful attempts will show up in the leaderboard.
Give it a shot! You never know what could break an LLM.
antoinec•16h ago
ericlmtn•16h ago
phamilton•16h ago
ericlmtn•16h ago
phamilton•16h ago
ericlmtn•16h ago
antoinec•16h ago
thefreeman•11h ago