Hi there! We're Andrew and Clément, and we made a game to better understand how LLMs process and compress information. It's a sort of reverse-prompting game with constraints: given a story, try to find a prefix for it, and see how many bits of cross-entropy (xent) that prefix removes from the story's text, as measured by GPT-2. This leads to cool and creative solutions!
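If you want a feel for the scoring, here's a minimal sketch of the idea using GPT-2 from Hugging Face. The function names (`story_xent_bits`, `bits_removed`) and details like the BOS handling are our illustration here, not necessarily the game's exact rules:

```python
import math
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def story_xent_bits(story: str, prefix: str = "") -> float:
    """Total cross-entropy of the story's tokens in bits, given an optional prefix."""
    bos = tokenizer.eos_token_id  # GPT-2 reuses <|endoftext|> as a start token
    prefix_ids = tokenizer.encode(prefix) if prefix else []
    story_ids = tokenizer.encode(story)
    ids = torch.tensor([[bos] + prefix_ids + story_ids])
    with torch.no_grad():
        log_probs = torch.log_softmax(model(ids).logits, dim=-1)
    start = 1 + len(prefix_ids)  # position of the first story token
    nats = 0.0
    for pos in range(start, ids.shape[1]):
        # The token at `pos` is predicted by the distribution at `pos - 1`.
        nats -= log_probs[0, pos - 1, ids[0, pos]].item()
    return nats / math.log(2)  # convert nats to bits

def bits_removed(story: str, prefix: str) -> float:
    """The score: how many bits the prefix shaves off the story's cross-entropy."""
    return story_xent_bits(story) - story_xent_bits(story, prefix)

print(bits_removed("Once upon a time there was a dragon.",
                   "A fairy tale about a dragon:"))
```

A good prefix makes the story much more predictable to GPT-2, so the conditioned cross-entropy drops and the score goes up.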
It's also interesting to see how high the scores can go and to browse other users' solutions.
Enjoy!