What a world we live in, that suspecting an LLM guided by a specific prompt would be my first instinct.
That's easily solved by models intentionally introducing the odd grammatical error here and there — just enough to convince the sceptics, not so many as to give the impression of being unlettered. A bit like the mythical 'RHS button' (which supposedly stands for 'real human shitty' but is really the 'Shuffle' or 'Swing' function), which is supposed to make mechanically precise drum machines sound more like human drummers.
fumeux_fume•1h ago
This was an interesting little rundown on all the different libraries at Harvard, made all the more enjoyable by the author's humor and wit.