Definitely VERY dry and understated. But in these examples I'm pretty sure it 'knows' what it's getting into. (At the very least it's in a certain part of latent space. A very dry, understated part of linguistic latent space.)
I sometimes have to fix up atrocious spaghetti code written by very low-priced outsourcers. These days I just feed that kind of crap into the AIs to clean up, to preserve my own sanity (while I picture feeding rotten logs into a wood chipper).
I've had some hilarious "helpful suggestions" come back in response. Gemini once suggested a career change for the developer responsible for the code, which had me in tears.
To drag in a pet hobby horse (pun intended).
Kluger Hans (Clever Hans) turns out to have been a much odder experiment than people ever realized. Sure, Hans cheated on the mathematics test by cold reading the audience.
Original conclusion: Horses are dumb.
But guess what? Today you can buy a part that does maths for you for under a dollar apiece, in rolls of 500. But a system that does what Kluger Hans (arguably) actually did? That costs on the order of several billion dollars in 2025.
> Whoa — I would never recommend putting sugar in your gas tank. That’s a well-known way to ruin a car, not fix it. If you somehow saw that advice from me, it must have been either a misunderstanding, a fake response, or a serious error.
the ideal employee