"Based on a fundamentally incorrect model" is what I would call code by a new coworker or intern who didn't stop to ask questions. The model being fundamentally wrong means it's cognitively taxing to refactor, but you can at least reason about it, and put band-aid fixes if it somehow flows to prod.
With LLM code you're lucky if there's any ostensibly identifiable model underlying the cryptic 10,000-line PR corresponding to the statistically extrapolated token stream flowing out of the matrix-multiplication brick, and lucky again if the person raising that PR even tried to build that mental model instead of taking the easy way out.
Chance-Device•57m ago
Indeed, this might even be more valuable to you if you wanted to show the motion of the planets and constellations relative to Earth directly, rather than the Sun. Here I'm thinking of a physical model that might cycle through the zodiac, for example.
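To make the "relative to Earth, rather than the Sun" idea concrete: a minimal sketch, assuming idealized circular coplanar orbits (the function names and orbital parameters here are illustrative, not from any particular library), showing that geocentric positions fall out of heliocentric ones by simple vector subtraction, which is also where the apparent retrograde loops come from.

```python
import math

def heliocentric_position(radius_au, period_years, t_years):
    """Position of a body on an idealized circular orbit around the Sun."""
    angle = 2 * math.pi * t_years / period_years
    return (radius_au * math.cos(angle), radius_au * math.sin(angle))

def geocentric_position(planet, earth):
    """Shift the origin from the Sun to Earth: planet minus Earth."""
    return (planet[0] - earth[0], planet[1] - earth[1])

# Example: Mars as seen from Earth over two years, sampled monthly.
for month in range(24):
    t = month / 12
    earth = heliocentric_position(1.00, 1.00, t)   # ~1 AU, 1 yr
    mars = heliocentric_position(1.52, 1.88, t)    # ~1.52 AU, ~1.88 yr
    x, y = geocentric_position(mars, earth)
    print(f"t={t:4.2f} yr  Mars (geocentric): x={x:+.2f} AU, y={y:+.2f} AU")
```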
If this complexity costs you nothing to build and maintain, and is automatically validated, then the mechanism under the hood probably isn’t something you care about very much.