It worries me that articles like this are needed. "AIs make shit up" is a core truth, and while there are ways to mitigate the risk of hallucinations, it is disturbing how many people seem to dismiss or forget the fact that factual accuracy is simply not what LLMs are for.
codingdave•1h ago