The problem, of course, is that people can, and do, take what these systems say as truth at face value. They won't iterate. If they were inclined to do that, they'd probably do actual research in the first place rather than take the easy path of asking an LLM. The whole promise of using an LLM for these sorts of tasks is minimizing effort.
pavel_lishin•2h ago
No, it is the users who are out of touch.