People blame AI for being generic. I think that’s the wrong diagnosis.
The real change is cost. Generative AI made publishing almost free. When output is cheap, attention becomes the scarce resource, and the systems that allocate it optimize for whatever is easiest to measure: speed, volume, engagement. Once every producer is chasing the same measurable proxies, convergence is expected.
Sameness isn’t a bug. It’s what optimization against shared proxies produces.
There’s some early evidence for this. In at least one natural experiment, limiting access to LLMs increased content diversity. Other studies suggest AI-generated communication is trusted less over time, even when quality looks fine.
This looks less like a creativity problem and more like an incentives + feedback loop problem.
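To make the feedback-loop claim concrete, here is a minimal toy simulation of my own (not drawn from the studies mentioned above, and every parameter in it is illustrative): producers start out with diverse "styles", a single shared proxy metric rewards whatever is currently popular, and each round everyone imitates the top scorers with a little noise. The spread of styles collapses within a few dozen rounds.

```python
import random
import statistics

# Toy model: N producers each publish content summarized by one "style" number.
# A shared proxy rewards proximity to whatever the metric currently favors, and
# producers imitate the current winners. All parameters are made up for illustration.
N_PRODUCERS = 200
ROUNDS = 30
IMITATION_NOISE = 0.02   # how far producers drift when copying a winner
TOP_K = 10               # number of top scorers everyone imitates

random.seed(0)

# Start with diverse styles, spread uniformly over [0, 1].
styles = [random.random() for _ in range(N_PRODUCERS)]

def proxy_score(style, population):
    """Shared proxy: reward styles close to the current population mean,
    standing in for 'what the engagement metric already favors'."""
    center = statistics.mean(population)
    return -abs(style - center)

for round_ in range(ROUNDS):
    scores = [proxy_score(s, styles) for s in styles]
    # Rank producers under the shared proxy and keep the top K.
    ranked = sorted(zip(styles, scores), key=lambda pair: pair[1], reverse=True)
    winners = [style for style, _ in ranked[:TOP_K]]
    # Feedback loop: every producer copies a random winner, plus small noise.
    styles = [random.choice(winners) + random.gauss(0, IMITATION_NOISE)
              for _ in styles]
    if round_ % 10 == 0:
        print(f"round {round_:2d}  style spread (stdev) = {statistics.stdev(styles):.4f}")

print(f"final    style spread (stdev) = {statistics.stdev(styles):.4f}")
```

The specific proxy doesn't matter; the point of the sketch is that as long as everyone optimizes the same measurable signal and imitates whatever scores well, the spread of output shrinks regardless of how creative any individual producer is.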
Question: if publishing stays frictionless, is convergence inevitable? If not, what constraints would actually prevent it?