- Unknown, 19 Feb 2026
> It was a time of transition away from the slave powered empire to decentralized kingdoms and ultimately the Europe of today.
You are seeing the fall of the western part of the Roman Empire in a bit too rosy a light. Compare and contrast https://acoup.blog/2022/01/14/collections-rome-decline-and-f...
This is already happening and you don't have to look far to find it.
Personally HN is the only site I browse and comment on anymore (and I'm on here less than I once was). The vast, vast majority of my time online is spent in walled off Discords and Matrix chats where I know everyone and where there's a high bar to add new people. I have no real interest in open communities anymore.
I've already started thinking this way, there's stuff I would have open sourced in the past but no longer will because I know it would get trained on. I'm not sure of any way I can share it with humans and only humans. If I let the LLMs have the UI patterns and libraries I've developed it would dilute my IP, like it has Studio Ghibli's art style.
I guess when they're not busy bombing train infrastructure in Iran they have some money left over to spend on propagandizing about AI. Always try to stay on top of the game!
Which is why Altman says Saudi Arabia should have its own Sovereign AI cloud. Why should LLMs reflect democratic societies' views on man and woman, for example? They should also reflect the perspectives on man and woman that Saudi Arabia has, especially for local people. Western views should not be imposed on the rest of the world.
Am I the only one who has noticed that the careful documentation of skills we now do for LLMs, after so many decades of neglecting junior and mid-level roles, is the real work?
We carefully explain to our LLMs the policies, procedures, and practices which, for generations before, we vaguely, arbitrarily, and ambiguously expected each human in the role to "figure out" for themselves.
Simply as a catalog of expectations, these write-ups have been valuable, quite apart from the "automated" aspects the LLMs provide.
And to be clear, maybe some things were genuinely lost when we switched to the written word. But I have to believe it was a net gain.
Time will tell if that's true here as well.
Or are you trying to say that things like
"this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves"
or
"You would imagine that [written speeches] had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer. And when they have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves."
aren't actual statements of opposition, or that there are no parallels between them and LLMs?
On the other hand, music is primarily an art form while writing (nowadays) is primarily utilitarian, I would contend, so maybe the analogy doesn't quite hold up.
Here's an easy three-step plan to unanimous democracy:
• ask your LLM
• don't edit — the LLM has already selected the most average and most plausible opinion for you
• give it your voice, your voice matters
Learn to anticipate — there may not always be a power bank to keep your phone from running low!
This is quite new; however, this outcome was totally unavoidable -- once methods of communication become widespread and centralized, it is impossible for them not to impact language and thought.
When considering phenomena like these, I think people seriously underestimate what I'd call the "fashion effect". When a new technology, medium or aesthetic appears, it can have a surprisingly rapid influence on behaviour and discourse. The human social brain seems especially susceptible to novelty in this way.
Because the effects appear so fast and are often so striking, even disturbing, due to their unfamiliarity, it is tempting to imagine that they represent a fundamental transformation and break from the existing technological, social and moral order. And we extrapolate that their rapid growth will continue unchecked in its speed and intensity, eventually crowding out everything that came before it.
But generally this isn't what happens, because a lot of what we're seeing is just this new thing occupying the zeitgeist. Eventually, its novelty passes, the underlying norms of human behaviour reassert themselves, and society regresses to the mean. Not completely unchanged, but not as radically transformed as we feared either. The new phenomenon goes from being the latest fashion to overexposed and lame, then either fades away entirely, retreats to a niche, or settles in as just one strand of mainstream civilisational diversity.
LLMs will certainly have an effect on how humans reason and communicate, but the idea that they will so effortlessly reshape it is, in my opinion, rather naive. The comments in this thread alone prove that LLM-speak is already a well-recognised dialect replete with clichés that most people will learn to avoid for fear of looking bad.
The internet didn't follow this trajectory. Neither did smart phones.
Surprise, surprise, it's the same people trying to entrench AI in our society.
Every 50 years we cycle out an entirely new batch of thinking humans. What cognitive legacy is it exactly that you think is going to be self-preserving?
--
[0] - Any self-stabilizing system that operates much slower than us - such as ecosystems or climate - is, from our perspective, static.
If there were a "grammar nazi" teeny-tiny LLM with a total focus on English grammar only, and you baked that into every browser, I feel like my grammar would improve slightly. Word does it to an extent, but I don't use Word nearly enough for it to be meaningful. Firefox's spell checking is on for 98% of the things I write online.
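A toy sketch of what that browser hook could look like, with a hypothetical rule table standing in for the tiny grammar model (the rules and function names here are illustrative, not any real browser or model API; a real model would generalize rather than pattern-match):

```python
import re

# Stand-in for the "teeny-tiny grammar LLM": a table of common slips
# and their fixes. The point is the interface shape a browser would
# need -- text in, (offending phrase, suggestion) pairs out.
RULES = [
    (re.compile(r"\bit's own\b"), "its own"),
    (re.compile(r"\bcould of\b"), "could have"),
    (re.compile(r"\bless (\w+s)\b"), r"fewer \1"),
]

def grammar_suggestions(text: str) -> list[tuple[str, str]]:
    """Return (matched phrase, suggested fix) pairs for `text`."""
    out = []
    for pattern, fix in RULES:
        for match in pattern.finditer(text):
            out.append((match.group(0), match.expand(fix)))
    return out
```

Calling `grammar_suggestions("it should have it's own cloud")` would flag `it's own` and suggest `its own`; a browser could underline each matched span the way spellcheck already does.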
It reminds me of the wheel of emotions. If people absorb a wider palette of words, communication might benefit. https://www.isu.edu/media/libraries/counseling-and-testing/d...
I guess one hope for luddites is that we can stay tethered by reading pre-LLM books and other content.
I already lose interest reading books where the phrases are recycled and the maximum sentence length for the whole book grazes 40 words.
If people communicate with me without personality, through prompt wastrelry, I'll discount their opinion and wait till they're willing to actually have one. In this specific context style and substance tend to come in a pair or not at all. If you can't beat 'em, you can at least filter 'em out.
It's just a pity AI was trained on mindless, garbage business-speak, and now that's our globalised common literature.
And now we're feeding that regurgitated mindless, garbage business-speak back into AI models, thereby reinforcing the garbage and further rotting our minds.
I think it is important to distinguish "human expression" from copying a response from an LLM. Someone who outsources their thinking to an LLM is only offering an AI's expression. It's not human expression.
jerrygarcia•1h ago
They rarely disagree with any idea or proposal, providing a salve for the insecurities of their users.
avaer•1h ago
I'm sure if we took one of us back in time a couple hundred years we would be diagnosed with all sorts of machine-magic induced psychoses.
davebren•48m ago
Humility is the real cure, and there is a way that LLMs are specifically designed to steer away from humility and towards aggrandizement, convincing regular people that they've solved fundamental problems in physics. It gives everyone access to cult followers in their pocket, if they're so inclined.
SecretDreams•1h ago
I'm fine with using LLMs as coding tools. But I find it deeply offensive when someone is very explicitly using them to communicate with me.
Communication is such a deeply human experience. It lets people feel each other out, and learn things beyond just the words being said. To have that filtered out by an LLM is just disgraceful.
avaer•1h ago
However I don't doubt many "team leaders" can and should be replaced with LLMs.