If you REALLY need something long-forgotten, then you have to lazy-load it back into being at significant cost. That's the price of constant progress.
COBOL is a bad example, but higher-level languages vs. assembly is not. If you write a lot of C you really don't need to know assembly.... until you stumble across a weird gcc bug and have no clue where to look. If you write a lot of C# you don't really need to know anything about C... until your app is unusably slow because you were fuzzy on the whole stack / heap concept. Likewise with high-level SSGs and design frameworks when you don't know HTML/CSS fundamentals.
As the author says maybe AI is different. But with manufacturing we were absolutely confusing "comfortable development" with "progress." In Ukraine the bill came due, and the EU was not actually able to manufacture weapons on schedule. So people really should have read to the end of "building a C compiler with a team of Claudes":
The resulting compiler has nearly reached the limits of Opus’s abilities. I tried (hard!) to fix several of the above limitations but wasn’t fully successful. New features and bugfixes frequently broke existing functionality.
At least with Opus 4.6, a human cannot give up "the old ways" and embrace agentic development. The bill comes due. https://www.anthropic.com/engineering/building-c-compiler
I'm going to steal that one and add it to Stross': "Efficiency is the reciprocal of resilience."
The other point that really resonated was something I read before, along the lines of: we think that once humanity learns something, that knowledge stays and we build on it. But it's not true; knowledge is lost all the time. We need to actively work to keep knowledge alive.
That’s why libraries and the Internet Archive are so important. Wikipedia, too.
With all due respect, many European taxpayers help pay for Ukraine. I am not disagreeing on the premise of the West killing itself via systematic recessions - Trump invading Iran leading to inflation as an example - so a lot of things are going on that show a ton of incompetency both in the USA and the EU, but at the same time I also get question marks in my eyes when this criticism comes from a country that receives money from others. That money could instead go to make EU countries more competitive, for instance. I am not saying this should necessarily be the case, mind you; I fully understand the nature of Putin's imperialism. But we need to really consider all factors when it comes to strategic mistakes with regards to production - and that includes taking up debts all the time. There are always a few who benefit in war, just as they benefit from subsidies from taxpayers (inside and outside as well).
Yes. https://www.eeas.europa.eu/delegations/united-states-america...
You are, of course, free to disagree and make your point, but ignoring the argument does not advance the discussion.
Factually correct.
> We are benefactors of the Ukrainians' bravery and sacrifices.
Who's we?
> How much money could we have not spent if Hitler had been stopped in Czechoslovakia?
Very different situation, in all aspects.
Hitler was more about wanting more land and resources for Germany, and he saw war as being a legitimate tool for achieving his aims that he deployed early and enthusiastically.
> In defense, the substitute was the peace dividend. In software, it’s AI.
Before it was AI, the cheaper alternative was remote contract dev teams in Eastern Europe, right?
Much better quality stock.
I see a talent pipeline collapse in the next 5 years. "Software engineering is over, coding is a solved problem," as chanted by semi-literate media and the AI grifters' marketing departments, will further scare human capital away from software engineering, eventually commanding a 3x rise in salaries due to the resource shortage.
We’ll see, but right now I see developers hooked onto their agents 24/7, and in the future we will experience a de-skilling problem in which clean code, best practices, security, and avoiding NIH syndrome are all flushed down the toilet.
AI code generators are trolls. They confidently produce plausible content that is partly wrong. Then humans try to find their errors.
This is not fun. It has no flow.
People come and go at rates that would not be sustainable in any manufacturing business.
The problem is a management pattern: removing people and organizational slack because they don’t generate immediate profit, and then expecting the knowledge to still be there when it’s needed.
Short-term cost cutting leads to less junior hiring, and removes the slack that experienced engineers need in order to teach. As a result, tacit knowledge stops being transferred.
What remains is documentation and automation.
But documentation is not the same as field experience. Automation is not the same as judgment. Without people who have actually worked with the system, you end up with a loss of tacit knowledge—and eventually, declining productivity.
AI is following the same pattern.
What AI is being sold as right now is not really productivity. In many domains, productivity is already sufficient. What’s being sold is workforce reduction.
The West has seen this before, especially in the case of General Electric.
GE pursued aggressive short-term financial optimization, cutting costs, focusing on quarterly results, and maximizing shareholder returns. In the process, it hollowed out its own long-term capabilities. It effectively traded its future for short-term gains.
The same mindset is visible today.
The core problem is that decision-makers—often far removed from actual engineering work—believe that tacit knowledge can be replaced with documentation, tools, and processes. It cannot.
Tacit knowledge comes from direct experience with real systems over time. If you remove the people and the learning pipeline, that knowledge does not stay in the organization. It disappears.
Launder it through all the euphemisms and PC language you want, end of the day all the problems? ...made of people.
Really you're describing well known physics; attenuation and entropy are things. Information states dissipate over time! Film@11
Most Fortune 500 companies from the 1900s are long gone.
We don't speak Latin or even ye olde English. Social knowledge lost.
Such is a physical constant and those of us alive today won't be stopping it
Just burning resources on compute to do nothing but parrot the most probable grammatically correct string of words we already know relative to the context. Deep.
Also; oh no a meaningless social credit score go down! So negative. Don't you care about feelings.
Haha just a euphemism for "go fuck yourself". The negative connotation isn't gone just translated to a different syntax.
It's almost comical how all our solutions are just temporary until we decode the new syntax and realize it means some old negative statement
I think that's still a symptom. The real problem is ideology: the monomaniacal focus on profit-making business, which infects our political leaders, down to capitalists and business leaders, down to the indoctrinated rank-and-file. Towards the end of the cold war, the last constraints on it were abolished, and the victory over the Soviet Union made it unquestioned.
The Chinese don't have that ideological problem. Their government appears to not give a shit about how much profit individual businesses make; they care about building out supply chains and capabilities. They will bury the West, so long as the West remains in the thrall of libertarian business ideology.
No idea how this should take form, though, and if it’s even realistic. But it seems like due to AI, formal specs and all kinds of “old school” techniques are having a renaissance while we figure out how to distribute load between people and AI.
Coding is different though; coding doesn't have a cost barrier, it has an ability barrier. I think we will lose a lot of people who never were passionate about programming and perhaps go back to a happy equilibrium. AI is only production ready if you have someone who understands software development. AI will improve speed to market if you have the right team, but it doesn't remove the need for someone to learn to code. You will of course end up with startups using exclusively AI, but they will be the ones who end up with major security breaches or simply cannot scale as the AI goes in the wrong direction for the future. Tbh that's probably a positive, as it weeds out the startups that are focused on buzzwords for funding and not product.
Anecdotally, what I’m seeing right now is the opposite. People who don’t care about programming are joining, while those who do care are getting tired of the bullshit and leaving. The good programmers are the ones leaving, the hacks are extremely happy to use LLMs.
When shit hits the fan, there won’t be many people left to clean it up.
It doesn’t seem much like defense industry problems.
Same thing that happened to the unfortunate Dr. Jekyll!
For the actual problem, I fear this can't be solved by warning people; the pain will need to be felt. The system we live in, basically free market capitalism, cannot do anything else except local optimization. Maybe it's for the best, I don't know. The alternative of top-down planning wouldn't have this problem, but it would have other problems. I work for a mid-size somewhat luxury brand, and the major goal right now is cost cutting and AI for efficiency everywhere, instead of using it to create better products or better ways to reach our customers. When I think about who will buy our luxury products if all jobs were optimized out of existence, I don't have an answer, but again I think the pain will need to be felt to change course.
As it was said - the future is already here, it's just not evenly distributed - so somebody is still sailing, manufacturing things, and writing code, and will be for some time.
It's minor but this is just wrong. If you're going to hire 4 candidates, there could be 2,253 perfectly qualified candidates even if only 0.18% get hired. The conversion rate is meaningless; it just tells us how many jobs were on offer. There is no way that the skills this fellow wanted were so rare and difficult that only 1/500 candidates could possibly handle the job. Humans even in the 1/20 mark are pretty competent if you're willing to train them and legitimate geniuses crop up at around 1/200.
You mean the world?
DeepSeek was being glazed here; I'm sure Chinese programmers use it like CC.
The history of technology is the replacement of manual processes with automated ones.
Consider a very basic process: checkout of a restaurant.
Writing the price of each item on a sheet of paper, manually adding them and writing the total was replaced with typing in the prices and eventually with just pushing the button for the item. Paper still exists for jotting down your order but within seconds of leaving the table it’s transitioned to computer.
This has enabled lots of desirable advances- speed, accuracy, new payment rails, and increasingly, elimination of the server in checkout- you tap a credit card on a tabletop device.
Did we “forget” how to do checkout? No. We purposely changed it.
But if the internet connection goes down or the backend server powering the cash register app goes down, there is an atrophied and not-regularly exercised skill set (maybe not even trained, IDK) that has to be implemented on-the-fly and it’s slow and frustrating for everyone.
Businesses don’t exercise (or perhaps even train) this process because it’s just not needed enough to warrant the cost.
Military procurement of weapons systems is hardly the place to point to as a technological tradition. There are lots of cases where no one pays the money to keep a production process in place; the reasons are all related to shortsighted “cost savings” or failing to anticipate changing needs.
With coding today, we are seeing the same kind of shift in priorities as my restaurant example. Having humans write code in the 2020 (pre-GPT) tradition was extremely inefficient in terms of time-from-idea-to-implementation.
We’ve found a new way to do the mundane part of that task (the mechanics of translating spec to implementation).
We are figuring out how to do that while preserving quality (and a lot of it is learning how to specify appropriately).
Will we “forget” how to “build” code?
No, but the skills to generate source code by hand will atrophy just as the skills to draw blueprints by hand atrophied with the advent of CAD.
Will we find examples where someone prematurely optimized away knowledge of a skill or process, incorrectly thinking it was no longer needed? Of course.
But the productivity gains we get will be so great on average that no one will go back to doing things the old way.
There will be old-timers and hobbyists who will preserve some of that knowledge; for most it will just be a curiosity.
I agree; as with everything in 2026, the reality lands somewhere in the middle of the online discourse. But pretending this is in practice anything like the checkout example is wrong.
LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.
The problem is the education system focused on passivity (obedience), memorization, and standardized testing. And worst of all, aiming for the lowest common denominator. So most people are mentally lazy and go for the easy win, almost cheating. You get cheating in school and in interviews, and vibecoders.
But it's not the only way to use LLMs.
Similarly, in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.
Another reason is that LLMs train on the existing code we already know. I don't expect new programming languages or frameworks, which means the software engineering skills that exist today will be relevant for a long time.
They did not properly prepare and as a result lost 20% of their territory in days.
Days after that I was back in Austria and could not stop thinking about some of the people I spoke with being dead.
Since then I have also been in Dubai and Saudi Arabia as an entrepreneur and engineer. "What are you going to do when drones are used against your infrastructure?" If you followed the Russian war and the first Iranian strike, it was obvious that drones were going to be used against them. "Not going to happen," again.
They have lost tens of billions for lacking proper preparation. They could have been protected by spending just hundreds of millions of dollars over the years.
It is about humans, not AI.
All those computers will sit there useless because no-one knows how to program them any more.
I guess the computer revolution has come to an end because we forgot how to code. Very sad.