It's still not perfect, but it is dramatically easier to be fast and productive, and it is a huge leap in capabilities for people who previously couldn't code anything at all, but had deep enough domain knowledge to know what tools they wanted, and approximately how they should work, what kind of information they should ingress, and what kind of information they should egress.
But the industry as a whole moved away from the idea that end users are to program computers sometime in the 80s or 90s (the glorious point and click future was not evenly distributed). So now the only tools for writing software out there are either outdated, or require considerable ceremony to get started with (even npm install). So what, we're gonna paper over the gap with acres of datacenter stealing our energy and fresh water to play token numberwang? Fuck me!
This article, and generative AI in general, is appealing to the people on Golgafrinchian Ark Fleet Ship B (aka "the managerial class") because it helps them convince themselves that they can now do all the things the folks on Golgafrinchian Ark Ship A can do (so who needs them, anyway) without having to learn anything. Now you can program without having to program! You're an Idea Person, and that's what's really important; so just idea-person into ChatGPT and all the rest will be taken care of for you. I think these folks are in for a rude awakening.
I was kinda hoping that language would be Python, but even that requires ceremony these days.
This may change as tools become better known or adopted. But for now, the same people who did the job before are now using AI in that job.
FWIW, in my IRL experience, all the work output of those using AI for whatever task has been of poorer quality than without.
I like to make stuff, hack on projects. I code, I woodwork, I solder, I build. I love AI in the same way I love a router and dovetail jig in the workshop.
My son and I were playing minecraft, and we wanted to build a massive egg, just for fun. (My son is 5).
We try, we fail, we try again. We start learning about spheres and how to draw circles, this is peak project-based-learning. But we still can't build an egg that looks good, and now it's becoming less fun and there's no drive to keep trying.
So I spin up claude after hours and in ~30 mins I have a parametric egg-generator in 3d space, mapped to voxels. There is no universe in which my child would be interested in the many years of training to get to the point of building this himself. I also don't have 20 hours free to learn and implement my own 3d voxel rendering systems, just to build an egg in minecraft as a silly teaching exercise.
That weekend we try to use this tool, and we see it's really hard to just see an egg model and build it, so 15 minutes later it can now show us a sliced layer-by-layer view.
That weekend, my son built a massive fucking egg in Minecraft and he's been talking about circles and radiuses and eggs and coding software ever since. He was SO excited to see that we could take a running program, something "real" in the world, and then directly change it. (And now he's trying to learn about code and graphics and stuff. Again, he's 5 - this is the passing interest of curiosity in a child, he's not studying 50-hour courses to learn low level skills)
Are you saying that's not a massive win for everyone involved?
Programming classes didn't work out for me in college, so I went into sysadmin with a dash of Devops.
Now I can make small tools for things like syncing my living room PC to a big LED panel above the TV (it was app-only, but someone reverse-engineered the protocol in Python, and I vibe-coded a frontend for it), or an orchestration script which generates a MAC address, assigns a DHCP reservation in OPNsense, and creates the VM or CT using that MAC and a password which gets stored in my password manager.
I could have done either of these projects myself with a few weekends and tutorials. Now it's barely an evening for proof of concept and another evening to patch it up to an acceptable level. Knowing most of the stuff around coding itself definitely helped a lot.
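The glue in a script like that is mostly ordinary stdlib code. A minimal sketch of the MAC-generation step, assuming you want a locally administered address that can't collide with vendor-assigned ones (the OPNsense API calls and password-manager integration are site-specific and omitted):

```python
import secrets

def random_unicast_mac(prefix="02"):
    """Generate a random locally administered, unicast MAC address.

    Setting bit 1 of the first octet marks the address as locally
    administered; clearing bit 0 keeps it unicast. Together these
    guarantee no clash with real vendor OUIs.
    """
    first = int(prefix, 16)
    first = (first | 0b10) & ~0b1       # locally administered + unicast
    rest = [secrets.randbits(8) for _ in range(5)]
    return ":".join(f"{octet:02x}" for octet in [first] + rest)
```

From there the script would POST the MAC to OPNsense's REST API for the DHCP reservation and pass it to the hypervisor when creating the VM or CT.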
This'll eliminate jobs in the "develop CRUD app" industry but will create better jobs in security/scalability/quality expertise. But it'll take a few years as all these vibe coded business process scripts start to fail.
Programmers miss the human element, which is that many managers look at a software project as too risky, even if automating a business process could trivially save money. There are millions of people in the USA who spend most of their day manually transferring data from incompatible systems.
AI allows anyone to create a hacky automation that demonstrates immediate value but has obvious flaws that can be fixed by a skilled SWE. That will make it easier to justify more spending on devs.
We're repeating history but with more energy consumption.
The code ‘works’ - and the folks who are improving the prototype can also benefit from the tools that the Idea Person used.
And in the case of low code/no code, those produced working prototypes as well. And you could in most cases export them to raw code.
Meanwhile my mom can vibe code actual Python scripts to do parts of her job now.
There are so many situations where a little program can save one or a few people hours of work on tedious tasks, but wouldn't make sense to build or support given traditional software development overhead, and that becomes realistic with AI-assisted development. Think of the sysadmin who has automated 80% of his job with a pile of shell scripts, borne out into many other roles.
We've seen the early phases of it with Replit and Lovable et al, but I think there's a big playing field for secure/constrained runtime environments for non-developers to be able to build things with AI. Low/no-code tooling increasingly seems anachronistic to me: I prefer code, just let an AI write and maintain it.
There's also a whole world of opportunity around the fact that many people who could benefit greatly from AI-built programs are simply not particularly suited to building them themselves, barring a dramatically more capable AI, where I think enterprising software engineers can likely spin out tons of useful stuff and address a client base they might never have previously been able to address.
But in that situation there may be no further updates; the future remains uncertain.
_aavaa_•3mo ago
Every time I read something along these lines I have to wonder whose code these people review during code reviews. It’s not like the alternative is bulletproof code.
moomoo11•3mo ago
Considering 80% of team mates are usually dead weight or mid at best (every team is carried by that 1 or 2 guys who do 2-3x), they will do the bare minimum review. Let’s be real.. PIP is real. Job hopping because bad is real.
It’s a problem. I have dealt with this and had to fire.
ruszki•3mo ago
Should, yeah. But that was not true even before LLMs.
nakamoto_damacy•3mo ago
Anything coded by an LLM risks being under-generalised.
Asking an LLM to think in a generalised way does not make it an AGI. The critical ability to generalise beyond learned patterns — and not only to come up with arbitrary patterns but to use correct logic to derive them — is missing from LLMs, because LLMs do not have a logical layer, only a probabilistic one with learned constraints. The defect is the lack of internal logical constraints. It’s a big subject.
I say more about it here:
https://www.forbes.com/sites/hessiejones/2025/09/30/llms-are...
Aka
“layered system”
Balinares•3mo ago
There's no such thing as bulletproof, but there is definitely such a thing as knowing where your vital organs are and how to tell when they've been hit.
_aavaa_•3mo ago
And the others are going to be replaced by one engineer and an AI of equivalent caliber.
_aavaa_•3mo ago
1. Many developers are currently employees who objectively produce code of equal or lower quality than AI as it currently is.
2. It is cheaper and more productive to hire fewer, more competent people and replace the less productive ones with AI (possible due to 1).
3. Short of regulations preventing it, companies will follow through on 2.
hitarpetar•3mo ago
what can be asserted without evidence can also be dismissed without evidence