Speaking of tools, that style of writing rings a bell. Ben Affleck made a similar point about the evolving use of computers and AI in filmmaking, as tools wielded with creativity by humans with lived experiences: https://www.youtube.com/watch?v=O-2OsvVJC0s. Faster visual effects production enables more creative options.
Citizen developers were already there doing this in Excel. I have seen basically full-fledged applications built in Excel since I was in high school, which was already 25 years ago.
The pattern repeats because the market incentivizes it. AI has been pushed as an all-powerful job-killer by these companies because shareholder value depends on enough people believing in it, not on whether the tooling is actually capable. It's telling that folks like Jensen Huang talk about people's negativity towards AI being one of the biggest barriers to advancement, as if they should be immune from scrutiny.
They'd rather try to discredit the naysayers than actually work towards making these products function the way they're being marketed, and once the market wakes up to this reality, it's gonna get really ugly.
The market is not universal gravity; it's just a storefront for social policy.
No political order, no market, no market incentives.
Managers and business owners shouldn't take it personally that I do as little as possible and minimize the amount of labor I provide for the money I receive.
Hey, it's just business.
"Don't take it personal" does not feed the starving and does not house the unhoused. An economic system that over-indexes on profit at the expense of the vast majority of its people will eventually fail. If capitalism can't evolve to better provide opportunities for people to live while the capital-owning class continues to capture a disproportionate share of created economic value, the system will eventually break.
A business leadership board that only considers people as costs is looking at the world through a sociopath's lens.
The first electronic computers were programmed by manually re-wiring their circuits. Going from that to being able to encode machine instructions on punchcards did not replace developers. Nor did going from raw machine instructions to assembly code. Nor did going from hand-written assembly to compiled low-level languages like C/FORTRAN. Nor did going from low-level languages to higher-level languages like Java, C++, or Python. Nor did relying on libraries/frameworks for implementing functionality that previously had to be written from scratch each time. Each of these steps freed developers from having to worry about low-level problems and instead focus on higher-level problems. Mel's intellect is freed from having to optimize the position of the memory drum [0] to allow him to focus on optimizing the higher-level logic/algorithms of the problem he's solving. As a result, software has become both more complex and much more capable.
(The thing that distinguishes gen-AI from all the previous examples of increasing abstraction is that those examples are deterministic and often formally verifiable mappings from higher abstraction -> lower abstraction. Gen-AI is neither.)
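A toy illustration of that difference, mine rather than the commenter's: the deterministic lowering below maps the same high-level statement to the same low-level instructions every time, while the sampling-based stand-in for gen-AI may not even agree with itself across runs.

    import random

    # Deterministic lowering: a fixed table from a high-level statement to
    # lower-level instructions. Same input always yields the same output,
    # so the mapping can be verified once and then trusted.
    LOWERING = {"inc x": ["LOAD x", "ADD 1", "STORE x"]}

    def lower(stmt: str) -> list[str]:
        return LOWERING[stmt]

    # Generative "lowering": samples one of several plausible outputs, so two
    # runs on the same input can disagree, and one of the options is wrong.
    def generate(stmt: str) -> list[str]:
        return random.choice([
            ["LOAD x", "ADD 1", "STORE x"],
            ["INC [x]"],
            ["LOAD x", "ADD 2", "STORE x"],  # plausible-looking but incorrect
        ])

    assert lower("inc x") == lower("inc x")   # always holds
    # generate("inc x") == generate("inc x")  # not guaranteed, and may be wrong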
People do and will talk about replacing developers though.
That's not to say developers haven't been displaced by abstraction; I suspect many of the people responsible for re-wiring the ENIAC were completely out of a job when punchcards hit the scene. But their displacement ushered in even more developers working at higher levels of abstraction.
Again, this completely ignores that programming a vacuum-tube computer involved an entirely different type of abstraction than what you do with MOSFETs, for example.
I’m finding myself in the position where I can safely ignore any conversation about engineering with anybody who thinks that there is a “right” way to do it, or that there’s any kind of ceremony or thinking pattern that needs to stay stable.
Those are all artifacts of humans desiring very little variance, things they’ve even encoded because it takes real energy to reconfigure your own internal state model to a new paradigm.
If educators use AI to write/update the lectures and the assignments, students use AI to do the assignments, then AI evaluates the student's submissions, what is the point?
I'm worried about some major software engineering fields experiencing the same problem. If design and requirements are written by AI, code is mostly written by AI, and users are mostly AI agents, what is the point?
But less thinking is essential, or at least that’s what it feels like when using the tools.
I’ve been vibing code almost 100% of the time since Claude 4.5 Opus came out. I use it to review itself multiple times, and my team does the same, then we use AI to review each others’ code.
Previously, we whiteboarded and had discussions more than we do now. We definitely coded and reviewed more ourselves than we do now.
I don’t believe that AI is incapable of making mistakes, nor do I think that multiple AI reviews are enough to understand and solve problems, yet. Some incredibly huge problems are probably on the horizon. But for now, the general claim that "AI will not replace developers" is false; our roles have changed: we are managers now, and for how long?
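As a rough illustration of that multi-pass review loop (my own sketch, not the commenter's actual setup), something like the following works against the Anthropic Python SDK; the model ID and prompt wording are assumptions:

    from anthropic import Anthropic  # pip install anthropic; needs ANTHROPIC_API_KEY set

    client = Anthropic()
    MODEL = "claude-opus-4-5"  # assumed model ID for "Claude 4.5 Opus"

    def review_once(code: str, previous_notes: str) -> str:
        """One review pass: ask the model to critique the code, given earlier findings."""
        prompt = (
            "Review the following code for bugs, security issues, and unclear logic.\n\n"
            f"Earlier review notes (may be empty):\n{previous_notes}\n\nCode:\n{code}"
        )
        msg = client.messages.create(
            model=MODEL,
            max_tokens=1024,
            messages=[{"role": "user", "content": prompt}],
        )
        return msg.content[0].text

    def multi_pass_review(code: str, passes: int = 3) -> list[str]:
        """Review the same code several times, feeding the model its own prior notes."""
        notes, findings = "", []
        for _ in range(passes):
            notes = review_once(code, notes)
            findings.append(notes)
        return findings

Whether feeding a model its own previous notes actually surfaces new issues is, of course, the open question.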
Here's an archived link: https://archive.is/y9SyQ
jalapenos•1h ago
Well probably we'd want a person who really gets the AI, as they'll have a talent for prompting it well.
Meaning: knows how to talk to computers better than other people.
So a programmer then...
I think it's not that people are stupid. I think there's actually a glee behind the claims that AI will put devs out of work, like they feel good about the idea of hurting them rather than being driven by dispassionate logic.
Maybe it's the ancient jocks vs nerds thing.
spwa4•1h ago
Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.
It of course becomes "oh, sorry, we happen to have taken the only huge business for ourselves. Is your kidney now for sale?"
rvz•25m ago
But you need to buy my AI engineer course for that first.
dboreham•1h ago
LLMs are a box where the input has to be generated by someone/something, but the output also has to be verified somehow (because, like humans, they aren't always correct). So you either need a human at "both ends", or some very clever AI filling those roles.
But I think the human doing those things probably needs slightly different skills and experience than the average legacy developer.
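To make the "human at both ends" point concrete, here is a minimal sketch of the output end, assuming a hypothetical generate_candidate() stand-in for whatever LLM call you use and a handful of human-written test cases as the check:

    # Sketch of the verification end of the box: accept an LLM-written function
    # only if it passes human-written tests. generate_candidate() is a stand-in
    # for an actual LLM call, not any real API.

    def generate_candidate(prompt: str) -> str:
        raise NotImplementedError("plug in your LLM call here")

    # Human-written checks for a function named add(a, b).
    TESTS = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]

    def verified_add(prompt: str, attempts: int = 3):
        for _ in range(attempts):
            source = generate_candidate(prompt)
            namespace: dict = {}
            try:
                exec(source, namespace)  # untrusted code; isolate this in real use
                fn = namespace["add"]
                if all(fn(*args) == expected for args, expected in TESTS):
                    return fn            # passed every check a human wrote
            except Exception:
                pass                     # reject the candidate and ask again
        raise RuntimeError("no candidate passed verification")

The human work has just moved from writing add() to writing the tests and deciding what "correct" means, which seems to be the point about needing slightly different skills.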
reactordev•1h ago
While a single LLM won’t replace you, a well-designed system of flows for software engineering using LLMs will.
lkjdsklf•14m ago
That's the goal.
rvz•35m ago
The Vibe Coder? The AI?
Take a guess who fixes it.
lkjdsklf•16m ago
The reason those things matter in a traditional project is that a person needs to be able to read and understand the code.
If you're vibe coding, that's no longer true. So maybe it doesn't matter. Maybe the things we used to consider maintenance headaches are irrelevant.