I guess there is stuff like SquareSpace. No idea how good it is though. And FrontPage back in the day but that sucked.
But what alternatives are really left behind here that you view as superior?
To me, it is obvious the entire world sees a very high value in how much power can be delivered in a tiny payload via networked JS powering an HTML/CSS app. Are there really other things that can be viewed as equally powerful to HTML which are also able to pack such an information-dense punch?
Er, no. Go watch some major site load.
This at first blush smells like "Don't write code that writes code," which... Some of the most useful tools I've ever written are macros to automate patterns in code, and I know that's not what he means.
Perhaps a better way to say it is "Automating writing software doesn't remove the need to understand it?"
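To give a concrete (toy) example of the kind of pattern-automating macro I mean above: a classic C X-macro that generates an enum and its matching name table from a single list, so the two can't drift apart. This is my own sketch, not anything from the article.

    #include <stdio.h>

    /* Single source of truth for the entries. */
    #define COLOR_LIST \
        X(RED)         \
        X(GREEN)       \
        X(BLUE)

    /* Expand the list once as enum members... */
    #define X(name) COLOR_##name,
    enum color { COLOR_LIST COLOR_COUNT };
    #undef X

    /* ...and once as the matching string table. */
    #define X(name) #name,
    static const char *color_names[] = { COLOR_LIST };
    #undef X

    int main(void) {
        for (int i = 0; i < COLOR_COUNT; i++)
            printf("%d -> %s\n", i, color_names[i]);
        return 0;
    }

Automating the pattern doesn't mean I don't have to understand it; it just means I can't get the two copies out of sync by hand.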
Martin Fowler isn't the author, though. The author is Unmesh Joshi.
LLMs only really help to automate the production of the least important bit. That's fine.
Before the pearl clutching starts about my lack of coding ability: I started coding in 1986 in assembly on an Apple //e and spent the first dozen years of my career doing C and C++ bit twiddling.
You don't learn these things by writing code? This is genuinely interesting to me because it seems that different groups of people have dramatically different ways of approaching software development.
For me, the act of writing code reveals places where the requirements were underspecified or the architecture runs into a serious snag. I can understand a problem space at a high level based on problem statements and UML diagrams, but I can only truly grok it by writing code.
Maybe there's a broader critique of LLMs in here: if you outsource most of your intellectual activity to an LLM, what else is left? But I don't think this is the argument the author is making.
The first one mostly requires experienced humans; the latter is boring and good to automate.
The problem is with everything in between, and with getting people to the point where they can do the first. There, AI can be both a tool and a distraction.
The art of knowing what work to keep, what work to toss to the bot, and how to verify it has actually completed the task to a satisfactory level.
It'll be different than delegating to a human; as the technology currently sits, there is no point giving out "learning tasks". I also imagine it'll be a good idea to keep enough tasks to keep your own skills sharp, so if anything kinda the reverse.
I feel like maybe I'm preaching to the choir by saying this on HN, but this is what Paul Graham means when he says that languages should be as concise as possible, in terms of number of elements required. He means that the only thing the language should require you to write is what's strictly necessary to describe what you want.
Well, I do this, but I force it to make my code modular and I replace whole parts quite often; these are tactical moves in an overall strategy. The LLM generates crap; however, it can replace crap quite efficiently with the right guidance.
For some reason johnwheeler editorialized it, and most of the comments are responding to the title and not the contents of the article (though that's normal regardless of whether the correct title or a different one is used; it's HN tradition).
[The title has been changed, presumably by a mod. For anyone coming later it was originally incorrect and included statements not present in the article.]
Once you can show, without doubt, what you should do, software engineers have very little value. The reason they are still essential is that product choices are generally made under very ambiguous conditions. John Carmack said "If you aren't sure which way to do something, do it both ways and see which works better."[1] This might seem like it goes against what I am saying, but actually narrowing "everything possible" down to two options is huge value! That is a lot of what you provide as an engineer, and the only way you are going to hone that sense is by working on your company's product in production.
Why is the current level of language abstraction the ideal one for learning, which must be preserved? Why not C? Why not COBOL? Why not assembly? Why not binary?
My hypothesis is that we can and will adapt to experience the same kind of learning OP describes at a higher level of abstraction: specs implemented by agents.
It will take time to adapt to that reality, and the patterns and practices we have today will have to evolve. But I think OP's view is too short-sighted, rooted in what they know and are most comfortable with. We're going to need to be uncomfortable for a bit.
To be fair I have this with my own code, about 3 days after writing it.
This is... only true in a very very narrow sense. Broadly, it's our job to create assembly lines. We name them and package them up, and even share them around. Sometimes we even delve into FactoryFactoryFactory.
> The people writing code aren't just 'implementers'; they are central to discovering the right design.
I often remember the title of a paper from 40 years ago, "Programming as Theory Building". (And comparatively recently discussed here [0].)
This framing also helps highlight the strengths and dangers of LLMs. The same aspects that lead internet-philosophers into crackpot theories can affect programmers creating their not-so-philosophical ones. (Sycophancy, false appearance of authoritative data, etc.)
Also, fun to see the literal section separator glyphs from "A Pattern Language" turn up.