And apparently the original cliché as well.
LLMs have been so spectacularly useless the few times I've tried to use them for programming that I can't really wrap my head around what this experience must be like.
For most cases I've tried, I get wildly incorrect and completely non-functional results. When the code does "function", it uses dangerously incorrect techniques and gives the wrong answer in ways you wouldn't notice unless you were already familiar with the problem.
Maybe it's because I work in scientific computing, and there just aren't as many examples of our typical day to day problems out there, but I'm struggling to see how this is possible today...
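Numerical code is a good illustration of how generated code can look right and still be quietly wrong. As a hypothetical example (not taken from any specific LLM output), the one-pass "textbook" variance formula is algebraically correct but can be destroyed by catastrophic cancellation, while passing casual spot checks on small inputs:

```python
def naive_variance(xs):
    # One-pass "textbook" formula E[x^2] - (E[x])^2. Algebraically correct,
    # numerically unstable when the mean is large relative to the spread.
    n = len(xs)
    total = sum(xs)
    total_sq = sum(x * x for x in xs)
    return total_sq / n - (total / n) ** 2

def stable_variance(xs):
    # Two-pass formula: subtract the mean first, avoiding cancellation.
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

data = [1e8 + 1.0, 1e8 + 2.0, 1e8 + 3.0]  # true population variance = 2/3
print(naive_variance(data))   # cancellation: wildly off from 2/3
print(stable_variance(data))  # close to 2/3
```

On small data with a modest mean the two functions agree, so a quick sanity check wouldn't catch the bug; it only surfaces when the mean is large relative to the spread, which is exactly the "wrong in ways you wouldn't notice" failure mode.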
I'd be concerned purchasing a book from a "programmer" who claims to teach people how to code without code. Kinda sounds like an "author" who publishes books without writing books.
If vibe coding somehow becomes the method of programming, then code will become obsolete. Hear me out:
Why code when you can just ask the computer to do what you want and get the results. The coding part is abstracted deep in the background where no human needs venture.
When vibe coding dominates, it's not that people won't know how to code anymore, it's that coding becomes irrelevant. The same way that there are people who still know how to ride horses, but it's irrelevant to transportation. When vibe coding reaches its peak, programming languages will evolve into something unrecognizable. Why do we need a human-readable programming language when no human needs to read it? I picture a protocol agreed upon by two computers, never released to us humans.
Because then you won't know the design of the code or how it even works.
The hard part of coding isn't writing the code itself. It's the design of the code that takes skill, and if you leave that part completely up to AI, you are taking your life in your hands. Bad idea.
When the person building the application doesn't know or care, the application will still be deployed.
Resistance is futile.
We will adapt.
Emergency services, hospital infrastructure, financial systems (like Social Security, where a missed check may actually mean people starve) are all places where you don't want to fail because of a weird edge case. It also ties into the fact that fixing those edge cases requires some understanding of design in general, and of the specific design that was implemented.
Then there's the question of liability when something goes wrong. LLMs are still computers right now: they do exactly, and only, what you tell them to do.
If you're writing code for production, even if you get an LLM to put together bits of it for you, that's programming. It's pretty much copy-and-paste-from-Stack Overflow, if Stack Overflow had a massively larger library of snippets that almost always included the thing you needed at that exact moment.
Professional programmers still need to take responsibility for making sure the code they are producing actually works!
It's well past time for traditional "high level" programming languages to meet the same fate.
Natural language to formal language does not provide that. How the hell would I debug or operate a system by just looking at a prompt? I can't intuit which way the LLM generated anything. I will always have to be able to read the output.
AFAICT, the only people who say you can remove code are people who don't code. I never hear this from actual devs, even if they are bullish on AI.
SIMD optimization is already handled well by the current generation of models [1]. There will be no point in doing that by hand before too long. An exercise for antiquarians, like building radios from vacuum tubes.
I never hear this from actual devs, even if they are bullish on AI.
You're hearing it from one now. Five years from now the practice of programming will look quite different. In ten to fifteen years it will be unrecognizable.
> How many C++ coders these days can read x86 or ARM assembly? Not many, because they almost never need to. It's well past time for traditional "high level" programming languages to meet the same fate.
There is a misunderstanding, let me rephrase. How will I operate and maintain that software without a high level language to understand it? Or do you think we will all just be debugging asm? The same language you just said people don't bother to learn? Or am I supposed to debug the prompt, which will nondeterministically change the asm, which I can't verify because I can't read it?
Doesn't matter how it evolves; some human-readable high-level language that deterministically generates instructions will always be needed. Trying to replace that is kind of counterproductive, imo. LLMs are good at generating high-level language. Leave the compilers to do what they are good at.
Friend of mine suggested “apping”.
I ‘apped’ this in 2 hours vs I ‘vibe coded’ this in 2 hours.
Picture this: are tools like Devin "vibe coding"?
If we break down the mechanics of the interfaces it's looping through:
1) Chat 2) IDE 3) CLI 4) Dev console/Browser
It's effectively copying and pasting what it sees while trying to complete an objective it doesn't fully comprehend, blissfully ignoring the ramifications of the desired combinations as long as decent version control practices are being applied. It iterates, adjusting prompts subtly along the way to debug when it gets stuck in a thought loop, changing the prompt from "fix it" to something with more "pizazz" as the key to breaking the cycle.
how is it any different than when I do all this manually?
Slog through this game of 4 square long enough and you can pretty much vibe anything together.
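The loop described above can be sketched in a few lines. This is a hypothetical sketch with made-up function names (`ask_llm`, `run_code`), not Devin's actual internals:

```python
def agent_loop(objective, ask_llm, run_code, max_iters=10):
    """Minimal sketch of the Chat -> IDE -> CLI -> console cycle described
    above. ask_llm and run_code are stand-ins for the real interfaces."""
    prompt = objective
    for attempt in range(max_iters):
        code = ask_llm(prompt)       # "Chat": get a candidate change
        ok, output = run_code(code)  # "CLI / dev console": try it
        if ok:
            return code              # commit it and move on
        # When stuck in a thought loop, don't just repeat "fix it":
        # vary the prompt with the observed failure.
        prompt = (f"{objective}\n"
                  f"Attempt {attempt + 1} failed with: {output}\n"
                  f"Try a different approach.")
    return None  # gave up within the iteration budget
```

Whether a human or an agent drives this loop, the mechanics are the same; the difference is mostly who reads the error output and rewords the prompt.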
To the publishers it's not a mistake, it's just clever marketing. Consider which of these two jumps off that glossy cover and into the distracted eye of a Technical Program Manager most readily: "AI-Assisted Programming" or "Vibe Coding"?
billy99k•5h ago
It's similar to the whole hacker/cracker debate. Words become defined by the one that has the most influence over the community and sometimes evolve on their own through places like social media.
esperent•5h ago
Next one down is dictionary definition (or claim to authority, for example a tweet where the term was first used). But community meaning takes precedence.
Authors are free to use a nonstandard meaning but should provide readers with their definition if they want to be understood.
9rx•4h ago
No. That would make it impossible for someone new to a community to communicate with a community, all while at the same time a community isn't going to accept someone new who isn't communicating. Yet clearly people do join communities.
What happens in reality is that everyone accepts that the speaker's definition reigns supreme, and if there is suspicion – absent a definition – that there is a discrepancy between the speaker's definition and one's own, the speaker will be asked to clarify.
A speaker may eventually adopt a community's definition, but that doesn't always happen either. Look at the Rust users here. They hang desperately onto words like enum that do not match the community definition. But it doesn't really matter, does it? We remember that when they say enums they really mean sum types and move on with life.
9rx•5h ago
A good speaker will define the words as he speaks (still relying on some baseline shared understanding of the most common words, of course; there is only so much time in the day...) so there is no room for confusion. In the absence of that, it is expected that the listener will question any words that show an apparent disconnect in meaning, allowing the speaker to clear up what was meant and ensuring both parties can land on the same page.
sampullman•4h ago
Even more so when it's an author/reader relationship. The reader is free to interpret the book/article/etc. how they want, and if enough agree, it becomes the consensus.
9rx•4h ago
Where audience understanding is important, you will definitely go out of your way to ensure that definitions are made abundantly clear and that the audience agrees that they understand.
But, in actual practice, most of the time the audience understanding really doesn't matter. Most people speak for the sake of themselves and themselves alone. If the audience doesn't get it, that's their problem. Like here, it means nothing to me if you can't understand what I'm writing.
9rx•4h ago
But that doesn't mean the audience came from the same place. It is very possible, and often happens, that they heard/created an entirely different definition for the same word. The speaker cannot possibly use their definition before even knowing of it.
smokel•5h ago
Another unfortunate example is the increasingly negative connotation assigned to the word "algorithm".
dylan604•5h ago
bet
cruffle_duffle•4h ago
And while the general public might not know the fine distinctions between these, I think society does get that there’s a whole spectrum of actors now. That wasn’t true in 2000—the landscape of online crime (and white hat work) hadn’t evolved yet.
Honestly, I’m just glad the debate’s over. “Cracker” always sounded goofy, and RMS pushing it felt like peak pedantry… par for the course.
That said, this whole “vibe coding” thing feels like we’re at the beginning of a similar arc. It’s a broad, fuzzy label right now, and the LLM landscape hasn’t had time to split and specialize yet. Eventually I predict we’ll get more precise terms for all the ways people build software with LLMs. Not just describing the process but the people behind the scenes too.
I mean, perhaps the term “script kiddie” will get a second life?