I’m hired to solve business problems with technology, not to self-improve or get on my high horse because I hand-wrote a silly abstraction layer for the n-th time
There are probably two ways to see the future of LLMs / AI: either they are going to have the capabilities to replace all white-collar work, or they are not.
If you think they are going to replace us, then you can either surrender or fight back, and personally I read all these anti-AI posts as fighting back, as trying to help people realize we might be digging our own grave.
If, OTOH, you see AI as a force-multiplier tool that's never going to completely replace a human developer, then yes, probably the smartest thing to do is to learn how to master this new tool, while keeping in mind the side effects it might bring, like atrophy.
We've always been in the business of replacing humans in the three D's space (dirty, dangerous, dull... and to be clear, data manipulation for its own sake is dull). If we make AI that replaces 90% of what I do at my desk every day... we did it. We realized the dream from the old Tom Swift novels, where he comes up with an idea for an invention and hands it off to his computer to extrapolate, or the ship's computer in Star Trek acting like a perfect engineering and analytical assistant, taking fuzzy asks from humans and turning them into useful output.
They aren't going to willingly spread the wealth.
Point is: if that problem is solvable without me, that's the win condition for everyone. Then I go herd goats (and have this nifty tool that helps me spec out an optimal goat fence while I'm at it).
I have buy-in from a former co-worker with whom I remained in touch over the years, so there will be at least two of us working the fields.
A time and place may come where the AI are so powerful I’m not needed. That time is not right now.
I have used Rider for years at this point and it automatically handles most imports. It's not AI, but it's one of those things I just never have to think about.
I do, however, love solving business problems. This is what I am hired for. I speak to VPs/managers to improve their day-to-day. I come up with feasible solutions and translate them into code.
If AI could actually code, like really code (not "here is some code, it may or may not work, go read the documentation to figure out why it doesn't"), I would just go and focus on creating affordable software solutions for medium/small businesses.
This is kind of like gardening/farming: before the industrial revolution, most crops required a huge workforce; these days, with all the equipment and advancements, a single farmer can do a lot on their own with a small staff. People still "hand" garden for pleasure, but without using the new tech they wouldn't be able to compete at a big scale.
I know many fear AI, but it is progress and it will never stop. I do think many devs are intelligent and will be able to evolve in the workplace.
So, this "solve business problems" is some temporary[1] gig for you?[2]
------------------------------
[1] I'm reminded of the anti-union people who are merely temporarily embarrassed millionaires.
[2] Skills atrophy. Maybe you won't need the atrophied skill in the future, but how sure are you that this is the case? The eventual outcome?
Same. Writing code is easy. Reading code is very very hard.
They could rename it "Using AI Generated Code Makes Programming Less Fun, for Me", that would be more honest.
The problem for programmers is (as a group) they tend to dislike the parts of their job that are hardest to replace with AI and love the stuff that is easiest for machines to copy. It turns out meetings, customer support, documentation, tests, and QA are core parts of being a good engineer.
I'm doing a lot of new things I never would have done before. Yes, I could have googled APIs and read tutorials, but I learn best by doing, and AI helps me learn a lot faster.
Not everyone has access to an expert that will guide them to the most efficient way to do something.
With either form of learning though, critical thinking is required.
Sometimes.
People who claim "It's not synthesized, it's just other people's work run through a woodchipper" aren't precisely right, but they also aren't precisely wrong... And in this space, having the whole ecosystem of programmers who published code looking over my shoulder as I try to solve problems is a huge boon.
But the skills you describe are still skills, reading and researching and doing your own fact finding are still important to practice and be good at. Those things only get more important in situations off the beaten path, where AI doesn't always give you trustworthy answers or do trustworthy work.
I'm still going to nurture some of these skills. If I'm trying to learn, I'll stick to using AI only when I'm truly stuck or no longer having fun.
If people aren't learning from AI, it's their fault. Yeah, AI makes stuff up and hallucinates and can be wrong, but how is that different from a distracted senior dev? AI is available to me 24/7 to answer my questions in minutes or seconds, whereas half the time when I message people I have to wait 30-60 minutes for a response.
People just need to approach things intelligently and actually learn along the way. You can get to the point where you're thinking more clearly about a problem than the AI writing your code pretty quickly, if you just pay attention and do the research you need to understand what's happening. They're not as factual as a textbook, but they don't need to be to give you the space to ask the right questions, and they'll frequently provide sources (though I'd heavily recommend checking those; sometimes the sources are a joke).
I've refactored the sloppiest slop with AI in days with zero regressions. If I did it manually it could have taken months.
I mean, in two years the entire mentality shifted. Most people on HN were just completely and utterly wrong (also quite embarrassing if you read how self-assured these people were; this was like 70 percent of HN at the time).
First, AI is clearly not a stochastic parrot, and second, it hasn't taken our jobs yet, but we can all see that potential up ahead.
Now we get articles like this saying your skills will atrophy with AI because the entire industry is using it now.
I think it's clear: everyone's skills will atrophy. This is the future. I fully expect that in the coming decades the generation after zoomers will never have coded without the assistance of AI, and they will have an even harder time finding jobs in software.
Also: because the change happened so fast, you see tons of pockets of people who aren't caught up yet. People who don't realize that the above is the overarching reality. You'll know you're one of these people if AI hasn't basically taken over your workplace and you and your coworkers aren't going all in on Claude or Codex. Give it another two years and everyone will flip here too.
The money is never wrong! That's why the $100 billion invested in blockchain companies from 2020 to 2023 worked out so well. Or why Mark Zuckerberg's $50 billion investment in the Metaverse resulted in a world-changing paradigm shift.
Those people who invested cash in blockchain believed that they could develop something worthwhile on the blockchain.
Zuckerberg believed the Metaverse could change things. It's why he hired all of those people to work on it.
However, what you have here are people claiming LLMs are going to be writing 90% of code in the next 18 months, then turning around and hiring a bunch of people to write code.
There's another article posted here, "Believe the Checkbook" or something like that. And they point out that Anthropic had no reason to purchase Bun except to get the people working on it. And if you believe we're about to turn a corner on vibe coding, you don't do that.
Imagine the Empire State Building had just been completed, and a man was yelling at the construction workers: "PFFT, that's just a bunch of steel and bricks."
This quote is so telling. I’m going to be straight with you and this is my opinion so you’re free to disagree.
From my POV you are out of touch with the ground-truth reality of AI, and that's OK, because it has all changed so fast. Everything in the universe is math-based, and in theory even your brain can be fully modelled by mathematics... it's a pointless quote.
The ground-truth reality is that nobody, and I mean nobody, understands how LLMs work. This isn't me making shit up: if you know transformers, if you know the industry, and if you even listen to the people behind the technology who make these things... they all say we don't know how AI works.
But we do know some things. We know it's not a stochastic parrot because, in addition to the failures, we've seen plenty of successes on extremely complicated problems that are far too non-trivial for anything other than an actual intelligence to solve.
In the coming years reality will change so much that your opinion will flip. You might be so stubborn as to continue calling it a stochastic parrot but by then it will just be lip service. Your current reaction is normal given the paradigm shift happened so fast and so recently.
This is a really insane and untrue quote. I would, ironically, ask an LLM to explain how LLMs work. It's really not as complicated as it seems.
You can boil LLMs down to "next token predictor". But that's like boiling down the human brain to "synapses firing".
The point OP is making, I think, is that we don't understand how "next token prediction" leads to more emergent complexity.
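For what it's worth, here's a deliberately silly sketch of what "next token predictor" means mechanically; the hardcoded bigram table is a made-up stand-in for the learned transformer forward pass a real LLM runs at each step:

    import java.util.*;

    public class NextTokenLoop {
        // Toy "model": a fixed lookup table from the last token to the next one.
        // A real LLM replaces this lookup with a learned probability distribution
        // over its whole vocabulary, conditioned on all previous tokens, but the
        // surrounding loop is the same.
        static final Map<String, String> BIGRAMS = Map.of(
                "<s>", "the", "the", "cat", "cat", "sat", "sat", "</s>");

        public static void main(String[] args) {
            List<String> tokens = new ArrayList<>(List.of("<s>"));
            // Autoregressive decoding: predict one token, append it, feed the
            // extended sequence back in, repeat until end-of-sequence.
            while (!tokens.get(tokens.size() - 1).equals("</s>")) {
                tokens.add(BIGRAMS.get(tokens.get(tokens.size() - 1)));
            }
            System.out.println(tokens); // [<s>, the, cat, sat, </s>]
        }
    }

The disagreement upthread is precisely that nothing in this loop explains the emergent behavior; the interesting part is hidden inside the prediction step.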
It seems clear you don't want to have a good faith discussion.
It's you claiming that we understand how LLMs work, while the researchers who built them say that we ultimately don't.
There’s tons more where that came from. Like I said lots of people are out of touch because the landscape is changing so fast.
What is baffling to me is that not only are you unaware of what I'm saying, but you also think what I'm saying is batshit insane, despite the fact that the people at the center of it all, who are creating these things, SAY the same thing. Maybe it's just terminology... understanding how to build an LLM is not the same as understanding why it works or how it works.
Either way, I can literally provide tons and tons more evidence to the contrary if you're still not getting it: we do not understand how LLMs work.
Also, you can prompt an LLM about whether or not we understand LLMs; it should tell you the same thing I'm saying, along with explaining transformers to you.
Just because the restaurant says "World's Best Burgers" on its logo doesn't make it true.
Here’s another: https://youtube.com/shorts/zKM-msksXq0?si=bVethH1vAneCq28v
Geoffrey Hinton, the father of AI, who quit his job at Google to warn people about AI. What's his motivation? Altruism.
Man, it's not even about people saying things. If you knew how transformers and LLMs work, you would know that even for the most basic model we do not understand how they work.
It will just spew over-confident sycophantic vomit. There is no attempt to reason. It’s all worthless nonsense.
It's a fancy regurgitation machine that will go completely off the rails when it steps outside of its training area. That's it.
I've also seen it fuck up in the same way you describe. So do I weigh and balance these two pieces of contrasting evidence to form a logical conclusion? Or do I pick and choose the evidence that's convenient for my worldview? What should I do? Actually, why don't you tell me what you ended up doing?
> "I’m a senior and LLM’s never provide code that pass my sniff test, and it remains a waste of time."
Even a year ago that seemed like a ridiculous thing to say. LLMs have made one thing very clear to me: a massive percentage of developers derive their sense of self-worth from how smart coding makes them feel.
That being said, what will be critical is understanding business needs and being able to articulate them in a manner that computers (not humans) can translate into software.
What has to happen first is that people need to rebuild their identity before they can accept what is happening, and that rebuilding process will take longer than the rate at which AI is outrunning all of us.
What is my role in tech if for the past 20 years I was a code ninja but now AI can do better than me? I can become a delegator or manager to AI, a prompt wizard or some leadership role… but even this is a target for replacement by AI.
Fascinating.
It also has already taken junior jobs. The market is hard for them.
Correction: it has been a convenient excuse for large tech companies to cut junior jobs after ridiculous hiring sprees during COVID/ZIRP.
Well, it's taken the blame for the job cutting due to the broad growth slowdown since COVID fiscal and monetary stimulus was stopped and replaced with monetary tightening, and then, most recently, the economy was hit with the additional hammers of the Trump tariff and immigration policies. Lots of people want to obscure, deny, and distract from the general economic malaise (and because many of the companies involved, and even more of their big investors, are in incestuous investment relationships with AI companies, "blaming" AI for the cuts is also a form of self-serving promotion).
Your memory of the discourse of that era has apparently been filtered by your brain in order to support the point you want to make. Nobody who thoughtlessly adopted an extreme position at a hinge point where the future was genuinely uncertain came out of that looking particularly good.
No it very clearly is. Even still today, it is obvious that it has zero understanding of anything and it's just parroting training data arranged in different ways.
However, for those in the first few years of their career, I'm definitely seeing the problem where junior devs are reaching for AI on everything, and aren't developing any skills that would allow them to do anything more than the AI can do or catch any of the mistakes that AI makes. I don't see them on a path that leads them from where they are to where I am.
A lot of my generation of developers is moving into management, switching fields, or simply retiring in their 40s. In theory there should be some of us left who can do what AI can't for another 20 years until we reach actual retirement age, but programming isn't a field that retains its older developers well. So this problem is going to catch up with us quickly.
Then again, I don't feel like I ever really lived up to any of the programmers I looked up to from the 80s and 90s, and I can't really point to many modern programmers I look up to in the same way. Moxie and Rob Nystrom, maybe? And the field hasn't collapsed, so maybe the next generation will figure out how to make it work.
If you want to be an artist, be an artist; that's fine. Just don't confuse artistry with engineering.
I write art code for myself, I engineer code professionally.
The author wraps with a false dichotomy that uses emotionally loaded language at the end: "You Believe We have Entered a New Post-Work Era, and Trust the Corporations to Shepherd Us Into It". I mean, what? Why can't I think it's quickly becoming a new era _and_ not trust corporations? Why does the author take that idea off the table? Is this logic or rhetoric? Who is this author trying to convince?
SWE life has always had smatterings of weird gatekeeping, self identities wrapped up in external tooling or paradigms, fragile egos, general misanthropy, post-hoc rationalization, etc. but... man watching the progressions of the crash outs these last few years has been wild.
In my day job, I use best practices. If I'm changing a SQL database, I write database migrations.
In my hobby coding? I will never write a database migration. You couldn't force me to at gunpoint. I just hate them, aesthetically. I will come up with the most elaborate and fragile solutions to avoid writing them. It's part of the fun.
These things are all tradeoffs. A junior engineer who goes to the manual every time is something I encourage, but if they go exclusively to the manual every time, they are going to be slower and produce code more disjoint and harder to maintain than their peers who have taken advantage of other people's insights into the things the manuals don't say.
I'm trying to think of any examples of someone who said that "a generation ago" at all, let alone any that wasn't regarded as a fringe crackpot.
C makes you a bad programmer. Real men code in assembler.
IDEs make you a bad programmer. Real men code in a text editor.
Intellisense / autocomplete makes you a bad programmer. Real men RTFM.
Round and round we go...
AI is different.
Programming takes practice. And if all of your code is generated via LLM, then you're not getting any practice.
It's the same as saying using genAI will make you a bad artist. In the sense that putting hands to medium makes you a good artist, that is true. Unless you take deliberate steps to learn, your skills will atrophy.
However, being a good programmer|artist is different from being a successful programmer|artist. GenAI can help you churn out tons of content, and if you can turn that content into income, you'll be successful.
Even before LLMs, successful and capable were orthogonal features for most programmers. We had people who made millions churning out a crud website over a few months, and others that can build game engines, but are stuck in underpaid contracting roles.
I am mid career now.
High-level languages like JS or Python have a lot of bad design / suboptimal code... and so does a lot of Java code in many places.
Some bad Java code (it just needs to be a SQL SELECT in a loop) can easily perform a thousand times worse than a clean Python implementation of the same thing.
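A minimal JDBC sketch of that SELECT-in-a-loop anti-pattern versus a single query (the orders table and its columns are made up for illustration):

    import java.sql.*;

    public class NPlusOne {
        // Anti-pattern: one database round-trip per customer.
        // With N customers this issues N separate queries.
        static void selectInLoop(Connection conn, long[] customerIds) throws SQLException {
            for (long id : customerIds) {
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT total FROM orders WHERE customer_id = ?")) {
                    ps.setLong(1, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) { /* use rs.getLong("total") */ }
                    }
                }
            }
        }

        // Better: one query, one round-trip; the database filters in a single pass.
        static void selectOnce(Connection conn) throws SQLException {
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT customer_id, total FROM orders")) {
                while (rs.next()) { /* use rs.getLong("customer_id") etc. */ }
            }
        }
    }

The per-row network and query-planning overhead is what dominates, which is why the language on top matters less than the access pattern.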
As said above, C was once a high-level programming language, and it still is in some places.
I do not code in Python / Go / JS that much these days, but what made me a not-so-bad developer is my understanding of computing mechanisms (why and how to use memory instead of disk, how to arrange code so the CPU can use its cache efficiently...).
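On the cache point, a small self-contained illustration (a sketch, easy to run yourself): both methods below do identical arithmetic, yet the traversal order decides whether the CPU cache helps or fights you.

    public class CacheOrder {
        // Row-major traversal: the inner loop walks a contiguous int[] row,
        // so the prefetcher streams memory into cache efficiently.
        static long sumRowMajor(int[][] m) {
            long sum = 0;
            for (int i = 0; i < m.length; i++)
                for (int j = 0; j < m[i].length; j++)
                    sum += m[i][j];
            return sum;
        }

        // Column-major traversal: every access jumps to a different row,
        // defeating spatial locality; on large matrices this is typically
        // several times slower despite doing the same work.
        static long sumColMajor(int[][] m) {
            long sum = 0;
            for (int j = 0; j < m[0].length; j++)
                for (int i = 0; i < m.length; i++)
                    sum += m[i][j];
            return sum;
        }

        public static void main(String[] args) {
            int[][] m = new int[4096][4096];
            long t0 = System.nanoTime();
            sumRowMajor(m);
            long t1 = System.nanoTime();
            sumColMajor(m);
            long t2 = System.nanoTime();
            System.out.printf("row-major: %d ms, col-major: %d ms%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
        }
    }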
As said in many posts, code quality, even for vibe-coded stuff, depends more on what was prompted and on how much effort goes into keeping the PR diff human-readable, if you want maintainable and efficient software at the end of the day.
Yet senior devs often spend more time reviewing code than actually writing it. Vibe coding ultimately feels the same to me at the moment.
I still love to write some code by hand, but I am starting to feel less and less productive with this approach, while at the same time feeling I haven't really lost my skills to do so.
I think I really feel, and effectively am, more efficient at delivering things at an appropriate quality level for my customers now that I have agentic coding skills under my belt.
It’s like a carpenter talking about IKEA saying “I remember when I got an electric sander, it’s the same thing”.
They haven't destroyed everyone, but there definitely are sets of people who used the crutches and never got past them. And not just in a "well, they never needed anything more" way; they're worse programmers than they should or could have been.
Anybody know any weavers making > 100k a year?
If the demand for this work is high, maybe the individual workers aren't earning $100k per year, but the owner of the company who presumably was/is a weaver might well be earning that much.
What the loom has done is made the repeatable mass production of items cheap and convenient. What used to be a very expensive purchase is now available to more people at a significantly cheaper price, so probably the profits of the companies making them are about the same or higher, just on a higher volume.
It hasn't entirely removed the market for high end quality weaving, although it probably has reduced the number of people buying high-end bespoke items if they can buy a "good enough" item much cheaper.
But having said that, I don't think weavers were on the inflation-adjusted equivalent of 100k before the loom either. They may have been skilled artisans, but that doesn't mean the majority were paid multiples above an average wage.
The current price bubble for programming salaries is based on the high salaries being worth paying for a company who can leverage that person to produce software that can earn the company significantly more than that salary, coupled with the historic demand for good programmers exceeding supply.
I'm sure that even if the bulk of programming jobs disappear because people can produce "good enough" software for their purposes using AI, there will always be a requirement for highly skilled specialists to do what AI can't, or from companies that want a higher confidence that the code is correct/maintainable than AI can provide.
Joking aside, we have to understand that this is the way software is being created, and this is the tool with which most trivial software (which most of us make) will be created.
I feel like the industry is telling me: adopt or become irrelevant.
Now I'm just telling AI what to do.
However, I agree there's a different category here under the idea of 'craft'. I don't have a good way to express this. It's not that I'm writing these 'if' statements in a particular way; it's how the whole system is structured, and I understand every single line, and it's an expression of my clarity about the system in code.
I believe there's a split between these two, and both are focused on different problems. Again, I don't want to label, but if I *had to*, I would say one side is business-focused. Here's the thing though: your end customers don't give a fuck if it's built with AI or crafted by hand.
The other side is the craftsmanship, and I don't know how to express this to make sense.
I'm looking for a good way to express this - feeling? Reality? Practice?
IDK, but I do understand your side of it; however, I don't think many companies will give a shit.
If they can go to market in 2 weeks vs. 2 months, you know what they'll choose.
I know plenty of 50-something developers out of work because they stuck to their old ways and the tech world left them behind.
I did, for a very long time. Then I realized that it's just work, and I'd like to spend my life minimizing the amount of that I do and maximizing the things I do want to do. Code gen has completely changed the equation for workaday folks. Maybe that will make us obsolete, and fall out of practice. But I tend to think the best software engineers are the laziest ones who don't try to be clever. Maybe not the best programmers per se, but I know whose codebase I'd rather inherit.
But just like physical fitness, the correct solutions are structural/societal. Studies show designing walkable cities has a much bigger effect than gyms (or shaming individuals).
People care if their software works. They don’t care how beautiful the code is.
AI can churn out 25 drafts faster than 99% of devs can get their boilerplate setup for the first time.
The new skill is fitting all that output into deployable code, which if you are experienced in shipping software is not hard to get the model to do.
I've still learned from it. Just read each line it generates carefully. Read the API references of unfamiliar functions or language features it uses. You'll learn things.
You'll also see a lot of stupidity, overcomplication, outdated or incorrect API calls, etc.
I don't commit 1,000 lines when I don't know how they work.
If people are just not coding anymore and trusting AI to do everything, I agree: they're going to hit a wall hard once the complexity of their non-architected Frankenstein project hits a certain level. And they'll be paying for a ton of tokens to spin the AI's wheels trying to fix it.
Yes, taking the bus to work will make me a worse runner than jogging there. Sometimes, I just want to get to a place.
Secondly, I'm not convinced the best way to learn to be a good programmer is just to do a whole project from 0 to 100. Intentional practice is a thing.
I do think the “becoming dependent on your replacement” point is somewhat weak. Once AI is as good as the best human at programming (which I think could still be many years away), the conversation is moot.