Companies will continue to demand it (I know people working at companies that are literally looking at AI usage as an individual performance metric, puke emoji), and probably 95% of humans using pretty understandable human logic aren’t going to work harder than they need to on purpose.
I wish I had a solution. I think the jury is still out on whether programming will be a dead profession in a short number of years, replaced by technical product operators.
E.g. when using AI Deep Research for hard-to-debug issues, asking for the why makes for a much better response.
I worked under people who started as juniors that way but were politically savvy. Or just ruthless. And pushed their way to the top by stealing projects, lying through their teeth, and other such tactics.
They were slowing down progress because their methods involved sabotaging the progress of others because it might make their own contributions shine a little less.
They were the cause of libraries like leftPad being used all through business-critical code, and they cut down anyone who dared to simply question why.
These things cause ripples. The smartest and most capable staff leave, and what results is a churn of the same kind.
But hey, they get a trip to Mexico every year and burn through millions every two years. Profit any day now.
I think the irony of AI is going to be that it will make the remaining software jobs properly hard again, and implementers (ex-coders) will be able to succeed with even less code knowledge than before.
https://htmx.org/essays/yes-and/
Everyone else: we must let the juniors write the code.
Seniors come from juniors. If you want seniors, you must let the juniors write the code.
1) People hearing "an LLM is as smart as a junior" and actually opting for the LLM subscription price instead of hiring a junior
2) The gap between senior and junior in terms of performance has become larger, since the senior devs got their hands dirty for years typing stuff out manually AND also tackling challenges.
This generation of junior-mid developers will have a significant portion of the "typing stuff" chopped off, and we're still pretending that this will end up being fine.
I think my argument against humans still needing to know how to manage complexity is that the models will become increasingly able to manage that complexity themselves.
The only thing that backs up that argument is the rate of progress the models have made in the last 3 years (ChatGPT turned 3 just 3 months ago).
I think software people as a whole need to see that the capabilities won’t stop here, they’re going to keep growing. If you can describe it, an LLM will eventually be able to do it.
The average tenure of a person in an engineering role is so short that very few employers are thinking about developing individuals anymore.
The actual way this gets approached is "If you want seniors, you must hire seniors".
I'm not sure how this plays out now. But it's easy to imagine a scenario like the COBOL writers of the last generation.
I think the allure of high TC (150k base or more for entry level) led many non-engineer-brained people to enter tech.
Many people can do rote memorization, it’s even ingrained heavily in some cultures iykyk. However they can’t come up with much original or out of the box thinking.
Seniors should be prepared that seniority will mean a different thing, and the path to getting there will be different too.
Just like there was a shift from lower-level languages to higher-level ones.
Companies know this as well, but this is a prisoner's-dilemma type situation for them. A company can skip out on juniors and instead offer to pay seniors a bit better to poach them from other companies, saving money. If everyone starts doing this, everyone obviously loses - there just won't be enough new seniors to satisfy demand. Avoiding this requires that most companies play by the rules, so to speak, which is not something that's easily achieved.
And the higher the cost of training juniors relative to their economic output, the greater the incentive to break the rules becomes.
One alternative might just be more strict non-competes and the like, to make it harder for employees to switch companies in the first place. But this is legally challenging and obviously not a great thing for employees in general.
And therefore, in my experience, not every senior engineer would hack it as a senior engineer at a more intense company, myself included.
This isn’t a software unique experience. It’s life.
If coding is an art then all the juniors will end up in the same places as other struggling artists and only the breakout artists will land paying coding gigs.
I'm sitting here on a weekend coding a passion project for no pay so I have to wonder.
On the plus side, as a dev with 30+ years of experience, I am commanding a very good contract salary these days. Revolving-door companies are stuck in process hell and product rot and cannot deliver new value, so they're scrambling to find experienced devs who cost a premium. My salary today makes up for the peanuts at the start of my career.
I do not want more juniors, because given time they will be my competition.
I hired a junior "dev" who literally hadn't even completed an HTML course. Before AI I could not have hired them because they literally did not know how to dev. After AI, anyone with a little grit can push themselves into the field pretty easily.
As with everything in life: you can choose the hard route or you can choose the easy route, and your results will follow accordingly.
so what is their value? proxy your requests to ai?
Hard agree, but probably not in the way you're implying.
It's the difficult things that make life fun and interesting. A life spent going from one easy thing to another is a life barely lived at all.
There are lots of ambiguous situations where a search and human "inference" can solve that AI still can't.
I can tell the AI to do something, it uses the worst approach; I tell it a better way exists, it says it validated that it does not; I link to a GitHub issue saying it can be done via workarounds, and it still fails. It's worse for longer tasks, where if it fails it always shortcuts to a "safe" approach (including not doing the task at all).
Funny enough we need the junior to guide the AI.
In a world where "Code is no longer a skill," the only way to survive is to stop being a "Prompt Operator" and start being a "System Auditor." If you can’t explain the trade-offs of the architectural pattern the AI just gave you, you aren't an engineer, you're just the person holding the screwdriver while the machine builds the house.
You don't get technical creativity reflexes by using AI. This is technical stagnation in the making. By cannibalizing its own sources, AI is ensuring that future generations are locked into subscription models to do the most basic technical tasks. This is all obvious, yet we speed up every chance we get.
AI will deduplicate all of this
That sounds beyond wasteful.
Just look at new math proofs that will come out, as one example. Exploration vs Exploitation is a thing in AI but you seem to think that human creativity can’t be surpassed by harnesses and prompts like “generate 100 types of possible…”
You’re wrong. What you call creativity is often a manual application of a simple self-prompt that people do.
One can have a loop where AI generates new ideas, rejects some and ranks the rest, then prioritizes. Then spawns workloads and sandboxes to try out and test the most highly ranked ideas. Finally it accretes knowledge into a relational database.
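The loop described above can be sketched in a few lines. This is a hypothetical illustration: `generate_ideas`, `score`, and `evaluate_in_sandbox` are stand-ins for real LLM/API calls and sandboxed execution, not actual library functions.

```python
import random

def generate_ideas(n, seed_topic):
    # Stand-in for an LLM call, e.g. a prompt like
    # "generate 100 types of possible caching strategies".
    return [f"{seed_topic} variant {i}" for i in range(n)]

def score(idea):
    # Stand-in for the reject/rank step (an LLM judge or heuristic).
    return random.random()

def evaluate_in_sandbox(idea):
    # Stand-in for spawning a workload to actually try the idea out.
    return {"idea": idea, "passed": True}

def exploration_loop(seed_topic, n_ideas=100, keep=5):
    ideas = generate_ideas(n_ideas, seed_topic)          # generate
    ranked = sorted(ideas, key=score, reverse=True)[:keep]  # reject + rank
    results = [evaluate_in_sandbox(i) for i in ranked]   # test top ideas
    # "Accrete knowledge": a real system would persist results,
    # e.g. into SQLite, before the next iteration.
    return [r for r in results if r["passed"]]

print(len(exploration_loop("caching strategy")))  # prints 5
```

The point is that each stage is cheap to automate once generation itself is cheap; the hard part is making `evaluate_in_sandbox` measure something real.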
The Germans also underestimated the USA in WW2, saying their own soldiers were superior and the USA just had technology — but the USA out-produced them in tanks and machinery and won the war through sheer automation, even if its soldiers were regular joes and not elite troops.
Back then it was mechanized divisions. Now it is mechanized intelligence.
While Stalin said: Quantity has a quality all its own.
In other words, creativity in humans is arguably just as derivative as in machines.
AI can innovate in the synthetic realm of novel ideas, while real-world novelty will remain untouched.
There are different types of novelties
And if AI was really about productivity they'd be talking about doing more faster with the same workforce, not reducing the workforce.
That's perfectly aligned with capitalistic motivations
There have been lots of instances of knowledge being rediscovered even when it was previously published but sitting on some shelf, forgotten. LLMs' ability to digest large volumes of data will, I think, help with this issue.
We will still need to reproduce and verify conclusions, but it will be interesting to see what might come from this.
We don’t need the same volume of developers to have the same or faster speed of innovation.
And conversely if there is stagnation there is a capital opportunity to out compete it and so there will be a human desire to do the work.
TL;DR: People like doing stuff and achieving things. They will continue to do stuff.
P.S. It's too much to claim other people don't experience creative ideas using AI. You don't really know that's true. It hasn't been my experience, as I have had the capability and capacity to complete ideas that sat on my back burner for decades and move on to the next thing.
At minimum, our current generation of leaders will have to get much better at managing resources and building people up. We have to up our games and build environments where the pursuit of deep understanding is permissible. Unfortunately with the current hiring issues, it’s totally understandable that young developers are scared to take time on tickets.
Who cares if it's derivative slop or a straight up bootleg of something else so long as the number goes up
Pretty much all software projects seem to peak, and then decline in quality. There are only a handful of senior devs in the world who are actually good programmers.
The same ethos makes sense with AI, it's just that every company is trying to avoid paying that training tax. Why turn a junior into a senior yourself if you can get the competition to pay for it instead?
Kraft 1977 Programmers and Managers talked about this if I recall. Still the best alternate take on our industry I have ever read.
I have never once told my manager “it would be really nice to have a few junior developers. It would really help us get this project done on time”. They do “negative work”.
Yes, not having juniors become seniors is an industry problem. But my goal is to reach my company's quarterly and annual goals - not what's going to happen 10 years from now.
I have. A good junior can do in a week what a senior with domain knowledge can do in a half day, with only an hour of mentoring along the way. This isn’t a great exchange rate per dollar (juniors are cheaper than seniors, but not that much cheaper) — but seniors with domain knowledge are a finite resource, you can’t get more of them for love or money, while juniors are fresh-minted every semester. The cheapest way to shipping may not go through juniors, but the fastest way usually does; and that’s completely ignoring the HUGE side benefit of building seniors “the hard way,” which is still easier than hiring.
Don't worry, just leave all your problems for someone else to fix. I'm sure that won't have any lasting consequences at all.
Obviously that hasn't historically been true, else there wouldn't be any senior developers as companies would have wised up to that and nobody would hire them as juniors.
- Not everybody is a job hopper (even in Silicon Valley one sees that most junior FAANG devs stick around for a good while).
- The HR department is absolutely going to give junior developers that pass the cut after a year or so a market rate raise.
- In limited hiring periods, they'd be grateful to have the chance to stick around, while in bullish "boom" periods companies can afford to spend to keep people, expand and give them bigger roles, and so on. It's in the in-between that it becomes more problematic, but now we're in a "limited hiring" era.
>Yes, not having juniors become seniors is an industry problem. But my goal is to reach my company's quarterly and annual goals - not what's going to happen 10 years from now.
That's how companies fail.
It's also not a good strategy at the personal level. If you command more devs, you get more leverage.
I see a lot of Sr engineers get very frustrated by how much time they have to spend helping Jr engineers. But, that’s the job, or at least a big part of it.
Or at least it was.
Sure, there’s a wide range of skills and you can’t just hand any task to anyone and expect it to work out, but some fresh college graduates are more capable than the average person with 5 years of professional experience. At the other end, you need to focus on whatever they actually are capable of doing. 40+ hours a week can slowly expand even an extremely narrow skill set, as long as they’re a hard worker.
Junior devs are by your own explanation not useless. They are the most important human investment in your project.
We haven't done it, and I've never seen anything like that.
Rather because you want them to go away, because management conveniently forgot to reduce your load to account for time spent on mentoring.
It will always be preferable to work on an understandable codebase, because that maximizes the AI's affordances too. And then the AI can explain things to you. A skilled human will always have a lot of solid knowledge relating to their hyper-specific niche that isn't part of your average general purpose AI, so humans will obviously have a key role to play still.
I haven’t written code, aside from tweaking stuff here and there, in probably 3 or 4 months. Before that I wrote code by hand every day for many years.
I’ve found a lot of fun parts of my new workflow that I enjoy. I still miss being fully immersed in a problem deep in the files… and sometimes it feels like homework reading so many implementation summaries from Claude because the feature spans 4 repos and is too much code to read. But I do love shaping the code into different solutions exploring in a way that is unique to ai native workflows. And I love building agent skills and frameworks with/around them and expanding it out to more aspects of the company or life — there’s deep work to be had that still feels like hacking in the trenches. I get a lot of the same satisfaction in different ways, and there’s a lot of exciting novelty to explore that was previously out of reach due to time and energy constraints.
Also I don’t like our backend stack and I hate React / NextJS to the degree of derangement syndrome — I am so happy that I don’t have to write it and I can just focus on UX, making customers happy / lives easier / shaping the software into better and better versions of itself at such a faster pace.
People who learned good software engineering intimately before the inflection point are extremely lucky right now. Existential dread and the stages of grief have been a part of the journey for me too sadly, but there’s a lot to celebrate and explore with the right attitude.
Steve Jobs famously accurately called this out years ago [1].
Xerox, Boeing, PC manufacturers (who basically created the Taiwanese makers through a series of short-term outsourcing steps), etc. But there are two examples I want to talk about specifically.
First, one lasting impact of the 2008 GFC was that entry-level jobs disappeared. This devastated a generation of millennial college graduates who suddenly had a mountain of student loan debt (thanks to education costs outpacing inflation by a lot) but suddenly no jobs. It became a bit of a joke to poke fun at such people who had a ton of debt and worked as baristas, but this was a shallow "analysis". It was really a systemic collapse. Those entry-level workers are your future senior workers and leaders. Those jobs have never come back.
The rise of DVR/TiVo and ultimately streaming brought on a golden age of TV in the 2000s. It was kind of the last hurrah for network shows that produced 22 episodes a year before streamers instead produced 8 episodes every 4 years.
But what made this system work was an ecosystem. Living in LA, Atlanta and a few other places was relatively cheap, so aspiring actors, writers and entertainment professionals could get by with second jobs and relatively low income. These became the future headline actors and senior professionals. Background work and odd jobs were sufficient. Background work also taught people how to be on a set.
Studios still had large writing staffs. Some writers would be on set. Those writers were your future producers and showrunners.
Part of what supported all of this was syndication. That is, networks produced shows and basic cable channels would pay to rerun them. Syndicating some shows was incredibly profitable in some cases (eg Seinfeld).
So the streamers came along and stripped things down. They got rid of junior positions. They adopted so-called "mini writing rooms". Those writers didn't tend to ever be on set. The runs were shorter and an 8 episode series couldn't support a writer in the same way a 22 episode series could. The streamers then were largely showing just their own content so residuals and syndication fees just went away.
All of this is short-term thinking. Hollywood has been both a massive industry and a source of American soft power internationally by spreading culture, basically.
I think the software engineering space is going through a similar transformation to what happened to the entertainment industry. A handful of people will do very well. AIs will destroy entry-level jobs and basically destroy that company and industry's future.
I predict in 10-20 years we'll see China totally dominating this space, and a bunch of LinkedIn "thought leaders" and politicians will be standing around scratching their heads asking "what happened?"
I couldn't care less about why either Claude, Codex, or before that a developer, was using a for loop or a while loop. I did and do care about architecture.
I’m no more going to review every line of code with AI than I am when I was delegating to more junior developers. I’m going to ask Claude Code about how it implemented something where I know there is an efficient way vs naive way, find and test corner cases via manual and automated tests and do the same for functional and non functional requirements.
It takes time to become a junior too. The emerging tech landscape could affect the skills and knowledge that are expected from entry-level job applicants.
I’ve experienced similar things and so understand the feeling, but this is poor leadership. If someone on your team makes it all the way to a code review and still thinks ‘the AI suggested it’, you failed to train them, failed to set expectations and they have justifiably lost more confidence in you than vice versa.
If we analyze the rest of the article through the lens of weak leadership, it sounds less like an AI problem and more like a corporate leadership problem.
"it's not x, but y", with bonus em-dash:
> your value as a developer is not in your ability to ship code. It’s in your ability to look at code
"But here’s the thing."
"And honestly?"
First of all, developers who only learn to code in a short bootcamp are often not well prepared — but that was already true before AI. In the past, many junior developers were students who were learning programming while studying, not just people who took a quick Python course on Udemy.
Instead of declaring junior developers useless, we should raise the standard: learn how to code properly, how to maintain code, understand networks, and build strong foundations in math and computer science. A well-trained junior developer is still extremely valuable and will always be needed.
Invest in the training of your junior employees.
The cost of generating code is now laughable, so that's not the economic value brought to the table by a junior engineer, or really, any engineer. The value is now generated by knowing what code is good code. You're going to have to have talks, book clubs, hackathons, and the like to get your juniors to know what good code is. Do they know what design patterns are? How about good architecture? If they can't name a few design patterns, you're not investing enough.
slibhb•1h ago
dangus•1h ago
However, it’s got a lot of downsides too.
DJBunnies•1h ago
“It’s what the LLM said.” - Great. Now go learn it and do it again yourself.
danielbln•1h ago
tpmoney•1h ago
In my experience it is the very rare junior dev that can learn what's good or bad about a given design on their own. Either they needed to be paired with a sr dev to look at things and explain why they might not want to do something a given way, or they needed to wind up having to fix the mess they made when their code broke something. AI doesn't change that.
techpression•1h ago
croes•1h ago
You can use AI as a teacher but how many will do that?
jatari•1h ago
The skill of the very top programmers will continue to increase with the advent of new tools.
croes•8m ago
veryemartguy•1h ago
Super great that it’s used to pump out tons of code because upper management wants features released even faster than before. I’m sure the junior devs who don’t know a for loop from their ass will be able to learn and understand wtf Claude is shitting out
TacticalCoder•1h ago
It is but how do you teach to people who think their new profession is being a "senior prompt engineer" (with 4 months of experience) and who believe that in 12 months there won't be any programmer left?
Thanemate•1h ago
As a junior, my top issue is finding valuable learning material that isn't full of poor or outright wrong information.
In the best and most generous interpretation of your statement, LLM's simply removed my need to search for the information. That doesn't mean it's not of poor quality or outright wrong.
ndriscoll•17m ago
As a general principle, take advantage of the fact that it can easily generate stuff. If you don't know whether something is true, have it prove it. Make a PoC/test/benchmark to demonstrate what it's saying. Create feedback loops (or rather, ask it to create feedback loops). They're very good at reasoning given access to the ground truth, so give them more ability to ground themselves.
They also have fantastic knowledge of public things, but no knowledge of your company, so your instructions should mostly be documentation of what's unique to your company. If it can write an instruction on its own (e.g. how to use git or kubernetes), it is a useless instruction; it already knows that. What it doesn't know is e.g. where your git server is. It also doesn't know what matters to your company: are you a startup trying to find product market fit? Are you an established company that is not allowed to break customer setups? etc.
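The "have it prove it" advice above can be made concrete. A minimal, hypothetical example of a feedback loop grounded in truth: instead of trusting a claim like "set lookup is faster than list lookup", ask the model (or write yourself) a tiny benchmark that demonstrates it.

```python
import timeit

# Claim under test: membership checks on a set beat membership
# checks on a large list. Rather than accept the assertion,
# generate a micro-benchmark and read the actual numbers.
setup = "items = list(range(100_000)); s = set(items)"

# Worst case for the list: the element we probe is at the very end.
list_time = timeit.timeit("99_999 in items", setup=setup, number=200)
set_time = timeit.timeit("99_999 in s", setup=setup, number=200)

print(set_time < list_time)  # prints True
```

The same pattern works for any verifiable claim: turn it into a PoC, a failing test, or a benchmark, and let the result, not the model's confidence, settle it.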
kgeist•1h ago
>the AI group averaged 50% on the quiz, compared to 67% in the hand-coding group
And why would they do better? There's less incentive to learn because it's so easy to offload thinking to AI.
[0] https://www.anthropic.com/research/AI-assistance-coding-skil...