Of course, I think the people in that thread have it dead wrong, and I agree with this article. I work at a large tech company that has already stopped hiring junior developers. We're getting so much more productivity out of IC engineers that I would expect to see layoffs even with increasing product velocity.
People who say GenAI isn't vastly speeding up development are just using motivated reasoning, aren't using the tools, or are just bad engineers.
Then where is all that new software? Where are all the SaaS solutions, the games, the new tools?
The productivity increase that comes from working at a higher level of abstraction is a real thing, but it's not going to change the industry.
How about a good engineer who wants to feed their family?
My opinion, and it's just that, is that corporations are betting on the robots advancing fast enough that it won't matter there are no new 'good engineers' to follow the current batch. Probably to their detriment, but by the time that happens humans will be tasked only with the work the robots simply can't physically do, so their agency in the matter will no longer matter: you either install 13.6 screws/min or you're replaced by someone (or something) who can.
Happy days...
I don’t worry about his job; good riddance.
Don’t make the mistake of thinking that you are in the telegram business and not the communications business.
What I think AI will very soon be able to do is generate code at that “functional sharpness” level, while human engineers are caught striving for the “as sharp as possible” level because they care about their craft. The cost difference just won't be worth it to organizations, and the trade-off will be AI-generated code that is “good enough”. Sure, a talented dev could do better, but that juice just won’t be worth the squeeze. To me, embracing the tools and becoming a prompting expert is the way to go, rather than some Luddite-like rejection of the technology because of offended artistic notions.
A solo programmer can afford to be an artisan, but commercial code is rarely artisanal. It's full of trade-offs and compromises.
Commercial code is always about business costs -- maintenance cost, buy-vs-build cost, refactoring cost, business continuity costs, etc. You're always working under constraints like time, headcount, available expertise, maintainability, and coordination bottlenecks.
Craftsmanship in commercial code is rare. Sometimes you get geniuses like John Carmack who write inspired stuff in code that ships. But most of the time, it's just a bunch of normal people trying to solve a problem under a ton of constraints.
What makes you think that doesn’t involve craftsmanship?
Your first part may be right; however, using the term "Luddite" as a pejorative without understanding what the Luddites actually meant or stood for (given the social context) is a sign of ignorance.
Companies hired juniors as a way to invest in people who could become valuable assets later, without having to compete quite so hard for those with strong potential. What's changed is that companies don't have as much financial flexibility as they did before (interest rate and tax code changes), and the first thing to go is risky long-term investment in hiring and retaining talent. That takes the form of cutting back on cushy benefits, and it also takes the form of cutting junior jobs.
More software engineer productivity may mean fewer jobs at a firm with a fixed niche, but it means more applications where software engineering time can be productively deployed and greater number of software engineering jobs in aggregate.
A general economic downturn kills jobs that are about immediate returns, and higher interest rates make it less attractive to burn software engineering hours on long-payoff efforts.
So, some specific jobs at specific firms may be lost to AI, but aggregate job losses are more likely to be a result of economic and financial market conditions.
All previous 'efficiency gains' were in physical manufacturing. Though one could argue the invention of the printing press was different, as it allowed more efficient distribution of knowledge. Same with the general availability of the internet. Now what we're seeing is the early days of more efficient application of knowledge, well beyond what any single human could achieve.
Someone inventing a pin-making machine freed up hundreds of thousands of people for different tasks, allowing for a more efficient application of labor. Someone inventing a machine that does the mental work of hundreds of thousands of people on a single GPU will most likely follow a different trajectory.
No, they weren't. The whole information age is full of efficiency gains in knowledge work, on a fairly high cadence. (Sure, there have also been gains in physical manufacturing, but those have hardly been the exclusive locus of such gains.)