It's like how the generic "we take anyone" online security degree has poisoned that market -- nothing but hordes of entry-level goobers, but no real heavy hitters on the mid-to-high end. Put another way, the market is tight but there are still reasonable options for seniors.
then again we live under capitalism
Take the software development sector as an example: if we replace junior devs with AI coding agents and put senior devs to reviewing the agents' work, how will we produce more seniors (with wide experience in the sector) if the juniors aren't coding anymore?
Other than that, I guess developing software in some capacity while doing a non-strictly-software job - say, in accounting, marketing, healthcare, etc. This might not be a relevant number of people if 'vibe coding' takes hold and the fundamentals are ignored or never learned by these accountants, marketers, healthcare workers, etc.
If that is the case, we'd have a lot of 'informed beginners' with 10+ years of experience tangentially related to software.
Edit: As a result of the above, we might see an un-ironic return to the 'learn to code' mantra in the coming years. Perhaps now qualified as 'learn to -actually- code'? I'd wager a dollar on that discourse popping up in ~5 years' time if the trend of not hiring junior devs continues.
I'm half-joking, but I wouldn't be surprised to see all sorts of counterpoint marketing come into play. Maybe throw in a weird traditional bent to it?
> (Pretentious, douche company): Free-range programming, the way programming was meant to be done; with the human touch!
All in all, I already feel severely grossed out any time a business I interact with introduces any kind of LLM chatbot shtick and I have to move away from their services; I could genuinely see people developing a greater disdain for the fad than there already is.
The question is... is this based on existing capability of LLMs to do these jobs? Or are companies doing this on the expectation that AI is advanced enough to pick up the slack?
I have observed a disconnect in which management is typically far more optimistic about AI being capable of performing a specific task than are the workers who currently perform that task.
And to what extent is AI-related job cutting just an excuse for what management would want to do anyway?
6-12 months in, if the AI bet doesn't pay off, they can just stop spending money on it: cancel or don't renew contracts, and move some teams around.
For full-time entry-level hires, we typically don't see meaningful positive productivity (where what they produce exceeds their cost) for 6-8 months. Additionally, entry-level hires take time away from senior folks, reducing their productivity. And if you need to cut payroll costs, it's far more complicated, and worse for morale, than just cutting AI spend.
So given the above, plus an economy that seems pre-recession (or already in one, according to some leading indicators), it seems best to wait or hire very cautiously for the next 6-8 months at least.
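The break-even reasoning above can be sketched with a toy model. All numbers here are invented for illustration (monthly cost, ramp-up length, output value, and mentoring drag are not from the comment):

```python
# Hypothetical break-even model for an entry-level hire.
# Every figure below is made up purely for illustration.
monthly_cost = 10_000    # salary + overhead per month
mentoring_drag = 1_000   # senior productivity lost to mentoring during ramp-up
full_output = 15_000     # value produced per month once fully ramped
ramp_months = 6          # linear ramp from zero to full output

cumulative = 0
for month in range(1, 37):
    produced = min(month / ramp_months, 1.0) * full_output
    drag = mentoring_drag if month <= ramp_months else 0
    cumulative += produced - monthly_cost - drag
    if cumulative >= 0:
        print(f"Hire pays back their cumulative cost around month {month}")
        break
```

With these particular numbers the hire digs a hole for roughly half a year and only repays the cumulative investment months later, which is the shape of the argument: the payoff horizon for a junior is long compared with simply turning off an AI subscription.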
I can imagine that there were a decent number of execs who tried ChatGPT, made some outlandish predictions, and based some hiring decisions on those predictions, though.
This paper looks kinda trashy - confusing correlation with causation and clickbaity.
- Generative AI is genuinely capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is convincing people who make hiring decisions that it is capable enough to replace entry level workers at a lower cost.
- The hype around Generative AI is being used as an excuse to not hire during an economic downturn.
Could even still be other things too.
For the tech side, we've reduced behavioral questions and created an interview that allows people to use cursor, LLMs, etc. in the interview - that way, it's impossible to cheat.
We have folks build a feature on a fake code base. Unfortunately, more junior folks now seem to struggle a lot more with this problem.
Companies must do this, 'cause if they don't, their competition will (i.e. the pressure)
of course, we can collectively decide to equally value labor and profit, as a symbiotic relationship that incentivizes long term prosperity. but where's the memes in that
To really test the implied theory that using AI enables cutting junior hiring, we need to see it in a better economy, in otherwise growing companies, or with some kind of control (though not sure how this would really be possible).
I'm not disputing your point, but I'm curious: given that the main headline measures we tend to see about the US economy right now involve the labour market, how do you establish the counterfactual?
This is cause for government intervention.
1. Those that encourage people to use AI agents aggressively to increase productivity instead of hiring young people.
2. Those that encourage people to use AI agents aggressively to be more productive while still hiring young people.
Which type of company will be more innovative, productive, and successful in the long run?

Many of the largest countries are experiencing similar declines, with fewer and fewer countries maintaining large birth rates.
That world was 30 years ago. In 2025 world average total fertility rate is 2.2, which is a shade above replacement rate (2.1). And 2.2 is a 10% drop since 2017 alone (when it was 2.46).
Because life expectancy is higher, the population will continue to increase. But not "rapidly".
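The cited drop checks out arithmetically, using only the TFR figures quoted in the comment:

```python
# Total fertility rate figures as quoted in the comment above
tfr_2017 = 2.46
tfr_2025 = 2.2
replacement = 2.1

drop = (tfr_2017 - tfr_2025) / tfr_2017
print(f"Drop since 2017: {drop:.1%}")                 # roughly a 10% decline
print(f"Still above replacement: {tfr_2025 > replacement}")
```

So 2.2 is indeed both "a shade above" the 2.1 replacement rate and about a tenth below the 2017 figure.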
Young people are cheap and they love AI!
Is this a case of "correlation does not imply causation?"
And the entire time I'm watching this I'm just thinking that they don't realize that they are only demonstrating the tools that are going to replace their own jobs. Kinda sad, really. Demand for soft skills and creatives is going to continue to decline.
Dev jobs too.
In the late 90s you were considered a prodigy if you understood how to use a search engine. I had so many opportunities simply because I could find and retain information.
So LLMs have solved this. Knowing a framework or being able to create apps is not a marketable skill any longer. What are we supposed to do now?
It’s the soft skills that matter now. Being well liked has always been more important in a job than being the best at it. We all know that engineer who knows they are hot shit but everyone avoids because they are insufferable.
Those marketing people don’t need to spend a week on their deck any longer. They can work the customer relationship now.
Knowing how to iterate with an LLM to give the customer exactly what they need is the valuable skill now.
I personally think we're still a ways from the latter...
Entry-level jobs get "hollowed out" in a stagnant economy regardless of "AI".
AI = not hiring because there's no new work, but spun as "AI". Markets are hungry for any utterance of the word AI from the CEO.
So ridiculous. But we've collectively decided to ignore BS as long as we can scam each other and pray we're not the last one holding the bag.
You have to somehow have the discipline to avoid getting caught up in the noise until the hype starts to fade away.
Another way to look at it is that hiring is fine, and that the vain entitled generation we all suspected was going to emerge feels that a job should absolutely be available to them, and immediately.
Another way to look at it is that journalism has been dead for quite a while, and writing about the same fear-based topics like “omg hiring apocalypse” is what makes these people predictable money (along with other topics).
Another way to look at it is that we raised a generation of narcissistic parents and children that have been going “omg grades”, “omg good college”, “omg internship”, “omg job” for so long that these lamentations feel normalized. A healthy dose of stfu was never given to them. Neurotic motherfuckers.
Until AI can do literally everything we can, that class of work will continue to exist, and it'll continue to be handed to the least experienced workers as a way for them to learn, get oriented, and earn access to more interesting problems and/or higher pay while experienced folks rest on their laurels or push the state of the art.
They'd rather pay people to sit in a room pressing a button every hour than have them loitering around on UBI
either that or in the pod
Why?
It's already doing a lot of the load-bearing work in those mid-level roles too now; it's just a bit awkward for management to admit it. One common current mode of work is people using AI to accomplish their work tasks very quickly, and then loafing a bit more with the extra time. So leaders refrain from hiring, pocket the savings, and keep a tight lid on compensation for those who remain.
At some point they'll probably try to squeeze the workforce for some additional productivity, and cut those who don't deliver it. Note that the "ease" of using AI for work tasks will be a rationale for why additional compensation is not warranted for those who remain.
I'm tired of reading all these claims with no primary evidence to support them.
Hard for me to believe that AI in its current state is hollowing out junior shop assistant and salesperson roles. Either those jobs were already vulnerable to "dumb" touchscreen kiosks or they require emotional intelligence and embodied presence that LLMs lack.
The US is going through a lot of upheaval, which, whether you think it is positive or negative, is unique, and a confounding factor for any such research.
"Our primary data source is a detailed LinkedIn-based resume dataset provided by Revelio Labs ...
We complement the worker resume data with Revelio’s database of job postings, which tracks recruitment activity by the firms since 2021 ...
The final sample consists of 284,974 U.S. firms that were successfully matched to both employee position data and job postings and that were actively hiring between January 2021 and March 2025. For these firms, we observe 156,765,776 positions dating back to 2015 and 245,838,118 job postings since 2021, of which 198,773,384 successfully matched with their raw text description."
They identified 245 million job postings from 2021 forward in the United States? I mean, the U.S. population is like 222 million for the 18-65 age group (based on Wikipedia: 64.9% of a 342 million total population).
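A quick back-of-the-envelope check of the scale, using the figures quoted from the paper and the population split above:

```python
# Figures quoted from the paper's sample description
postings_since_2021 = 245_838_118
positions_since_2015 = 156_765_776

# Rough US working-age (18-65) population: 64.9% of ~342M, per the comment
working_age = 342_000_000 * 0.649

print(f"Working-age population: ~{working_age / 1e6:.0f}M")
print(f"Postings per working-age person, 2021-2025: {postings_since_2021 / working_age:.2f}")
print(f"Positions per working-age person since 2015: {positions_since_2015 / working_age:.2f}")
```

That works out to roughly one posting per working-age person spread over about four years, which is less paradoxical than a direct head-to-head population comparison suggests, though duplicate re-listings by recruiting firms would still inflate it.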
And they find a very small percentage of firms using generative AI:
"Our approach allows us to capture firms that have actively begun integrating generative AI into their operations. By this measure, 10,599 firms, about 3.7 percent of our sample, adopted generative AI during the study period."
Maybe I am wildly underestimating just how much LinkedIn is used worldwide for recruiting? As a tech person, I'm also very used to seeing the same job listing re-listed by what seems to be a large number of low-effort "recruiting" firms on LinkedIn.
I think for trying to figure out how generative AI is affecting entry-level jobs, I would have been much more interested in some case studies. Something like: find three to five companies (larger than startups? 100+ employees? 500+?) that have decided to hire fewer entry-level employees by adding generative AI into their work as a matter of policy. Then maybe circle back from the case studies to this larger LinkedIn dataset and tie the case-study information into the LinkedIn data somehow.
Of course in the long run a chronically underemployed economy will have little demand for products and services, but that is beyond the scope of companies who, in general, are focused on winning short term and zero-sum market capture. However I believe that while a billion dollar valuation is a market and strategy problem, a trillion dollar valuation is a political problem - and I would hope that a mandate of broad gainful employment translates to political action - although this remains to be seen.
thw_9a83c•1h ago
No, I was being sarcastic.
softwaredoug•1h ago
Currently, part of the problem is the taboo around using AI coding in undergrad CS programs. And I don't know the answer. But someone will find the right way to teach new/better ways of working with and without generative AI. It may just become second nature to everyone.
neutronicus•1h ago
That's not true anymore in the smart phone / tablet era.
5-10 years ago my wife had a gig working with college kids and back then they were already unable to forward e-mails and didn't really understand the concept of "files" on a computer. They just sent screenshots and sometimes just lost (like, almost literally) some document they had been working on because they couldn't figure out how to open it back up. I can't imagine it has improved.
haijo2•56m ago
Some people don't want to hear that, but...
boredtofears•55m ago
I just want devs who actually read my PR comments instead of feeding them straight into an LLM and resubmitting the PR.
Ragnarork•1h ago
But if they're not hired...?
lock1•28m ago
Pretty sure it's a self-destructive move for a CS or software engineering student to pass foundational courses like discrete math, intro to programming, and algorithms & data structures using an LLM. You can't learn how to write if all you do is read. The LLM will one-shot the homework, and the student just passively reads the code.
On more difficult and open-ended coursework, LLMs seem to work pretty well at assisting students. For example, in the OS course I teach, I usually give students a semester-long project on writing an x86 32-bit kernel from scratch with simple preemptive multitasking. LLMs definitely make difficult things much more approachable; students can ask an LLM "dumb basic questions" without fear of judgement.
But due to the novelty and open-ended nature of the requirements ("toy" file system, no DMA, etc.), playing a slot machine with an LLM just won't cut it. Students need to actually understand what they're trying to achieve, and at that point they can just write the code themselves.
pdntspa•19m ago
Kind of like that meme, or how two AIs talking to each other spontaneously develop their own coding for communication. The human trappings become extra baggage.
rs999gti•1h ago
From company interns. Internships won't go away; there will just be fewer of them. For example, some companies will turn down interns because they do not have the time to train them due to project load.
With AI, now employed developers can be picky on whether or not to take on interns.
gruez•6m ago
It's pretty hard for a non-big tech company to pay big tech level salaries.
zapnuk•34m ago
Cheap labor. It doesn't take that much to train someone to be somewhat useful, in many cases. The main educators are universities and trade schools, not companies.
And if they want more loyalty, they can always provide more incentives for juniors to stay longer.
At least in my bubble, it's astonishing how it's almost never worth it to stay at a company. You'd likely get overlooked for promotions, and salary raises are almost insultingly low.