inb4 "America is the world"
How confident do you feel about our economic policy? Even if your company isn't directly involved in international trade, there's a good chance your customers are. My customers are putting off non-essential software.
I remember back in 2017, looking at yet another blockchain company that had raised huge sums of money to develop the next dubious blockchain of no value while throwing piles of money at large teams of PhDs, and thinking that the world needed a recession to stop the lunacy. It happened.
So we are currently in 3 AD, where the last remaining entry level jobs are being hunted and made extinct.
Tech may just be dying in general.
This could finally be the collapse we have been waiting for.
While the US was allowing capitalists to hoard all of the US's excess capital, China was facilitating the development of important industries via central planning.
Look at Intel. The owning class has become so greedy and incompetent. They aren't even running functional companies anymore: just grifting for government money then using that for stock buybacks.
The only reason we imagine that there is a distinction is because most voters are too lazy to talk to the person they hired, so those who do stand out.
One is a large corporation with money paying for access and favors; the other is a regular person who votes. Absolutely absurd.
It remains that the system of government in the US expects regular lobbying from all voters. Without other qualifiers, lobbyists and voters are the same thing.
However I agree that democracies around the globe mostly work, in the sense that voters get what they are voting for.
year-on-year growth remained positive at +0.49%, marking the third consecutive month of annual improvement and suggesting a slow but steady recovery from the recent market slump.
These are all multi-national corps.
That form of capitalism is dead. There is no free market anymore (if there ever was). We currently have crony capitalism that operates on being able to secure cheap capital through "other" means.
There is no more efficiency in the current markets. Everything is massively overvalued. Financial engineering is the name of the game. Major banks gamble with our money, and when they collapse they get bailed out. That is not market efficiency.
> The train keeps going.
The only train that keeps going from here on out is larger companies swallowing smaller companies, and the top siphoning wealth from the bottom.
Real capitalism would have had several market corrections in the last 15 years, but the USA just keeps on printing money.
> Tech may just be dying in general.
It's not just tech, it's any company that doesn't have infinite capital to integrate and monopolize every bit of the supply chain.
The thing is, this feature leaned on every bit of experience and wisdom we had as a team --things like making sure the model is right, making sure the system makes sense overall and all the pieces fit together properly.
I don't know that "4x" is how it works --in this case, the AI let us really tap into the experience and skill we already had. It made us faster, but if we were missing the experience and wisdom part, we'd just be more prolific at creating messes.
Usually, companies benefit more from slowing down and prioritizing, not ‘going faster’.
It will still be a win: the rewards for the new productivity have to go somewhere in the economy.
Just like other productivity improvements in the past, it will likely be shared amongst various stakeholders depending on a variety of factors. The workers will get the lion's share.
Unless it goes to me I'm not entirely sure why I should care
I'll keep doing things the old way thanks, unless I personally get some benefit from it
"You are more productive but compensated the same" just shows how many of you are suckers
This doesn’t seem to be supported by history.
Statistics like https://media.equality-trust.out.re/uploads/2024/07/incomedi... suggest that since 1970 the top 10% have gained far more in income than the bottom 50%.
Weird
I find AI helpful but nowhere near a multiplier in my day-to-day development experience. Converting a CSV to JSON or vice versa, great, but AI writing code for me has been less helpful. Beyond boilerplate, it introduces subtle bugs that are a pain in the ass to deal with. For complicated things, it struggles and does too much, and because I didn't write it I don't know where the bad spots are. And AI code review often gets hung up on nits and misses real mistakes.
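(For context, the CSV-to-JSON chore mentioned above is the kind of thing that is only a few lines of standard-library Python anyway; the file names here are just placeholders.)

    import csv, json

    with open("input.csv", newline="") as f:
        rows = list(csv.DictReader(f))   # one dict per row, keyed by the header line

    with open("output.json", "w") as f:
        json.dump(rows, f, indent=2)     # emit the rows as a JSON array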
So what are you doing and what are the resources you'd recommend?
Makes it seem like the actual problem to be solved is reducing the amount of boilerplate code that needs to be written, not using an LLM to do it.
I'm not smart enough to write a language or modify one, so this opinion is strongly spoken, weakly held.
To give a concrete example: I'm pretty good at doing Python coding on a whiteboard, because that's what I practiced for job interviews, and when I first learned Python I used Vim without setting up any Python integration.
I'm pretty terrible at doing Rust on a whiteboard, because I only learned it when I had a decent IDE and later even AI support.
Nevertheless, I don't think I'm a better programmer in Python.
Basically, I treat LLMs like a fairly competent unpaid intern and extend about the same level of trust to the output they produce.
What you will find is that the agent is much more successful in this regard.
The LLM has certain intrinsic limitations that mirror ours: like us, it cannot write 10,000 lines of code and have everything working in one go. It does better when you develop incrementally and verify each increment; the smaller the increments, the better it performs.
Unfortunately the chain-of-thought process doesn't really do this. It can come up with steps, but sometimes the steps are too big, and it almost never properly verifies that things are working after each increment. That's why you have to put yourself in the loop here.
Allowing the computer to run tests, verify that the application works as expected at each step, and even come up with what verification means is part of what's missing here. Although this part isn't automated yet, I think it can easily be automated, with humans becoming less and less involved and moving into a more and more supervisory role.
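As a rough sketch of what that increment-and-verify loop could look like: the step list, the run_agent callback, and the pytest invocation below are placeholders for whatever planner, agent, and test runner you actually use, not any particular vendor's API.

    import subprocess

    def tests_pass() -> bool:
        # Run the project's test suite; success means the increment is verified.
        return subprocess.run(["pytest", "-q"]).returncode == 0

    def implement_incrementally(steps, run_agent, max_retries=3):
        # Feed the agent one small step at a time and only advance on green tests.
        for step in steps:
            for _ in range(max_retries):
                run_agent(f"Implement only this step, changing as little as possible: {step}")
                if tests_pass():
                    break  # increment verified, move on to the next step
                run_agent("The test suite is failing; fix the failures before moving on.")
            else:
                raise RuntimeError(f"Could not get a passing build for step: {step}")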
The first thing I'll note is that Claude Code with Claude 4 has been vastly better than everything else for me. Before that it was more like a 5-10% increase in productivity.
My workflow with Claude Code is very plain. I give it a relatively short prompt and ask it to create a plan. I iterate on the plan several times. I ask it to give me a more detailed plan. I iterate on that several times, then have Claude write it down and /clear to reset context.
Then, I'll usually do one or more "prototype" runs where I implement a solution with relatively little attention to code quality, to iron out any remaining uncertainties. Then I throw away that code, start a new branch, and implement it again while babysitting closely to make sure the code is good.
The major difference here is that I'm able to test out 5-10 designs in the time I would normally try 1 or 2. So I end up exploring a lot more, and committing better solutions.
Sometimes I'll even go a bit crazy on this planning thing and do things a bit similar to what this guy shows: https://www.youtube.com/watch?v=XY4sFxLmMvw I tend to steer the process more myself, but typing whatever vague ideas are in my mind and ending up in minutes with a milestone and ticket list is very enabling, even if it isn't perfect.
I also do more "drive by" small improvements:
- Annoying things that weren't important enough for a side quest writing a shell script, now have a shell script or an ansible playbook.
- That ugly CSS in an internal tool untouched for 5 years? fixed in 1 minute.
- The small prototype put into production with 0 documentation years ago? I ask an agentic tool to provide a basic readme and then edit it a bit so it doesn't lie, well worth 15 minutes.
I also give it a first shot at finding the cause of bugs/problems. Most of the time it doesn't work, but in the last week it found right away the cause of some long standing subtle problems we had in a couple places.
I have also sometimes had luck providing it with single functions or modules that work but need some improvement (make this more DRY, improve error handling, log this or that...). Here I'm very conservative with the results because, as you said, it can be dangerous.
So am I more productive? I guess so, I don't think 4x or even 2x, I don't think projects are getting done much faster overall, but stuff that wouldn't have been done otherwise is being done.
What usually falls flat is trying to go on a more "vibe-coding" route. I have tried to come up with a couple small internal tools and things like that, and after promising starts, the agents just can't deal with the complexity without needing so much help that I'd just go faster by myself.
(https://www.oecd.org/en/data/indicators/hours-worked.html Mexico 2226 hours/year, US 1804, Denmark 1394)
Sounds like AI has landed you on burnout treadmill
The absolute best outcome of LLMs, and frankly where it seems to be headed, is the death of bloated one-stop-shop-for-everyone software. Instead people will be able to use their computers more directly than ever, without having to use/figure out complicated unintuitive swiss army knife software to solve their problems.
LLMs today can already make people the exact tools they need, with no extra feature bloat or useless expansive packages. A print shop that just resizes photos and does some minor adjustments not available in free-tier software is no longer a slave to paying Adobe $40/mo to use <1% of Photoshop's capabilities. They can now have their own tailor-made in-house program for free.
LLMs will not be slotted in to replace devs on Adobe's dev teams; they can't work on a Photoshop-sized codebase. However, they will likely cut demand for Photoshop. Very few people will mourn the death of having to pay monthly for software just because there is a language barrier between them and their computer.
I think different generations of programmers have different opinions on what is quality output. Which makes judging the quality of code very context dependent.
If I were to guess I probably get somewhere in the range 10% to 20% productivity boost from LLMs. I think those are pretty astonishing numbers. The last time I got this kind of boost was when we got web search engines and sites like stack exchange.
I would suspect that if people experience 100% or more productivity boost from LLMs, something is off. Either we have very different ideas about quality, or we are talking about people who were not very productive to begin with.
I also think that LLMs are probably more useful if you are already a senior developer. You will have a better idea of what to ask for. You will also be in a better position to guide the LLM towards good answers.
...which kind of hints at my biggest worry: I think the gen-z programmers are facing a tough future. They'll have a harder time finding jobs with good mentors. They're faced with unrealistic expectations in terms of productivity. And they have to deal with the unrealistic expectations from "muggles" who understand neither AI nor programming. They will lack the knowledge to get the most from LLMs while having to deal with the expectation that they perform at senior levels.
We already see this in the job market. There has been a slight contraction and there are still a significant portion of senior developers available. Of course employers will prefer more experienced developers. And if younger developers believe in the hype that they can just vibe-code their way to success, this is just going to get worse.
Now, this solution might not even use an LLM; it existed pre-ChatGPT. But I think the word of mouth around ChatGPT and AI is causing business people to seek out automations where they would normally hire.
Always the first questions I ask.
Most of the purpose of hiring someone is handling edge cases, checking for fraud, etc. One client of mine made a single AP (accounts payable) mistake (accepting a change to where to send payments) that cost them the equivalent of an AP clerk's salary for a year.
They now have a part time AP clerk and part of her duties is calling any vendor who sends them a change of payment instructions. They’re fraudulent about half the time.
I approve payments as part of my role and the amount of stuff we get that looks good at first glance but has issues when you dig deeper is astonishing. Does that remind you of any technology?
Hope they do manage to automate it though. It's tedious work.
I really appreciate your help, and look forward to getting this solved today.
But this is a software company. I think out in the “real world,” there are some low hanging fruit wins where AI replaces extremely routine boilerplate jobs that never required a lot of human intelligence in the first place. But even then, I’d say that the general drift is that the humans who were doing those low-level jobs have a chance to step up into jobs requiring higher-level intelligence where humans have a chance to really shine. And companies are competing not by just getting rid of salaries, but by providing much better service by being able to afford to have more higher-tier people on the payroll. And by higher-tier, I don’t necessarily mean more expensive. It can be the same people that were doing the low-level jobs; they just now can spend their human-level intelligence doing more interesting and challenging work.
If I suddenly have to think really hard at my job all day and do terribly if I’m undersea and still get paid the same or less, I will be left pretty bitter.
(Not a serious suggestion, but I do see this in the wild a lot)
Maybe I'm the one who is ultimately a sucker, because I take too much pride in my work to do this.
I always thought that the quality of my work and my effort would be tied to my reputation, but I don't think the world works that way unless you are very well known somehow.
An analogy: any idiot can take a calculus class today, but it took Leibniz and Newton to come up with it in the first place. (And even those geniuses didn't do it properly: it took until the likes of Karl Weierstrass and friends to put analysis on a firm footing.)
Companies are doing a shit job of playing sabermetrics, hoping to luck into some team that takes them all the way.
To what we don’t know since the company only exists to satisfy social wank. Jobs are just distractions from the kind of political problems the reduction in jobs and pay is creating.
These things don’t exist for any known immutable physics, but as a human distraction from war. And here we are simulating the same outcome; oh well this group of layoffs did not survive their invasion of Normandy.
Losses by the commoner are to be expected in war! I mean business!
What a shock in a system bootstrapped by military industrial complex zeitgeist of the post world war era.
Which means you should need fewer of them, no?
> It can be the same people that were doing the low-level jobs; they just now can spend their human-level intelligence doing more interesting and challenging work.
Why were you using capable humans on lower level work in the first place? Wouldn't you use cheaper and less skilled workers (entry level) for that work?
I've never worked at a company that didn't have an endless backlog of work that needs to be done. In theory, AI should enable devs to churn through that work slightly faster, but at the same time, AI will also allow PMs/work creators to create even more work to do.
I don't think AI fundamentally changes companies' hiring strategies for knowledge workers. If a company wants to cheap out and do the same amount of work with fewer workers, then they're leaving space for their competitors to come and edge them out.
That was the narrative last year (ie. that low performers have the most to gain from AI, and therefore AI would reduce inequality), but new evidence seems to be pointing in the opposite direction: https://archive.is/tBcXE
>More recent findings have cast doubt on this vision, however. They instead suggest a future in which high-flyers fly still higher—and the rest are left behind. In complex tasks such as research and management, new evidence indicates that high performers are best positioned to work with AI (see table). Evaluating the output of models requires expertise and good judgment. Rather than narrowing disparities, AI is likely to widen workforce divides, much like past technological revolutions.
When I'm pulling out all the stops, leaving nothing for the swim back, the really powerful (and expensive!) agents are like any of the other all-out measures: cut all distractions, 7 days a week, medicate the ADHD, manage the environment ruthlessly, attempt something slightly past my abilities every day. In that zone the truly massive frontier behemoths are that last 5-20% that makes things at the margin possible.
But in any other zone its way too easy to get into "hi agent plz do my job today I'm not up for it" mode, which is just asking to have some paper-mache, plausible if you squint, net liability thing pop out and kind of slide above the "no fucking way" bar with a half life until collapse of a week or maybe month.
These are power user tools for monomaniacal overachievers and Graeberism detectors for everyone else (in the "who am I today" sense, not bucketing people forever sense).
(The article is about the UK job market. What tax rules have recently changed for engineers there?)
I agree that the former is a strong signal. However the latter doesn't tell you anything without further context: did interest rates go up, because the economy was strong, or did rising interest rates dampen the economy?
(It's similar to how you can't tell how hot it is in my apartment purely from looking at my heating bills: does a low heating bill mean that it's cold in my flat, because I'm too cheap to heat? Or does a low heating bill mean it's summer and really hot anyway?)
It doesn't matter. Whether it went from strong -> weak or weak -> weaker is beside the point; the question is whether genAI is the main reason for entry-level job loss, and rising interest rates are another possible answer.
Growth was weak to unremarkable although the hiring market was good for job seekers at the time shortly before the interest rises were introduced.
But the original comment I first replied to seemed to suggest that high interest rates should lead us to deduce a weak economy.
Stupendous loads of money have been allocated to a solution looking for a problem to solve.
https://www.gartner.com/en/newsroom/press-releases/2025-06-2...
Yeah it can correlate with the end of a post pandemic hiring boom, and it can correlate with the bank rate. But no matter what it also correlates with the rise of AI.
All are true and causation cannot be established for any of the 3 through just an observational study.
Given that AI tools are only really used for white-collar work, yet white-collar professions have not been declining faster than entry-level jobs in hospitality, vocational jobs, nursing or transportation (all of which are down), this gives you a pretty decent natural control group.
The whole debate about bifurcation of the labour market, that entry level coders are having a harder time than they used to, precedes even the pandemic or recent economic woes.
We don’t know if 2 causal events will stack. It may be that one causal event does so much damage that the second causal event can’t do much.
It’s like firing a bullet and throwing a stone at a window. Whichever came first will mask the causative nature of the second. The window can't break twice.
But specifically entry level is down significantly since Nov 2022.
All of your points - interest rates, post pandemic hiring boom would apply to market as a whole.
Not saying it’s causation like the article claims, but there’s at least some correlation trend.
These categories have seen broad application of AI tools:
- CS: you’ll most likely talk to an LLM for first-tier support these days.
- Account management comprises pressing the flesh (human required) and responding to emails; on the latter, AMs have seen their workload slashed, so it stands to reason that fewer are required.
- Paralegal - the category has been demolished. Drafting and discovery are now largely automated processes.
- Data analysis - why have a monkey in a suit write you barely useful nonsense when a machine can do the same?
So - yeah, it’s purely correlative right now, but I can see how it being causative is perfectly plausible.
If I am running a factory that used to create carriages and now creates cars, I need people who can create cars now. If I want to expand the number of customers I serve, I need to hire more people.
If I am a software company, I don’t need to scale the number of software engineers I hire to serve more customers.
Since gen AI has been a thing, I have mostly pivoted to more strategy-based cloud consulting than hands-on-keyboard software development. But before gen AI, I would have needed a couple of junior developers to do the grunt work of well-defined implementations. Now I can do both the strategy and the implementation in the same amount of time.
Even before gen AI, the entire reason that software engineers get paid so much is that software development has high fixed costs but near-zero marginal costs. No other industry has historically been like that.
I have never once said “it sure would be nice to have a few more junior devs. That would really increase our velocity”.
As someone who is responsible for getting projects done on time, within budget, and meeting requirements, why would I push for hiring fresh entry-level devs instead of hiring a mid-level dev with experience for only 20-30% more? The spread isn’t that great for enterprise developers.
It’s even more true now that I can push for hiring a mid-level dev working remotely in East BumbleFuck, South Dakota for peanuts.
For what it’s worth, I am classifying seniority by the ability to work at a certain “scope” and “deal with ambiguity”, not someone who “codez real gud” and can reverse a b tree on the whiteboard.
https://www.levels.fyi/blog/swe-level-framework.html
And there is a diminishing return on new features. If Google fired every developer not involved in search and ads, they could survive another decade or so and probably end up being more profitable since they can’t produce new good profitable products to save their lives
If you've ever tried to use AI to help with this kind of analysis, you might find this to be more inevitable than it is funny.
It's really, really, really good at confidently jumping to hasty conclusions and confirmation bias. Which perhaps shouldn't be surprising when you consider that it was largely trained on the Internet's proverbial global comments section.
Kind of like entry level software engineers.
I am kidding, I believe the market has more to do with tax changes than AI. I just couldn't pass up the joke.
> More broadly, entry-level roles (including apprenticeships, internships and junior jobs) have declined by 32% since November 2022, when ChatGPT’s commercial breakthrough triggered a rapid transformation in how companies operate and hire.
> Entry-level roles now make up just 25% of all jobs advertised in the UK, down from nearly 29% two years ago.
That's such a poor presentation of the numbers. If only they could have included a small data table with something like date|total-jobs|entry-level-jobs|percentage-entry-level.
Hopefully this transition benefits everyone. I just don't see how those with zero capital are going to survive well. Most of the US economy (sorry to be US-centric but I am American) is people performing services and information based work (or at least _tons_ of it is). This is the portion of the economy that is going to be the most and first affected by AI.
If companies can’t hire people to build the product they can’t afford to invest in entry level people to push it.
From my viewpoint, companies are in a soft hiring freeze so that they can maintain a cash cushion to deal with volatility.
1. Business stakeholders are motivated to find ways to use AI to achieve whatever they want, because business requests are always ASAP and they don't want to wait for downstream software engineers to do the job. This has always been the case since the start of the computing business, I believe. (Maybe not during the mainframe/supercomputing era, as the "business" was sometimes the engineers themselves, so it was easier to communicate.)
2. Many software engineering jobs are not that technically challenging, for example frontend, data engineering, etc. A lot of time is spent on requirement clarification and collection. Business might eventually figure out that the best way to use AI is NOT to make AI adapt to humans, but to make humans adapt to AI. I'm pretty confident that if we had an integrated AI and the business stakeholders could ask questions in a way that the AI can understand, 80% of my job (as a DE) could be done by AI. But it requires feeding data into the AI and training the business stakeholders to use it properly. To be fair, a lot of my work COULD be solved by automation, but we never had the time to properly automate the data pipelines.
3. Whatever the outcome, junior software developers are having, and are going to have, a tough time. Unless they work in some low-level system programming positions, they have to prove that they are MUCH better than AI to justify a hire. Nowadays companies expect senior developers to use AI to enhance their productivity, so juniors need to learn that too to catch up -- but they also need to learn to program without AI to actually obtain the knowledge properly.
4. AI is going to impact every field, not just programming. Office work is the first to get impacted, and then blue-collar work too. Unions and governments are going to hold out for a while, maybe a long while, but as long as there is one major player (China, for example) pushing for AI advancement, the others HAVE to follow suit, or they run the risk of losing everything. The Russo-Ukrainian war and the Israel-Iran war are the bells that toll for all of us, and the existential crisis brought upon us by the increasing number of hot wars is going to make everyone push for productivity, whether in production or killing.
5. If we can't figure out how to make the world a bit more fair and happier for the average people (you and me) because AI really takes off, a dark dystopian future awaits us. Good luck. We might have some 10-20 years to achieve that.
They fired lots of people, and the ones left were happy to do the extra work.
It will be blocked by the big players such as Indeed and LinkedIn, and possibly also blocked by direct corporate websites. So I wouldn't take any notice of it.
If the average salary and employment rate both drop then that would be a sign.
https://www.theguardian.com/uk-news/2024/oct/30/bosses-reeve...
But then it gives details:
> By contrast, the healthcare and nursing sector – previously a consistent driver of job growth – saw vacancies fall by 10.21% in May. Other sectors with notable drops included admin (-9.22%), maintenance (-7.95%), and domestic help and cleaning (-5.72%).
Am I missing something here? Apart from the "admin" part, I fail to see how the reduction in "healthcare, nursing, maintenance, domestic help and cleaning" can be attributed to ChatGPT, or LLMs in general.
Or am I misunderstanding the claim?
I can completely imagine why people would fire their house cleaner, care for their elderly themselves, postpone repairing that fridge, etc... But it would purely be because "we can't afford it". So revenue inequality, energy shortage, world insecurity, etc. could play a part. LLMs? I don't see how.
(Or would it be a ripple effect of growing inequality? The AI boost is increasing inequality, and now tech bros are the only ones who can afford going to the restaurant or hiring a cleaner - but they only eat 3 times a day, and only have x rooms to clean?)
Quite frankly, "in the last 2 years since ChatGPT came out, we had a dozen trade wars and three real ones, no one wants to invest any more" looks like a more plausible explanation.
Google launches largest office in India https://www.entrepreneur.com/en-in/news-and-trends/google-la...
Microsoft India head says no layoffs in India https://timesofindia.indiatimes.com/technology/tech-news/mic...
Wages are kept suppressed, keeping people (citizens and not) desperate.
The people that have (e.g.) immigrated into America, but not naturalized yet, are in extremely perilous positions, beholden to a corporate entity which would rather employ them (for wayyyyy less salary) than cater to free-er citizens.
¢¢
So, the Indian CEOs of Google and Microsoft perform their duty and turn the companies into boring has-been companies like IBM.
My mid-sized US city (Chattanooga) has an MSA of <500k people, yet employs approximately 1,762 H-1B visa holders (primarily as software engineers and data analysts, median salary $85k) [0]. Apparently nobody local is able/willing to perform these jobs?!
And yet the complaint/advice I hear most from local techies is to "WFH at a national company if you want to actually make any money, here. Or move elsewhere." Or some other iteration of "there aren't enough IT jobs here."
I'm a blue collar tradesman, so WFH isn't really practical; but I'd definitely have to move elsewhere if I were in tech and didn't want to WFH.
[0] https://h1bdata.info/index.php?em=&job=&city=chattanooga&yea...
We get access to truly exceptional people for whom companies are willing to pay exceptional wages, and we eliminate the exploitation of H1B visa workers.
Definitely like the idea of removing the lottery system — great suggestion.
That was inevitable the moment remote work caught on. Software engineers in rich countries were stupidly short-sighted to cheer on the remote work. If your work can be done from anywhere in the US, it can be done from anywhere in the world.
If you think timezones or knowledge of English will save you, Canada has much lower wages for SWEs and central/south America has enough SWEs with good English skills. They are also paid one third or one fourth of what SFBA jobs used to pay. No wonder all the new headcount I have seen since 2022 is abroad.
Remote work, high interest rates, and (the excuse of) AI coding agents have been the perfect storm that has screwed junior SWEs in the US.
Let's also not forget, India is a massive and growing market in its own right. Literally the 4th-largest economy in the world, soon to be 3rd. It's like China at the turn of the century.
But it is not a great place for these firms because disposable income is so low. For example, the US generates the most revenue per user for Google because it has a really high income. India is unlikely to make up any significant part of tech firms' revenue for a long time.
And it's been 25 years since the turn of the century. So I was approximately correct.
> There are lots of structural problems about the economy, and growth will likely be uneven and slower in the future.
It could be better than expected, or it could be worse. Predictions are hard, especially about the future.
> But it is not a great place for these firms because disposable income is so low
That matters for advertising revenue. A $30m contract for cloud services is still a $30m contract. And in any case, software is famously high-margin. Low revenue per user often isn't a problem if you have a lot of users. They're still profitable, just not wildly profitable.
> the US generates the most revenue per user for Google because it has a really high income
It's also saturated. The market demands growth.
Even though India will be relatively poor per-capita, in absolute terms it will be a bigger economy than Japan or Germany in the next 5 years (it's already bigger than one of them). Would any serious multinational company ignore the Japanese or German markets or deem them irrelevant to business?
As well as unemployment and people struggling with the cost of living, you have a dire shortage of labour in so many industries - healthcare, transportation, retail, hospitality, childcare and teaching, and surely others. That seems to suggest that at the same time you have both people who can't find jobs and companies who can't find employees.
Entry-level postings being down could mean that squeeze is getting tighter still, or it could mean people are giving up on hiring for these unfillable posts. Or something else entirely.
Either way it's a basket case.
A: ChatGPT exists
B: Entry-level jobs are down
Apply some propositional logic or something: A&B => A->B
Therefore ChatGPT took the entry-level jobs.
As others have pointed out, we're off the back of COVID and money is tight. Most companies are looking to shrink team sizes, hence the lack of hiring. AI tools have almost zero relevance, and the rest of the article does not substantiate the claim at all.
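(Spelling out the syllogism being mocked above: co-occurrence gives you the material implication for free, which is exactly why it says nothing about causation.)

    \[
      A \land B \;\vdash\; A \to B
      \qquad\text{yet}\qquad
      (A \to B) \;\not\Rightarrow\; \text{``$A$ causes $B$''}
    \]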
> Despite this, year-on-year growth remained positive at +0.49%, marking the third consecutive month of annual improvement and suggesting a slow but steady recovery from the recent market slump.
There are things that can be done to screw with numbers, but it only works for a short time. The UK for example only counts unemployment as those actively looking for work, but it is well known that there is a growing number of permanently unemployed which is hidden. The number of people applying for PIP (a type of disability allowance typically awarded to those unable to work) is growing by 1k a day [0]. In perspective, the population of the UK grows by ~1.6k a day and is projected to fall.
The growth figure is likely based on the ONS [1], which is constantly having to revise its figures for data that was already available at the time of publication [2] - they do this a lot. One hack by this government has been growing the public sector at the cost of the private sector, which is unsustainable. A company I know halved its employees but still pays the same amount of national insurance, so the economic value per employee is reduced due to increased overheads.
This is not unique to the UK, though. I speak to many Europeans who are going through a similar situation, and I hear of similar stories throughout the West. The EU, for example, has a growing issue with ECB bond management, where it plays games to try to bring down borrowing costs. The Japanese market, a large purchaser of foreign debt, is starting to get concerned [3], which signals a potential crisis for the US's ability to borrow at low rates. From all accounts China is just about keeping its head above water and is pulling a lot of tricks.
[0] https://www.theguardian.com/politics/2025/jun/24/pat-mcfadde...
[1] https://www.ons.gov.uk/economy/grossdomesticproductgdp/bulle...
[2] https://www.bbc.co.uk/news/live/business-68680004
[3] https://www.reuters.com/markets/asia/japan-consider-buying-b...
The game has become how quickly can we train someone up so that the client will accept them as a "senior"
Engineering / programming entry level jobs down by one third: https://www.telegraph.co.uk/business/2025/02/23/fears-of-ai-...
Finance / corporate sectors entry level jobs down by one third: https://www.telegraph.co.uk/business/2025/06/22/city-giants-...