I believe it's WFH. It taught companies how to manage remote work, and from there it's a small next step to offshoring that work.
Money was free, so a lot of people were paid out of thin air. When money stopped being free, salaries actually had to come from somewhere, and there weren't enough somewheres to go around.
I'm sure I'm not the only one who remembers all those posts on HN 2-3 years ago about how bad the job market was, right? It has only gotten worse.
I know a kid who interned at a company last summer. He graduated and applied for a full-time job at the same company. He happened to know someone in HR who told him, "We got over a thousand applications for this job req in one day."
How tariffs can be blamed for that kind of situation, which is happening all over the US and has been for literal _years_, defies logic.
Hell, my first job decades ago was as cheap labor in an IT project offshored from the US.
First, the Trump administration's economic impact goes far beyond tariffs - which are themselves highly significant - to unprecedented interference in the free market and private business; the destruction of the regulatory and other institutions that a stable economy depends on (e.g., democracy, the courts, the rule of law, bank regulation); and the disruption of stable international relations, possibly leading to highly destructive things like wars.
Also, there's the recent trend of business switching from (broadly speaking) innovation and productivity to rent-seeking, epitomized by private equity: cut the workforce, other investment, and the product to the bone, and squeeze as much as possible out of the asset - leaving a dead, dry carcass.
You can't just make a blanket statement about the entire economy and say "it's obvious". We live in a big world. Your perception is not my perception. That's why data is so important.
It doesn't bite me as much due to seniority, but it's still happening.
Tbh, if I were younger I'd just try to relocate myself - seems fun.
What is "mid-sized"?
There are so many vague descriptors in your post that it's almost entirely meaningless.
Because from personal experience I've seen loads of companies wind down and remove internal teams, like in-house QA, in favor of outsourcing to other regions, for example. It's made my job extremely annoying: I can't just tap the shoulder of a nearby QA engineer and see what's up; I have to wait an entire day for them to get the next build, and then deal with the wrong stuff they've reported.
Thanks! :)
"Unless you have investigated a problem, you will be deprived of the right to speak on it. Isn't that too harsh? Not in the least. When you have not probed into a problem, into the present facts and its past history, and know nothing of its essentials, whatever you say about it will undoubtedly be nonsense. Talking nonsense solves no problems, as everyone knows, so why is it unjust to deprive you of the right to speak?"
https://www.marxists.org/reference/archive/mao/selected-work...
The only caveat here is that Mao didn't follow his own advice, lol
And that's the problem. "Do your research before spouting off" is good in principle. But it gets weaponized via "if you don't agree with me, you obviously haven't done your research", and used to silence anyone who disagrees.
We have to put up with uninformed opinions, in order to leave space for dissenting views.
In reality, it's likely several factors:
- Offshoring/outsourcing
- Businesses running out of ZIRP cash/returns
- Software replacing a lot of "paper" jobs (AI being a sliver of this)
- Older people needing and not vacating jobs like past generations
- Higher CoL expenses, meaning the lower-paying jobs that would/could otherwise be filled by a recent graduate no longer are.
- General market sentiment being cautious/conservative due to the whiplash of the last 17 years.
As with most things, it's not one convenient scapegoat, it's many problems compounding into one big clusterf*ck.
I would not be surprised at all if other companies have quietly done the same while touting 'the future of AI', because as a society we seem to grudgingly accept automation far more readily than we accept offshoring.
Why is everyone hell-bent on blaming foreigners, rather than the management that actually makes these decisions and the domestic elected officials who are responsible for the policies that shape the economy?
The wave of tech layoffs a few years ago was blamed on AI, but it was so obviously attributable to interest rates and changing tax policies that the idea that the proto-AI tools we had at the time were responsible was laughable.
For this shift, we at least have better AI tools to blame (though I'd argue they're still not good enough), but we also have a US President who has straight-up said that he wants to cause a global recession and has been doing everything in his power to make that happen.
Given that the impact of someone running a bulldozer through the economy like this is well studied, and that we'd predict exactly what we're seeing here, attributing the damage to AI is silly verging on irresponsible. Place the blame where it belongs!
At the end of this Atlantic article, the author admits:
> Luckily for humans, though, skepticism of the strong interpretation is warranted. For one thing, supercharged productivity growth, which an intelligence explosion would likely produce, is hard to find in the data. For another, a New York Fed survey of firms released last year found that AI was having a negligible effect on hiring.
In other words: did we scare ya? Good, because it got you to read this far. Nothing to actually see here.
Even those who do get employed tend to be underemployed, with low wages.
The old excuse was that 'automation' was killing jobs.
The lesser old excuse was offshoring.
Now it's AI?
How about we stop inventing excuses and look at the root cause of the 'recent grad' problem? Perhaps requiring university degrees that aren't worth anything, for jobs that don't need them, is the problem.
I don't know?
Kind of sounds like the problem is more fundamental than that. It sounds like the job is not actually there in the first place. Doesn't matter how qualified you are if there's no money to pay you.
1) Hype, as you said, leading to delayed hiring.
2) Offshore workers using AI becoming more competitive.
3) Reallocation of capital to AI projects, which skews demand towards data scientists at the expense of, say, front-end devs (who ironically might have provided better return).
None of these are actually reducing the number of human workers needed, but the social and economic impact is real.
Even if you are the biggest critic of AI, it's hard to deny that the frontier models are quite good at the sort of stuff you learn in school. Write a binary tree in C? Check. Implement radix sort in Python? Check. An A* implementation? Check.
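To be concrete, the radix sort exercise is the kind of thing I mean - a minimal illustrative sketch (LSD, base 10, non-negative integers only), not model output:

    def radix_sort(nums):
        # LSD radix sort, base 10, for non-negative integers.
        if not nums:
            return nums
        max_val = max(nums)
        exp = 1
        while max_val // exp > 0:
            buckets = [[] for _ in range(10)]        # one bucket per digit 0-9
            for n in nums:
                buckets[(n // exp) % 10].append(n)   # bucket by the current digit
            nums = [n for bucket in buckets for n in bucket]
            exp *= 10
        return nums

    print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
    # [2, 24, 45, 66, 75, 90, 170, 802]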
Once upon a time, I had to struggle through these. My code wouldn't run properly because I forgot to free memory or was off by one in a recursive algorithm. But the struggling is what ultimately helped me actually learn the material [2]. If I could just type out "build a hash table in C" and then shuffle a few things around to make it look like my own, I'd never have really understood the underlying work.
At the same time, LLMs are often useful but still fail quite frequently in real-world work. I'm not trusting Cursor to do a database migration in production unless I myself understand and check each line of code that it writes.
Now, as a hiring manager, what am I supposed to do with new grads?
[1] which I think it might be to some extent in some companies, by making existing engineers more productive, but that's a different point
[2] to the inevitable responses that say "well I actually learn things better now because the LLM explains it to me", that's great, but what's relevant here is that a large chunk of people learn by struggling
But I've also witnessed interns using them as a crutch. They can churn out code faster than I did at an equivalent stage in my career, but they really struggle with debugging. Often, it seems like they just throw up their hands and pivot to something else (language, task, model) instead of troubleshooting. It almost seems like they are being conditioned to think implementation should always be easy. I often wonder if this is just an "old curmudgeon" attitude or if it belies something more systemic about the craft.
I don't feel this is a strong argument, since these are the sort of things one has been able to easily look up on Stack Overflow, GitHub, and so on for a while now. What "AI" added was a more convenient code-search tool plus text-manipulation abilities.
But you still need to know the fundamentals, otherwise you won't even know what to ask. I recently used GPT to get a quick code sample for a linear programming solution, and it saved me time looking up the API for scipy... but I knew what to ask for in the first place. I doubt GPT would have suggested that as a solution if I had described the problem at too high a level.
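For concreteness, the kind of call it saved me from hunting down was roughly this - a toy LP with made-up numbers, not my actual problem:

    # Toy LP: maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
    # linprog minimizes, so the objective coefficients are negated.
    from scipy.optimize import linprog

    c = [-3, -2]                       # negated objective coefficients
    A_ub = [[1, 1], [1, 3]]            # left-hand sides of the <= constraints
    b_ub = [4, 6]                      # right-hand sides
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)             # optimal point and the maximized value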
Honestly though, I recently asked Claude 3.7 Sonnet to write a Python script to upload SSH keys to a MikroTik router, to prompt for the username and password -- etc. And it did it. I wouldn't say I loved the code -- but it worked. The code was written in more of a Go style, but okay. It's fine and readable enough. Hiring a contractor from our usual sources would have taken at least a week, probably, by the time you add up the back and forth with code reviews and bugs.
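The rough shape of the script was something like the sketch below - from memory, assuming paramiko for the SSH/SFTP parts and RouterOS's '/user ssh-keys import' command, not the actual code Claude produced:

    import getpass
    import paramiko   # assumed available: pip install paramiko

    host = input("Router address: ")
    user = input("Username: ")
    password = getpass.getpass("Password: ")
    key_path = input("Path to public key file: ")

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)

    # Copy the public key onto the router, then ask RouterOS to import it
    # for the given user (command name from memory; check your RouterOS docs).
    sftp = client.open_sftp()
    sftp.put(key_path, "uploaded.pub")
    sftp.close()

    _, stdout, stderr = client.exec_command(
        f"/user ssh-keys import public-key-file=uploaded.pub user={user}"
    )
    print(stdout.read().decode(), stderr.read().decode())
    client.close()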
I think for a lot of entry level positions (particularly in devops automation say), AI can effectively replace them. You'd still have someone supervise their code and do code reviews, but now the human just supervises an AI. And that human + AI combo replaces 3 other humans.
> Write a binary tree in C? Check. Implement radix sort in Python? check. An A* implementation? check.
You can look up any of these and find dozens of implementations to crib from if that's what you want.
Computers can now do more, but I'm not (yet) sure it's all that different.
When I change the spark plugs in my car, am I robbing myself if I'm not learning the intricacies of electrode design, materials science, combustion efficiency, etc.? Or am I just trying to be practical enough to get a needed job done?
To the OPs point, I think you are robbing yourself if the "black box" approach doesn't allow you to get the job done. In other words, in the edge cases alluded to, you may need to understand what's going on under the hood to implement it appropriately.
I don't know why we're pretending that individuals have suddenly lost all agency and self-perception. It's pretty clear when you understand something or don't, and it's always been a choice whether you dive deeper or copy some existing thing you don't understand.
We know that if we cheat on our math homework, or copy from our friend, or copy something online, that's going to bite us. LLMs make getting an answer easier, but we've always had this option.
Did you drive to work today? Did you learn everything about the tensile strength of nylon seatbelts before you buckled up? How about tarmacadam design before you exited your driveway? Or PID control theory before you turned on the HVAC?
The point I'm making is that some people disagree about how much you need to know. Some people are OK with knowing just enough to get a job done, because it makes them more productive overall. So the question still stands: how do you gauge when learning is enough? To my point above, I think it comes down to whether you can get the job done. Learning beyond that may be admirable in your mind, but it's not exactly what you're being paid for, and I think some experts would consider it a poor use of time/resources.
Do you care if your subordinate wrote a good report using a dictionary to “cheat” instead of memorizing a good vocabulary? Or that they referenced an industry standard for an equation instead of knowing it by heart? I know I don’t.
I'm not in the business of prescribing philosophies on how others should live their lives?
For example, there's actual liability (legal and financial) involved in building a bridge that subsequently falls apart - not so with small blocks of code. Similarly, there's a level of academic rigor involved in the certification process for structural/mechanical/etc. engineers that doesn't (and maybe can't?) exist within software engineering.
But the same is true for code? You are held to the same standards as if you had written it yourself, whatever that may be. Frequently that is nothing.
What change do you want to see here?
I remember going to a cloud meetup in the early days of AWS. Somebody said "you won't need DBAs because the database is hosted in the cloud." Well, no. You need somebody with a thorough understanding of SQL in general and your specific database stack to successfully scale. They might not have the title "DBA," but you need that knowledge and experience to do things like design a schema, write performant queries, and review a query plan to figure out why something is slow.
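As a tiny illustration of the kind of check I mean, here's a toy example using sqlite3 from the standard library (made-up schema; the point is just that the query plan tells you whether an index is actually used):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

    # Indexed lookup: the plan should show a SEARCH using idx_orders_customer.
    for row in conn.execute("EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"):
        print(row)

    # No index on `total`: the plan should show a full SCAN of orders.
    for row in conn.execute("EXPLAIN QUERY PLAN SELECT * FROM orders WHERE total > 100"):
        print(row)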
I'm starting to understand that you can use an LLM both to do things and to teach you. I say that as somebody who has definitely learned by struggling, but who realizes that struggling is not the most efficient way to learn.
If I want to keep up, I have to adapt, not just by learning how to use tools that are powered by LLMs, but by changing how I learn, how I work, and how I view my role.
I'm using AI to explain things to me.
And I'm still struggling, I'm just struggling less.
That's progress I think.
People have been saying things like that probably since the creation of the corporate model. (Exponential is too much, but I'll take that as exaggeration to make the point.)