I'm guessing that many people with an art history degree do not work in art history...
I wonder how to reconcile those stats with the stories I hear about the CS job market.
Most of the stories you hear about difficulties getting hired are from new grads. Anecdotally, companies have become far less willing to train juniors over the past few years; they only want to hire seniors whom other companies have already trained. It would be interesting to see these per-major underemployment numbers filtered by whether someone had recently graduated.
Might have something to do with the appalling employee retention numbers in today’s tech scene.
Why bother spending six months, and five figures, training someone, only to have them hop to another company in 18 months?
For some reason, companies would rather pay insane salaries, than treat employees well enough to encourage them to stay.
(We'll know whether the feeling is correct when new numbers come out?)
Stats have ground truth underlying them. "6%-7%" is shorter than spelling out the big raw numbers, which is popular for brevity, but it's just a mathematical euphemism for a real count of people.
Tbh with the federal deficit at 6-7% of GDP I think it'll be somewhat hard to produce a recession. Eventually companies will figure out the shoe isn't going to drop and start hiring again.
I’m biased, and it worries me that the above is also what I’d like to believe, rather than it being a permanent tightening of the screws on SWEs. We could test the hypothesis by seeing whether the same trends happened in other countries (like Canada) that didn’t change their tax policies.
> about the highest level in a decade—excluding the pandemic unemployment spike
Why "about"? What was this number 5 years ago? 10 years ago? 20 years ago? During the dotcom bubble? The housing crisis? An actual recent crisis (the pandemic) is conveniently excluded from the comparison for some reason.
Weird for the WSJ to declare an "unemployment crisis" based on a handful of anecdotes and no actual data.
My qualitative view on the job market right now is that this line holds true:
> But it is a bad time to be a job seeker—especially if you are young.
There's other questions about full-time or part-time, or multiple jobs.
(Multiple jobs are interesting because you'd think they'd belong to overworked underpaid poor people. But they're not really associated with that, and they go down in recessions since that's when you have no jobs.)
Additionally, it's funny to see the term "housing crisis" be applied to the past rather than the present. If that means 2008, we tend to call it the 2008 financial collapse, but our response to it created the current conditions of what Canadians would now call the "housing crisis".
We saw this in 2008 post-GFC, where entry-level white collar jobs just completely disappeared. It was really the start of millennials graduating with a ton of debt, possibly postgrad degrees, and working at Starbucks. Not because their degrees are useless. Their entry-level jobs just disappeared.
This has never recovered.
So you don't have to search long to find now 40 year olds who are permanent renters, have barely enough in their bank account to pay this month's bills, definitely don't own their own house, still have a ton of student debt they're unlikely to ever be able to repay and realizing they have no hope and they have no choice but to work 3 jobs until they die.
Yet those who believe in the myth of meritocracy just write this off as a personal moral failure or getting "philosophy degrees". At the older end, boomers simply have no idea because they bought their $2 million house for $11,000 in 1976.
Failure to understand that means being surprised by the groundswell to Trump and Bernie in the 2016 election cycle; they each represented change in their own way. Those who have benefitted from the current system simply don't understand that many want to tear down the system. They have nothing left to lose.
Gen-Z is now going through this exact same thing. Many don't yet understand they're looking at their future when they see a 40 year old barista or DoorDash driver.
All while the ultra-wealthy continue to get even wealthier at an extraordinary rate. We will likely see the first trillionaire in our lifetimes.
This cannot and should not continue.
I have rarely met someone with a STEM degree who was entirely unable to get a decent paying job.
It is not unreasonable to say that some degrees are not as valuable as others, and their holders will be more likely to struggle financially. It's a game of statistics. You are more likely to struggle financially with a degree in philosophy than a degree in engineering, because even companies hiring for positions completely unrelated to a person's degree will take into account the fact that the engineering graduate probably worked a lot harder than the philosophy graduate.
But I do agree that the average non-degree or "less valuable" degree holder from the past had a much larger chance of making it out okay than nowadays.
The skillset you get from philosophy makes it a common degree for folks who want to study law. A big part of studying philosophy is learning how to construct and analyze ideas and arguments, so you would be well suited for consulting, politics, marketing, etc.
I mean. Have you met anyone who's graduated in the past year or two?
I'm exaggerating, but seriously, I know multiple people who graduated with CS or IT degrees from reputable institutions, some with decent prior experience, and they've gotten nothing back for months if not years. Plenty of similar stories in this thread. It's pretty bad out there. Agreed that it's still probably better than the proverbial philosophy or art degree, but still.
Thankfully that went so well!
Sure, a few more poor people voted for Trump and a few more rich people voted for Harris, but it basically rounded to 50-50. Rich people want to tear down the system too.
In fact, I think I'd want to see a breakdown by belief system. My gut is that generally speaking working class people believe in meritocracy more than rich people, and that is in fact why they voted for Trump. To not be lumped in with the 'DEI hires'(their perspective, not mine).
https://onlinelibrary.wiley.com/doi/10.1111/1468-4446.12930#...
The above research suggests that poor people living with high inequality are more likely to believe in meritocracy.
Think about the implications of that. There are people barely able to survive who will defend tax cuts for Jeff Bezos. These are modern-day serfs. They believe the current economic order is good, actually, despite their bad personal circumstances. In fact, any bad personal circumstances are the fault of [insert bogeyman group here] (e.g. migrants, trans people, black people, women).
And nobody seems to think about the fact that the period of history they fetishize (the 1950s) had a top marginal tax rate of 91%.
The Democratic Party in the US is absolutely complicit in all of this. They've intentionally chosen to quash any worker momentum and absolutely refuse to address any of the legitimate material concerns of working people.
Don't forget that it also had the highest level of tax dodging available through various mechanisms.
If you think the lengths people go to now to avoid taxes are extreme, think about what they were willing to do then.
Minus the Covid blip, not lately!
Sure.
Real income has barely increased since 1980, as in 10% or less.
You may look at that and say, well, that's good because it's real income, but it isn't. There are substitution issues with CPI. The housing component is lagging, relies on "in-place" rent, and doesn't really reflect quality, size, or housing affordability (just rents).
Look at other measures, like homelessness increasing 18% in 2024, consumer confidence and how for the last 20 years people have flocked to any political candidate that promises meaningful change, from Obama to Bernie to Trump to Zohran.
HN in particular and tech in general is a bubble. It insulates you to a large degree from the median experience. We are profoundly privileged. But privilege convinces some that everything is meritocracy when each of us is profoundly fortunate to be where we are (eg being born in a Western country, having a relatively stable family life, having access to education, speaking English and so on).
It's why the ZIP code you were born in is possibly the biggest predictor of your success [1].
[1]: https://www.lisc.org/our-resources/resource/opportunity-atla...
Real median household income is up over 36% since 1984 (the furthest back the linked chart goes: 1984 was in the middle of an economic expansion, so you’d expect a comparison with four years earlier to look even better).
And CPI has just as many potential substitution issues the other way: Hedonic adjustments are made, but it’s effectively impossible to quantify the value of decades worth of novel and improved goods that simply didn’t exist decades earlier.
> HN in particular and tech in general is a bubble. It insulates you to a large degree from the median experience.
I linked decades of numbers about the median household, which you then numerically misrepresented. Bubble, pop thyself.
(Personally, my preferred measurement of inflation is "how much prices go up", but economists don't agree with me)
Wages have been rising faster than inflation, that’s link #2.
Propaganda is everywhere but so many people believe their countries are free of propaganda.
Anyway, I'm not sure how democracy can really work in huge super-complex societies. This is why we have the "iron law of oligarchy".
The best thing I did was tap out, sell my car, turn in my apartment keys, bought a one way ticket, and stuff my life into large backpack. I saw lots of things, made lots of friends, and met a life partner. Simply because life decided to unplug the career treadmill and there was no point in me trying to run on it.
After that we were like ok so VP is basically like what every other company calls a supervisor. We are not going to take crap from them anymore.
Yeah, I'm aware of the pay-to-title issues, though that was secondary.
Oh, so like gunfighters trying to be a bit more safe while sitting in saloons, in Western (cowboy) novels like the Sudden (name of the hero) series by Oliver Strange.
Grew up reading novels like those (and other genres) as a kid. Mainly bull, but somewhat good writing, and entertaining to a green youngster like me at the time, at least.
Good times. Sigh.
Dey don't make em like dat no more.
Grrr.
:)
Customers like to think they’re talking to a senior manager about their home loan, rather than some worker bee. Makes them feel important.
For every romantic story you read about someone selling their stuff and striking out into the world, there are a bunch more stories of people who ruined a reasonable life trajectory chasing a vague dream or simply fleeing discontent.
Talking to a therapist and practicing gratitude is a lot cheaper than burning through life savings. That said, if you're genuinely miserable in your life and career, definitely pursue changes. You should probably consult professionals (therapists, life coaches) in that case, too.
So I think if you're the type of person that's even asking about it, then just do it.
[*] I'd say one caveat is, don't go broke doing so; save/invest enough and live cheaply enough that you're coming close to break-even. The other is, have a decent network so that when you want to re-enter the job market, you'll have people to contact. That makes job hunting an OOM easier.
Of course this is an aggregate figure, but it goes to show how uneven economic outcomes can be across cohorts.
And yet, he can't even get an interview. He worked at Dropbox for a year as a contractor right out of school, until they did a huge layoff and hasn't been able to find anything in 6 months. Real interviews are super rare - most of it just recruiters fishing for stuff.
So that is the reality that he and his peers are facing.
Recommendations from trusted employees are valuable not out of nepotism or some other sinister force; they're valuable because they act as a pre-filter for the kinds of fuckups that no one would be willing to recommend.
There are only losers here. For companies, they can't find good candidates. They're inundated with fake resumes, even fake interviews. They're bombarded by bots.
For job seekers, they can't distinguish themselves. It's devolved into a sick numbers game. Want to get a job? Send the most applications. 100, 500, 1,000 - whatever it takes. 99% of jobs won't even so much as email you telling you you've been rejected. Just the logistics of keeping up with so many applications will eat the job seeker's time.
In my classes there is hardly anyone that has been able to get their hands on an internship, and even the professors have started their classes with monologues about "I don't even know why you show up, none of you will have jobs after graduation, good luck out there." (quote from my DS professor) A lot of my peers are looking to move out of the US and look for jobs elsewhere, or perhaps jump straight into graduate school to ride it out.
On the Computer Engineering side, the faculty seem a lot happier, and the students also seem to be better off. But I don't think this will last: I have noticed a steady decline in the businesses searching for Computer Engineering majors at our career fairs. When I enrolled there were about two dozen "Computer Engineering Wanted" posters at the fair, and at the last one, in Feb 2025, I only counted one.
I'm honestly thinking that if this continues I'll be looking at the military, right now I'm trying to work on side projects in the meantime.
In 2009, in the midst of the financial crisis, one of my commencement speakers (and the recipient of an honorary doctorate) was Kenneth Chenault, CEO of American Express. I don't remember his exact words, but his message to the graduating class was, we have a different perspective on the world and different values— thriftier ones, necessarily— and if we stay true to them, the world will reflect them when we succeed.
"Maybe instead of having a car, like your parents' generation, your first big purchase may be a bike. Times change." Something like that.
Four days later, he laid off four thousand workers from AmEx, just a smidge more people than the graduating class.
Edit: according to Wikipedia, that year he took home $16.6 million.
I've conducted two phone screens this month and asked each candidate to implement FizzBuzz in their language of choice after giving them an explanation of the problem. Both took more than ten minutes to write out a solution and we don't even require them to run it; I'll excuse trivial syntax errors in an interview setting if I can tell what you meant.
When CS students can't write a basic for loop and use the modulo operator without relying on AI, I weep for their generation.
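For anyone who hasn't seen it, the whole exercise boils down to one for loop and the modulo operator; a minimal Python sketch (the exact wording of the problem varies by interviewer) is:

```python
def fizzbuzz(n):
    # For 1..n: multiples of both 3 and 5 -> "FizzBuzz",
    # multiples of 3 -> "Fizz", multiples of 5 -> "Buzz",
    # everything else -> the number itself as a string.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

That's the entire expected scope: branch ordering (check 15 before 3 and 5) is the only subtlety, which is why taking ten-plus minutes on it is such a red flag.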
I honestly think that doing an in person fake technical interview with a few easy Leetcode questions at the end of your education would be a good way to weed out those that have failed to even learn the basics of the trade.
One way I try to disincentivize cheating on projects is by having in-class paper exams, including weekly quizzes, as well as in-class paper assignments, and making sure that these in-class assessments are weighted significantly (roughly 60% of the overall grade). No electronic devices are allowed for these assignments. This forces my students to be able to write code without being able to look up things online or consult an AI tool.
I still assign take-home programming projects that take 1-2 weeks to complete; students submit compilable source code. Practical hands-on programming experience is still vital, and even though cheating is possible, the vast majority of my students want to learn and are honest.
Still, for in-person assessments, if I had the budget, I’d prefer to hand out laptops with no Internet connection and a spartan selection of software: just a text editor and the relevant compiler/interpreter. It would make grading in-class submissions easier. But since we don’t have this budget, in-class exams and exercises are the next best solution I could think of.
As the world changes, education can be slowest to adapt. My father did his math on a slide rule. I was in high school as we transitioned to using calculators.
My personal take on your approach is that you're seeing this from the wrong side. Creating an artificial environment for testing suggests to me you're testing the wrong thing.
Of course most school, and college, classes devolve into testing memory. "Here's the stuff to learn, remember it enough to pass the exam." And I get it, this is the way it's always been, regardless of the uselessness of the information. Who can remember when Charles I was beheaded? Who can't Google it in an instant?
Programming on paper without online reference tools isn't a measure of anything, because in the real world those tools exist.
Indeed, the very notion that we should even be testing "ability to write code" is outdated. That the student can create code should be a given.
Rather an exam should test understanding, not facts. Here's 2 blocks of code, which is better and why? Here's some code, what are the things about it that concern you?
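As a hypothetical illustration of that kind of exam question (my own made-up example, not anything from the course being discussed): two snippets that both sum the even numbers in a list, where the student's job is to say which is better and why.

```python
def sum_evens_a(nums):
    # Version A: correct, but manual index bookkeeping and mutation.
    total = 0
    i = 0
    while i < len(nums):
        if nums[i] % 2 == 0:
            total = total + nums[i]
        i = i + 1
    return total

def sum_evens_b(nums):
    # Version B: same result, but states the intent directly.
    return sum(n for n in nums if n % 2 == 0)

print(sum_evens_a([1, 2, 3, 4]), sum_evens_b([1, 2, 3, 4]))
```

Both are correct, so rote memory doesn't help; articulating why B is easier to read and harder to get wrong is exactly the kind of understanding such an exam would probe.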
Instead of treating the use of AI (or Google, or online help, or that giant C reference book I had) as "cheating", perhaps teach and assess in a world where AI exists.
I truly do get it. Testing comprehension is hard. Testing understanding is hard. Testing to sift wheat from chaff is hard. But, and I'm being harsh here, I know: testing memory as a proxy for intelligence, or testing hand-coded output as a proxy for understanding code, is borderline meaningless.
Perhaps in the age of AI the focus switches from 'writing code' to 'reading code'. From the ability to write to the ability to prompt, review, evaluate and so on.
Perhaps the skill that needs to be taught (to the degree that community college seeks to teach skills) needs to be programming with AI, not against it.
I say all this with respect for how hard your job is, and with my thanks that you do it at all. I also say it understanding that it's a huge burden on you that you didn't necessarily sign up for.
It's similar to a calculator. We give students graphing calculators, but ONLY after they have already graphed by hand hundreds of times. Why? Because education does not work like other things.
Efficiency, in education, is bad. We don't want to solve problems as fast as possible, we want to form the best understanding of problems possible. When I, say, want to book an airplane ticket, I want to do that in the fastest way possible. The most efficient manner. I care not about how an airport works, or how flight numbers are decided, or how planes work.
But efficient education is bad education. We could skip 99% of education if we wanted to. We could have, say, the SAT, and spend 1 year studying only for the SAT. Don't bother with the other 12 years of schooling.
Will you get an acceptable score on the SAT this way? Maybe. Will you be intelligent? No, you will be functionally illiterate.
If we use AI for programming before we can program, then we will be bad programmers. Yes, we can pass a test. Yes, we can pass a quiz. But we don't know what we're doing, because education is cumulative. If we skip steps, we lose. If we cut corners, we lose. It's like trying to put a roof on a house when the foundation isn't even poured.
I'm so old we learned to program with giant C reference books. There was no internet, much less Google. We didn't have no fancy auto-complete, crumbs a text editor was considered advanced. Them youngsters coming to us couldn't program without Googling syntax, or using an IDE.
So yeah, sure, AI is changing the game. It's hard to evaluate students because the tools they are using are different from those of our experience. For decades we've "made them code" as a measure of ability. In 3 years (their college experience) the toolset has changed.
Good students, good employees, are those who understand the problem and can adapt to a solution. AI is a tool that can be wielded well, or badly. Our approach to hiring will need to adapt as well. But good people are still out there, and good people make good workers.
To be honest I never was much in love with the leet code measure of hiring. Past a certain coding skill level I was more interested in the person than their ability to memorize an algorithm. Today that necessary skill level is lower, or at least harder to evaluate, but the problem-solving-mind is still the thing we're looking for.
So be careful of seeing the use of new tools as a weakness. The history of the world is littered with obsolete technologies. (Here's a sextant, where are we?) Rather see people who use tools for what they are, tools. Look for people who are curious, who see patterns, who get things done.
And to students I say, mastery of tools is a necessary step, but ultimately an uninteresting one. See beyond them. Be curious. Look under the hood. Ask questions like "is this code good enough to be running 30 years from now?" Because a huge amount of what you see now has foundations in code written a long time ago, and written well enough to stand for decades.
College is not "learning to program". College is learning how to adapt to an ever changing world, that will require your adapting many times over your career.
You're gonna have to do a lot of work to convince me that people who only know how to drive an LLM are learning how to adapt to sweet fuck all
At least with a calculator, people still had to know the difference between addition and multiplication, in order to use the calculator correctly
What if changing from a "write code" based idea of programming changes to a "remove technical debt from code" skill?
What if the next generation of programmers is not focused on the creation of new code, but rather the improvement of existing code?
What if the current crop of programmers has to literally adapt from a world that has valued code quantity to a world that values code quality (something we don't especially prioritize at the moment)?
I'd argue that we're asking the current generation to be massively adaptable in terms of what was expected of us 10 (or 30) years ago, as to what will be required of them 5 years from now.
And to be clear, I'm not suggesting that LLMs will teach them to be adaptable. I'm suggesting that a world that contains LLMs will require them to be adaptable.
I don't believe you can do this if you can't write code, but sure. Maybe
> What if the current crop of programmers has to literally adapt from a world that has valued code quantity to a world that values code quality
LLMs seem more likely to increase the value of quantity and decrease the value of quality. That's playing out in front of us right now, with people "vibecoding"
> I'm suggesting that a world that contains LLMs will require them to be adaptable.
And ones who can't adapt will be ground to mulch to fuel the LLM power plants no doubt
Driving an LLM properly requires knowing how to evaluate whether the results are correct. People can certainly try to pass generated code off in a PR. But even one round of code feedback or debugging should uncover whether the person understood what they were doing.
Don't lock down the computer unless you are hiring people to work in a SCIF. Instead, give candidates a brutally hard/weird problem and tell them to use any resources they can get their hands on, by fair means or foul. (They will do that anyway if you hire them.) Then watch how they deal with it.
Do they just give up and stalk off in a huff?
If they Google for answers, do they use sensible queries?
If they use AI, do their prompts show skill at getting ideas, avoiding blind alleys, and creating effective tests?
If they call their friends, see how effective they are at communicating the requirements and turning the answers into a solution. Might be management material.
I feel like this doesn't get said enough, but I'm almost certain your issue is happening during filtering, prior to even getting to the interview stage. Companies are straight up choosing (the wrong) applicants to interview, the applicant fails the interview, the company does not move forward with them, and then the company does not go back and consider the people they originally filtered out.
I know companies get swamped with tons of applications, and filtering is basically an impossible problem since anyone can make their resume look good, but every applicant that applied can't be that bad.
Bad applicant filtering at the first step is hurting both companies and applicants.
I graduated into a world without internet (we had it at university, hosted on Unix and VAX machines, but it wasn't available commercially). People who had computers were running DOS. Most businesses had no computers at all.
So the job market was both good and bad. We graduated with skills that were hard to find. But we graduated into a world where big companies had computers, small companies had paper.
So huge market opportunity, but also huge challenges. We'd either graduate into big business (banking, insurance, etc) or start something new.
I joined a person doing custom software development. We'd sell the need, the software, and usually the hardware. When we didn't have work we'd work on our own stuff, eventually switching from custom development to products.
We had to bootstrap, there was no investment money in our neck of the woods.
I won't pretend the job market is the same (or even vaguely similar) now, but it seems to me that opportunities for self-employment still exist. Software is still something you can build with basically zero capital.
Ultimately a job is just someone else finding a way to add value to society. Software is one of the few ways you can do that yourself, skipping the employer.
95% of people see "a job" as the goal. I get that. My own kids are like that (zero interest in starting something new.) But there are opportunities for the other 5%. Yes, it's a lot more than just coding, and yes it's a lot more risky, but the opportunities are there.
As for me, I'm closing in on retirement, but at the same time building a new (not tech) business from scratch, because there's still value I can add, and a niche I can service.
I say this all to encourage current students. You can see the world as "done" or you can see it as an infant just waiting for you to come and add your unique value. And in 35 years' time, feel free to encourage the next generation with your story.
https://www.theatlantic.com/economy/archive/2025/05/trump-ta...
I can’t see how a US company doing dev outside the US would make any sense anymore, unless they’re big enough to structure everything away.
The tax incentives are insane depending on size.
Getting $20-40k in tax credits per employee, 100% tax deductions on R&D, and around 20-50% of total investment cost getting subsidized when building a GCC is the norm in Czechia, Poland, India, and Costa Rica, along with various additional local or state credits.
A number of mid-level leaders (Staff, Principal, EMs, PMs) at tech companies are also on work visas with little to no chance of converting to a GC, so employers will let them open a GCC in their home country.
This is increasingly the norm - for example, entire product lines in GCP are owned by their Warsaw and Bangalore office (especially K8s side).
Otherwise the BBB just shot all that in the head.
Pre-BBB, the tax rules favored that kind of outsourcing.
But it is those sized organizations that tend to represent the majority of hiring.
A company with 100 or fewer employees tends to be much more hard pressed with hiring, as revenue for these sized orgs also tends to be lower.
The Section 174 changes didn't have much of an impact one way or the other for larger companies and organizations.
The expectation for output has largely been set now, and even with the current changes I don't see much of an impact on hiring trends.
This also doesn't include the impact that AI productivity tools like Cursor are starting to have on AoPs. It's already the halfway point, and I myself am starting to see increased proposals from Engineering leadership to leverage Cursor-style tooling wherever possible. And a number of the seed and Series A companies I've funded over the past 2-3 years have largely kept headcount below 100, heavily utilizing automation where possible, and are on track to hit Series B style FCF metrics with a much leaner workflow.
That said, larger orgs can weather it better - but it’s a fundamental change.
I got my first real job in a very down employment market--much worse than today. Got skills, learned how business worked. Found a pigeonhole where I could profitably work for myself based on the new skills. Built a reputation. Was hired by folks who needed my expertise, etc. The key for me was to get an accurate read on the employment market and let it guide my decisions.
My path might not be directly reproducible, but the orientation in OP's post nails the kind of thinking that's needed.
Some people run errands. Some people make stuff. Some people are valuable friends. Some people are wise advisors. Some people help you get healthy. Whatever!
I know this is painfully obvious in hindsight, but maybe something about 18 straight years of “your choices are military, college, or trades” prevented me from thinking outside the box. I probably would’ve gotten started on a career I was excited about a loooot earlier in life.
That is a gamble. Everyone talks about how the dotcom bust quickly recovered and the housing bust recovered. But that period was when smartphones started, everyone got broadband, and most businesses moved to the web. Can you be sure something else will come along?
This feels a bit unwarranted. There doesn't need to be some major new paradigm shift for things to get bad from an employment perspective. All that needs to happen is for this creative destruction rate to slightly exceed the new job creation rate, and there's your tipping point. I certainly feel that your average grad today doesn't have the same opportunities I did in the late 90s.
The example they give is a girl who has a degree in "health communications". Another, with a degree in "cognitive science", found a job in data science. A critical reader might describe these degrees as a "degree in fluff".
As a CS student I have many thoughts around the reasoning for this (AI reducing need for junior engineers, oversaturated market from the COVID bubble, opaque job requirements/too low of a bar). As much as I'd like to believe it's just a skill difference on my side, it's hard to deny my peers' and friends' struggle around me. I don't want my livelihood to come down to a numbers/chance game. But sadly, that's what it is looking like right now.
My only advice is to keep costs low, don't give up, and find work where you can. It seems to cycle around, so hopefully you'll end up OK, but the days when degree=job were dying when I graduated 20 years ago, so I assume there is little left of that by now.
To be fair, the previous iteration of "degree=job" that was dying 20 years ago was the older definition - broad enough to include "degree in literally anything", which was closer to how it operated say 50 years ago.
GP looks to have gone with the newer advice of "get a more useful degree = job". That wasn't really dying 20 years ago. Or even 10 years ago.
---
Definitely agree about keeping costs low. Even if you do get a good job, if you keep costs low for long enough, it compounds like crazy.
My career path is so bizarre I don't really ever talk about it in great detail because it is so unique I think it identifies me and me exactly. Lots of others I know with similar stories. I would not want to go through it again.
We never really recovered from the 08 crash imo. That was a changing point I think that doesn't get enough attention.
https://fred.stlouisfed.org/series/LNS11300060
Literally nobody noticed because everyone had covid trauma the second time, so we had a "vibecession" where everyone felt like there was a recession because they wanted there to be one.
2008/9 was a change in the expectations of college degrees. Going into 2008, we all got the advice to get degrees and jobs would just show up. After the crash we never got back to that point. Common knowledge ever since 08 has been that college doesn't ensure a job at the end, and you're stuck doing unpaid internships and dealing with a competitive job market.
A CS degree was the major people "should have been thinking about" until recently.
Oh, I don't mean because it is actually doing people's jobs, or even because it is making people more productive (though it certainly is doing that in some cases).
I mean because management has bought into a lot of strange and misleading ideas about where it is right now. They think that you get a 10x engineer by using AI IDEs and other tools. If it fails with their existing tech, that clearly means it wasn't trained on their current tech stack so they should switch!
There are a lot of sales opportunities, but the reality and the things that non-practitioners and practitioners are seeing are far apart.
Evaluation then gets graded on standard features as differentiation becomes harder: GPA, test scores, essay rubrics, etc. Combined with the increased ease of communication, online portals get spammed within minutes.
All this leads to quite a difficult time for the young. Inequality likely ends up being a function of the country size. It explains the USA, PRC, India, but not sure about places like Pakistan, Brazil, or Indonesia.
Still a draft, but I wrote a bit here about the roles in society: https://bedouin-attitude-green-fire-6608.fly.dev/writing/a-d...
I always wonder about things like that stifling the start-up ecosystem and lowering the number of jobs.
You might want to be a bit more trendy, or else boring and corporate, but what's really important is that you can pick anything up quickly the first time you see it. So technically, what you actually know right now doesn't matter.
/s
1. For most people and most degrees, college does not make one more useful to society.
2. Even for those people/degrees that do make people more useful, that doesn't mean society benefits from an unlimited number of them. For example, a math degree likely makes one more useful as a math teacher, but society does not have an unlimited need for math teachers.
We should stop encouraging people to go to college. It's a huge waste of societal resources, most particularly the years of life of the students themselves. How about going to work after high school, and then, if you feel like you would benefit from additional formal education, you go to college.
If you want college degrees to not mean something, you'd need to stop paying people with college degrees more.
Note the high paying no-college jobs may also have filters like apprenticeships.
1. Showing you can complete a 4 year project
2. Networking not only with professors, but also other students. You never know when you’ll need a dude who has a weird amount of skill in X, but if it’s in any way related to your field, you may have had classes together.
3. A place to spend a bit more time maturing
4. A place to pick up some skills that particularly interest you
5. A mixing of different economic classes, backgrounds, and outlooks, amid a relatively calm intellectual setting
6. Highly specialized and targeted education
And more. All of these make you more valuable to society, though that doesn't have to be our only goal. We could also enjoy the general enrichment of our citizens.
I've read that since the 70s the ratio of blue- vs white-collar work in the US has shifted by a few percentage points, but the number of graduates has boomed in the meantime.
This means that now jobs that never required a degree require one and titles are inflated to make people feel better.
While having no degree hasn't historically been much of a blocker, during a weaker job market credentials can and do play a role in tie breaking during hiring.
Honestly, even a WGU style bachelors degree can be enough depending on years of experience.
Don’t worry about timing at all.
GenX'ers will remember the days of 'Slackers', 'Reality Bites', and the malaise of those who graduated with fancy degrees in the early 90's but were stuck in barista jobs etc.
But the truth is, and has always been, that smart people (and you need a base level of cognition to successfully graduate with an engineering degree) are always needed. What they are spending their "cleverness beans" doing (as an early mentor of mine once called the propensity for humans to innovate) is always in demand. And no matter what you have heard, there has yet to be invented any replacement for either human intuition or creativity.
Not exactly. I feel like the rise of big tech showed us that it's corporate cogwheels that are most successful career-wise. Moreover, even if what you say is true, it's becoming ridiculously difficult to tell who is smart and who isn't.
> I feel like the rise of big tech showed us that it's corporate cogwheels that are most successful career-wise.
I would be interested to hear how you reason to that. What is your definition of "success" and what is your definition of "cog wheel"?
Here is why I ask, I know hundreds of engineers who consider themselves "successful" because they have accumulated enough wealth to not ever have to work again if they choose not to. Recent data[1] shows there are over 342 thousand millionaires in the Bay Area alone. I find it hard to reconcile the idea that they were all 'corporate cogwheels' (which sounds a bit pejorative but again, I don't know how you define it).
> it's becoming ridiculously difficult to tell who is smart and who isn't.
Again, I really would be interested to hear more about this. I find it trivially easy to tell who is "smart" and who isn't. But I will grant you that it is a skill that is enhanced by a lot of exposure to people who aren't smart but would like you to believe they are. There are a LOT of those types around. There is also a nuance between "smart" and "lazy", in that I know some really smart people who are also incredibly lazy, avoiding actually learning things if they can get by with faking it or cheating.
Your comment also made me realize that I have a working definition of 'smart' that is different than 'book learning' smart, it is more a mixture of a willingness to learn, the humility to learn from anyone, and the fortitude to put effort into seeking out new information. That is different than what some people call 'smart' and I might call 'quick' in that they can make up a plausible answer quickly so it sounds like they are speaking from understanding not just bullshitting you. My wife used to call that 'Male Answer Syndrome' :-). The idea that one must never say "I don't know" or "I have no idea."
[1] https://www.alonereaders.com/article/details/3057/top-10-cit...
Ended up as the director of public policy for a small nonprofit only because the new policy staffer was going to be paid more than I was. For the board to give me the substantial pay increase I deserved based on experience, I had to be a director. Yet I only managed myself; I'm a crap supervisor so that was fine with me. What was funny was that people outside the organization were perplexed about why the policy work across the organization was so varied. I operated as a lobbyist because that was what was required and that is what I'm good at. But the new staffer was a policy analyst and advocate. The executive director and board seemed to be fine with that arrangement, but again, the difference was noted outside the org. Not sure that was good for the organization and the mission.
This is the first graduating class since Covid. Pivoting quickly to online learning resulted in worse learning outcomes for K-12 and college as well.
Companies are seeing a decline in base-level skills typically expected from previous classes of new graduates. They can either hire now and pay the cost of training for possibly 1-2 years to get them to an appropriate level, or hire no one and instead hire from the classes of 2026 and 2027, assuming those students improve from the post-Covid education system.
I got my first real tech job with Microsoft shortly after the 2008 financial crisis and right before the layoffs. Microsoft had an early form of Uber, and a number of the drivers were recently laid off tech people who would hand me their resumes to give to my boss. I read them, and it was revealing how a career could go sideways out of nowhere, and how formerly highly credentialed executives could be reduced to asking me, a new hire, for such a favor on the very small chance that something might come of it.
The whole market is a lemon market with a crazy information asymmetry between employer and employee. This has steadily gotten worse my whole career, I eventually became self employed just to get away from it. Leave the lemon market for the lemons. The threats of being replaced by low cost Indians has given way to threats of being replaced by low cost AI. And many people were replaced by Indians and I’m sure many will be replaced by AI. AI is already more helpful than any junior hire I’ve ever had. This sets a rather high low bar for ability for junior devs to meaningfully contribute.
I’m not sure what my advice to a new grad would be. Life is the mother of all selection criteria biases. I don’t find solace in false optimism. The point of false optimism is to avoid despondent inaction, but it’s best to realistically understand your situation in order to make the necessary tough decisions.
It has been like this since 2012
In truth it’s always a crisis mode. Build your networks and demonstrate value and competence such that when people leave your company they’ll regret not having you on their team. This is the right way to stay employed once you land your first gig.
cit3worker•8h ago
anyone in corporate america in a hiring position knows the prevailing trend -- you either hire h1b (for maximum leverage) or overseas (for cost savings). we've been told to hire h1b ONLY.
JumpCrisscross•7h ago
No, it shouldn’t. Treason has a specific meaning.
chaosharmonic•7h ago
Most of those things aren't joining up with a foreign adversary with whom we're officially at war.
Even things that are treasonous in a colloquial sense are still pretty narrow, and tend to refer to other specific forms of betraying your country, that just happen to be other crimes -- like espionage, or insurrection.
MangoToupe•6h ago
Adversary to the people, or the state? These are wildly different concepts. Just because my state has beef with China doesn't mean they can't convince me that my life is better with China in it. Sometimes states just misrepresent the people.
unethical_ban•7h ago
The focus and the main point remains: It's unethical, it's illegal, and I wish punishment on the companies and the individuals who fraudulently scam American workers out of jobs.
MangoToupe•6h ago
Personally, I think there's lots of treason that pervades our life. Hell you could argue the meaning of "national interest" used by our state department is itself treasonous.
Any definition of the word will come down to what you perceive as in our interests. For easy examples, see Snowden and Manning.
astrange•4h ago
https://constitution.congress.gov/browse/article-3/section-3...
plandis•8h ago
If you’re not being hyperbolic, this seems like fraud on the part of whoever is submitting visa applications for your company.
esseph•7h ago
https://www.economicstrategygroup.org/publication/in-brief-u...
pirate787•8h ago
Rather, the opposite is generally true: the more people working, the more growth and innovation, and the more opportunities and capacity for employment.
techpineapple•8h ago
So if I'm understanding this causal relationship you're suggesting, the problem is too many people are rejecting employment, and if more people wanted to work there would be more jobs?
Nevermark•6h ago
More people working doesn't just mean more jobs filled. It means more people making money and spending it, so more people needed on the supply side.
The fact that it sounds circular doesn't make it bad logic in this case. It's actually a magnitude-increasing spiral. All funded by the Sun.
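The spiral described here is roughly the textbook spending multiplier. A minimal sketch (this is the standard Econ-101 formula, not anything asserted in the thread; the 0.8 marginal propensity to consume is an assumed illustrative value):

```python
# Toy spending-multiplier sketch: each dollar of income is partly re-spent,
# so an initial injection produces a geometric series of further spending.
def total_spending(initial: float, mpc: float, rounds: int = 1000) -> float:
    total, injection = 0.0, initial
    for _ in range(rounds):
        total += injection
        injection *= mpc  # fraction of this round's income re-spent next round
    return total

# Converges to initial / (1 - mpc): a $100 injection at mpc=0.8
# yields roughly $500 of total activity.
print(total_spending(100, 0.8))
```

Circular-sounding, but convergent: the series sums to a finite multiple of the initial injection, which is the "spiral" in arithmetic form.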
FredPret•6h ago
But once you do have growth and innovation going one way or another, then that leads to more jobs.
barchar•4h ago
Productivity and growth aren't zero-sum, but money definitely is. All the assets and liabilities in the economy sum to zero, so if you want to add new jobs you need to either deflate the economy or increase someone's debt level.
astrange•4h ago
That said, the Australian and Danish systems are the best because they're more flexible.
[0] This holds up to about 60% of median wages. You can imagine it'd lower the hours some people get before it entirely makes them unemployed.
[1] One is that it provides price signals to monopsony employers. Another is that it reduces search costs in the labor market by basically acting as a spam filter that gets rid of time-wasting job offers.
barchar•4h ago
But wage stickiness is sorta just part of human nature.