The massive boom in computer science enrollment over the last 20 years has been driven mostly by people chasing tech salaries, not by any real interest in computing itself. These students often show up completely unprepared for how difficult CS actually is, and universities have responded by dumbing down their programs to keep everyone happy and enrolled.
If this weeds out the people who are just there for the paychecks, it might actually be a relief to get back to teaching students who genuinely want to learn about computing.
The hackers and nerds will be just fine. They are like gold when we find them now. But if this makes CS "uncool" again, I am all for it.
Think about how AI can help students cheat nowadays. You could still cheat before, but now a CS-degree seeker can have an AI do the entirety of their schoolwork (with the exception of, say, pen-and-paper tests). Imagine how the quality of new graduates drops with regard to the understanding and abilities you highlight as crucial to being effective in software, and how those who do understand become even more valuable relatively, but perhaps harder to find in the noise.
Do you mind elaborating here on what is happening to you? It seems worthwhile information to add to the discussions ongoing for this post.
This is not at all my experience. One of the problems I face is that many of those PMs, and companies in general, want mindless ticket completers. My current job just wants us to grind through the Jira backlog. They have no interest in anything else, and they crush any such interest out of you too.
Did colleges expand their computer science departments, or even just create them, to meet the demand for the degree? The pipeline from a CS degree to possible employment is quite short (no residency or board certification required), so it's a quicker route to a job, but then you are competing for the same positions against peers with stronger backgrounds and educations, and against seasoned professionals.
A good CS education only gives you prestige with fellow nerds.
Still I've been careful to set my life up so I could go many years without employment if I had to. It's hard to trust the rest of the economy in general.
Anecdotally I've heard that very few CS programs even use C++ anymore, and schools now favor Python because students find it more accessible.
I think there's still value in starting with C and C++, to see where it all comes from and how much tooling and DX have improved, but I can't really blame courses for jumping directly to the more useful things.
And we're back to the discussion of what is the point of a University CS education. I would argue that learning something like C++ is important for the same reason something like Lisp or Haskell is important. Not because it will necessarily help you get a job, but because it introduces you to new concepts and a new way of thinking about programming and computation that will be useful no matter what language you end up programming in for a living.
Python is the primary language for scientific computing and the secondary language for a good number of other tasks.
Tell me you've been in the field for less than 10 years without telling me.
Python is a joke language with a joke name. The only reason it ever caught on for AI is that someone wrote a few good math libraries for it in the 2000s, and its rise is entirely incidental to that.
Python's ecosystem is massive and there are lots of use cases [0]; basically data analysis, machine learning, deep learning, and all the rest of AI run on Python.
[0] https://lh3.googleusercontent.com/keep-bbsk/AFgXFlJPnxraSopK...
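To make that concrete, here is a minimal sketch (assuming NumPy is installed, which is not part of the standard library) of the vectorized numerics that pulled scientific computing toward Python:

```python
import numpy as np

# Vectorized arithmetic: the loop runs in compiled C rather than the
# Python interpreter, which is a big part of why Python became the
# glue language for numerics, data analysis, and ML.
x = np.linspace(0.0, 1.0, 5)  # [0.0, 0.25, 0.5, 0.75, 1.0]
y = 3.0 * x + 1.0             # elementwise, no explicit loop

print(y.tolist())   # [1.0, 1.75, 2.5, 3.25, 4.0]
print(y.mean())     # 2.5
```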
If you're saying that all of AI runs on Python, then what's the problem here? Implicitly, students will need to learn Python as part of their AI classes.
But I'm not sure that using Python as the specific tool is so bad -- based on the MOOC, that's what MIT uses in Intro to Algorithms. It may be better than spending a lot of time on the vagaries of C++, which are certainly relevant to systems programming (though that's probably slowly switching to Rust), if your focus is on algorithms and other design details.
"Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers and test tubes. Science is not about tools. It is about how we use them, and what we find out when we do."
* https://quoteinvestigator.com/2021/04/02/computer-science/
* https://en.wikiquote.org/wiki/Computer_science#Disputed
Perhaps a trade school would be better if someone wants to focus on 'just' programming.
You sound like a physicist who thinks mechanical engineers are unnecessary because we have physicists and car mechanics.
Or I sound like someone who recognizes that physics and computer science, and mechanical engineering and computer engineering / programming, are different areas of activity.
Python gets language difficulty out of the way of learning a given algorithm. Bonus points for exacerbating the time issue when trying to introduce Big O timing notation. The kids can actually "feel it" in an in your face kind of way.
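For instance (a hypothetical classroom exercise, not from any particular course), counting equal pairs two ways lets students feel the gap between O(n²) and O(n) directly:

```python
import time
from collections import Counter

def count_pairs_quadratic(xs):
    """O(n^2): compare every pair of elements."""
    count = 0
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                count += 1
    return count

def count_pairs_linear(xs):
    """O(n): tally occurrences, then count c-choose-2 pairs per value."""
    return sum(c * (c - 1) // 2 for c in Counter(xs).values())

data = list(range(1000)) * 2  # 2000 elements, every value appears twice

t0 = time.perf_counter()
slow = count_pairs_quadratic(data)
t1 = time.perf_counter()
fast = count_pairs_linear(data)
t2 = time.perf_counter()

print(slow, fast)  # both 1000: one pair per duplicated value
print(f"quadratic: {t1 - t0:.4f}s, linear: {t2 - t1:.4f}s")
```

In CPython the quadratic version is visibly, painfully slower even at n = 2000, which is exactly the "in your face" effect described above.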
Systems, it's different, as you say.
Compilers. Different AI courses. And on and on. Each you may have legit reasons for using different languages.
The concern starts to grow when Python is being used across many courses to the exclusion of any other language or technology. That's the issue that's growing across CS departments right now. Couple that with kids who have no interest in learning the other languages on their own and voilà! You have an issue with uninterested kids graduating, but now they're also unprepared.
If I were at a school where they are teaching JavaScript or Python, you kind of already know that program is more "money grab" than "study of computing technologies".
College should not be about teaching a specific language. It should teach the programming skills needed to pick up any language. Python is just as good as C++ in this regard. In fact, if Python is an easier on-ramp that gets people excited about programming and shows them what's possible before C/C++ crushes their soul, then I say go for it.
In college, I regularly wrote my programs in PHP, a language I had taught myself before college, and then converted them to C to submit my homework/tests. While PHP was obviously much slower to run, it let me iterate and develop faster than my peers.
In fact, I find it borderline fraudulent that so many colleges waste time on a language that most graduates will never use. Python knowledge is way more useful than C++ knowledge in my opinion, especially for a new grad.
Then again, I have a very dim view of college CS programs as a whole. They aren't just fighting the "last war"; they are fighting a war from decades ago. Almost everything I used in my first job was something I taught myself, not something I learned in college. That was one big reason why I dropped out of college my junior year: I wasn't learning anything useful for my field. The professors were pedantic and cared about silly things like making sure I put a semicolon at the end of each of the SQL queries I wrote for an exam.
I had a high school BASIC class but that was about it.
No, you're good; this is the natural reaction of basically all programmers, except for those strange beasts known as Java programmers, who believe that verbosity, needlessly complex yet organized in a twisted sense, is nirvana, in the same way an accountant sees a tax return as nirvana. Enthusiasts of many other languages, such as C, Python, or Lisp, will also get a bad taste from Java. Of course there are other gnarly languages, such as APL or SAS.
It's not going to work that way. I was genuinely interested and took many high level electives. I felt the program was very good 15 years ago at the school I attended. I also got an MSIS at a different school, but feel that one was not any more advanced than BS, just a faster pace and weirdly less coding. I did well for years at my job. Now it looks like I might lose my job and probably won't get another IT one. I will probably end up working at Walmart or something.
Making the whole thing a non-profit or a charity won’t solve this.
Most of them have no actual passion for computing, their scope of knowledge is superficial, and they're asking for six-figure salaries out of the gate.
I had a relatively simple coding assignment (it shouldn't take more than 15 minutes) that I would use to weed out those who were just copying and pasting sample code. It involved a very large number of values and added a profiling step. The sample code wasn't performant at that scale and was painfully slow to use unless you made minor adjustments to a few things.
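I don't know the actual assignment, but here is a sketch of the pattern (the function names and sizes are my own illustration): sample code that does membership checks against a list is fine on small inputs and crawls on large ones, and the "minor adjustment" is swapping the list for a set.

```python
import time

def dedupe_naive(values):
    """The version candidates often paste verbatim: `in` on a list is a
    linear scan, so the whole loop is O(n^2)."""
    seen = []
    out = []
    for v in values:
        if v not in seen:
            seen.append(v)
            out.append(v)
    return out

def dedupe_fast(values):
    """The minor adjustment: a set gives O(1) average-case membership,
    making the loop O(n) overall."""
    seen = set()
    out = []
    for v in values:
        if v not in seen:
            seen.add(v)
            out.append(v)
    return out

values = list(range(5_000)) * 2  # 10,000 elements, each value twice

t0 = time.perf_counter()
a = dedupe_naive(values)
t1 = time.perf_counter()
b = dedupe_fast(values)
t2 = time.perf_counter()

print(a == b)  # True: same result, wildly different running time
print(f"naive: {t1 - t0:.3f}s  fast: {t2 - t1:.3f}s")
```

A candidate who profiles the slow version and spots the list-membership check passes; one who only ever ran the sample on toy inputs does not.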
- In principle, it should not matter at all, but there are practical reasons why one PL may be better than another in a particular school or context.
- But, all this "choice of PL" discussion is really a discussion about CS1. A CS degree has at least seven other courses -- assuming 1 CS course per semester -- and in practice many more than that. So, if you're going to ask questions about CS1, the question to ask is, "Does CS1 set up students to succeed in the advanced courses?" Classically, these were courses in compilers, operating systems, networking, and so on. These days, you can add distributed computing, machine learning, etc. (but don't subtract the classics).
I don't read too much into the fact that unemployment for nutrition science is at 0.4% - that doesn't mean those people are all working as nutritionists or even in a job that requires a degree. You can see this clearly in the underemployment rate which is 45%+.
Likewise, the top unemployment rate (9.4%) of those with an anthropology major probably doesn't mean all those people are living under a bridge - a fair number of them will be pretty well off, living off their parents and knew going in that their field doesn't hire millions.
So what to make of IT grads having high unemployment rates (but low underemployment rates! bottom 5 in those)? I feel some more on-the-ground reporting is needed.
The quotes from randos reacting in this article don't really help. "Every kid with a laptop thinks they're the next Zuckerberg, but most can't debug their way out of a paper bag," because debugging (like Zuck!) is computer science, apparently.
That's a very important observation. It's much better to be in a field with a 6% unemployment rate than a 60% underemployment rate (like criminal justice, performing arts, and, surprisingly, medical technicians).
I feel like I've seen this quote many times over the years.
Also, how do they calculate employment rate? If you get a job at McDonald's while having a civil engineering degree or nutrition science, that counts as employed as well, no?
Would be good to see how many are actually employed in their field of study.
That would be under_employment (vs un_employment).
Un_employment refers to people actively seeking work but unable to find it, while under_employment encompasses individuals who are working but not fully utilizing their skills or working fewer hours than they would like.
Underemployment as "not working as many hours as you'd like" is the standard definition, and that one actually does seem to respect people's interiority.
(I think you are right to ask if a survey can accurately capture "underemployment", there are many problems with the definition and how to capture the right information to measure it.)
""" 93. Would you rather have a job more closely related to your education, training and experience?
94. Considering your education, training and experience, do you feel that you are overqualified for your current job?
95. Considering your education, training and experience, do you feel that you have been overqualified for most of your jobs? """
93 is not a question I suspect most people answer faithfully. Because most people with tertiary education could probably find such a job - but it would be at a substantial pay cut. Yet the angle of compensation is nowhere to be found in the question itself.
94 is subject to the same bias that makes 90% of people think they're in the top 50% of driving, parenting, lovemaking and/or karaoke.
95 has that same issue, but also brings in a narcissistic wound aspect to it. No, of course you're better than all of those hams, shams, and japeths who you worked with/under/over through the years.
No, not by the common definition of underemployment. You're not over-qualified to work at Jane Street and presumably you want to work there.
But it would be worth tracking if you wanted to work in academia and ended up at Jane Street. It's about measuring labor demand vs. supply, because labor supply is difficult to measure over time (because people don't just sit forever waiting for a job in their field to open).
> Underemployment as "not working as many hours as you'd like" is the standard definition
These are related concepts and tracked for similar reasons. You're "not working as many hours as you'd like at a job you're qualified for and would like to have". The number of hours you're working at that desired job is 0, and you're replacing it with some undesired job instead.
Since most people working at Jane Street have a college degree, you would not be considered 'underemployed' in this particular study.
Just look at what is happening in just the last 5 to 6 months since this prediction was made [0]. The definition of "AGI" was hijacked to mean all sorts of things to the companies that operate the AI systems, even conflicting with each other on timeframes and goals.
But the true definition of "AGI" is really the blueprint inside the WEF's Future of Jobs Report 2025 [1], with a deadline of 2030 and mass layoffs included: 40% of employers admittedly anticipate reducing their workforce where AI can automate tasks, as I said before [2].
So what AGI actually means is a 10% global unemployment increase by 2030 or 2035 and with all those savings going to the AI companies.
[0] https://news.ycombinator.com/item?id=42490692
[1] https://www.weforum.org/publications/the-future-of-jobs-repo...
I'm not even sure those savings will "go" anywhere; they will just stay with the companies. Right now, if I use my $20/mo ChatGPT subscription to automate away my secretary's job ($3,000/mo or whatever), it's not like that $3,000/mo is going to OpenAI. And I don't think they will be able to jack up prices in the future, because foundational LLM models have become a race to the bottom.
However, the "number go up" crowd doesn't give a fuck about the secretaries -- so they will chant "AI! AI! AI!" to juice the stock and make out like bandits, while they still can.
Many of the big companies that have been on hiring orgies are advertising-dependent. Ads are the thing that gets slashed heading into a bad economy, and we're in an economic mess that is going to get a lot worse.
We had some cleaning up to do. I was a hiring manager during COVID and the resumes I saw were unbelievable. People with "web" boot camps being considered for 6 figure salaries. People who had absolutely no business being in this field were being hired.
It was due to the easy money from low interest rates. This field always had solid salaries, but some people were making a million to sit in meetings and integrate frameworks into me-too websites.
The hammer is coming down and is unfortunately hitting many good people too. But they will recover while the people who shouldn't be here will move on. Don't get your HVAC repair certification quite yet. Stop complaining about AI and go study it (the hard stuff not ChatGPT for dummies).
That's a good part of the reason why hiring processes are so long and you need to re-check everything people are supposed to know. Filtering out hundreds of candidates to get a mediocre one at best, thousands to get a really good one.
There are job openings, but just having a piece of paper is not enough to get to those.
AI tools have made recruiting a miserable experience for everyone involved, there's so much cheating in applicants and you waste so much time filtering those out and sadly, good candidates sometimes get lost in the noise.
Networking is what has the highest signal-to-noise ratio. A good recommendation from someone you trust helps a lot, but it penalizes people who are just starting their careers and have smaller networks.
It's a sad state of affairs.
When it came to undergraduate majors with the highest unemployment rates, computer science came in at number seven, even amid its relative popularity.
The major saw an unemployment rate of 6.1 percent, just under those top majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
Computer engineering, which at many schools is the same as computer science, had a 7.5 percent unemployment rate, calling into question the job market many computer science graduates are entering.
On the other hand, majors like nutrition sciences, construction services and civil engineering had some of the lowest unemployment rates, hovering between 1 percent to as low as 0.4 percent.
This data was based on The New York Fed's report, which looked at Census data from 2023 and unemployment rates of recent college graduates.
Source:
https://www.newyorkfed.org/research/college-labor-market (requires Javascript)
Data: (no Javascript required)
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
https://www.newyorkfed.org/medialibrary/research/interactive...
Civil Engineering 1.0%
Aerospace Engineering 1.4%
Mechanical Engineering 1.5%
Chemical Engineering 2.0%
Electrical Engineering 2.2%
General Engineering 2.4%
Miscellaneous Engineering 3.4%
Computer Science 6.1%
Computer Engineering 7.5%
zettapwn•1d ago
1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
2. AI… sort of. It’s a lousy replacement for serious engineering talent, but the bosses are so enticed by reduced labor costs (and reduced employee leverage) that they will keep trying even if the stuff doesn’t work. Expectations are going up, teams are shrinking, and junior roles are vanishing.
3. Reputation collapse. Remember how we dismissed Michael O. Church as a crank? His writing style was grating (and has improved immensely) but he was right about everything, five years before anyone else. In 2009, we were "good rich people" in contrast to Wall Street. Now we're Public Enemy #1 and, while we don't all deserve it, our industry's leadership does. This doesn't hurt big tech companies because they're invincible monopolies, but it has ended the era in which even non-tech companies wanted three or four "data scientists."
coderatlarge•1d ago
I guess that is a natural dynamic in our economic/belief system, in which all central planning must be inherently bad, so we must always pay the on-demand price instead of the bulk price, and every mis-timing mistake has to cost a lifetime of being wrong afterwards…
zettapwn•1d ago
He seems to have moved toward CS theory, AI, and literary fiction.
volemo•1d ago
I believe "learn to code" is great advice nonetheless; the skill is highly applicable. The bad idea is thinking that it alone will land you a cushy job.
AnimalMuppet•1d ago
Maybe it's just a phase?
Or maybe today's juniors are different than the juniors were five years ago. And maybe that's because of AI.
dagw•1d ago
Completely disagree. No matter what job you end up with, you will almost certainly be able to do it a bit better if you know how to code. Knowing how to code is basically always a plus when applying for a job. However "just learn to code a little bit, and nothing else" is probably bad advice.
rvz•1d ago
"Learn to code" was the scam to address the so-called "skills shortage" BS in programming. Even worse, the skills that were pushed were also the most automatable: HTML, CSS, and especially JavaScript, just to land $250k roles during the most unsustainable ZIRP era.
Now you won't see the influencers screaming about web developer roles, given the current massive flushing out of those who joined because of the $$$ just to rearrange a <div> or add accessibility styling for six figures.
lazide•1d ago
The complaint isn’t about n people not being available, it’s about n people not being available for x low price, or z terrible working conditions.
No matter how cheap or how widely available, some folks will still complain because for some folks, even if they had to pay $0, it still would be ‘too much’ if people also demanded human rights.
It’s similar to the ‘where have all the good men gone’, or ‘why don’t people want to work anymore?’, etc. complaints.
msgodel•1d ago
The problem with many of these tech companies is that they've been so successful at abusing their users that they've quit putting energy into developing their products. HP and Sonos are two good recent examples of how this ends.
Tesla doesn't seem to be doing that right now. The big thing you'd be betting on (long or short) is how successful the robotaxi and Optimus will be. I'm not optimistic about either of those (the robotaxi seems like it should be practical; it's more about the particular execution), but I also wouldn't be willing to bet against them.
swat535•1d ago
https://news.ycombinator.com/item?id=10017538