"AI" professor tells everyone to use "AI". With the usual fatalism that nothing can ever be done about anything.
One option, for example, would be to fire all "AI" professors. Another would be to outlaw "AI", just as Germany phased out nuclear power and DDT was banned for agricultural use worldwide.
Instead of confronting the issue directly, people often sidestep it with other excuses. The reality is, if we eliminated all H1B workers, every American in the IT industry, including recent graduates, would have a job. And don’t try to convince me that a Java developer from India possesses skills that our university graduates don’t.
Everyone defending H1Bs forgets why we even have an economy. America never signed up to be some hegemon that needs to compete with the entire world. America exists first and foremost for the sake of Americans, not the world. We can help other people after that point. You get revolutions and revolutionary acts when it feels like opportunity for foreigners and the aristocracy exceeds opportunity for normal, everyday people born here, and that is a legitimate injustice.
To me, Computer Science proper is more like research-type work. I know nothing about that field, but I expect it has always been, and always will be, very hard to break into.
Then you have these programming jobs:
IT means working on internal applications for a business. These days that usually means supporting, or doing in-house custom development for, systems like SAP or Oracle. This is what I did: in the 70s/80s/90s it was all in-house systems, and starting in the early 2000s, packaged systems like SAP. I have since retired, but at the company where I last worked, these jobs were being moved outside the US, and from friends still there, those moves have increased quite a bit. Work may still be available at small companies.
Then there is working at startups, which is rare but gets all the press; I know nothing about this area.
Then there is working at a company that develops software for sale (like SAP); I tend to think this is starting to go the way of the IT work mentioned above.
alephnerd•22m ago
CS (along with ECE/EECS) programs have been watering down their curricula for a decade by reducing the hardware, low-level, and theory courses that remain requirements abroad.
I can't justify building a new grad pipeline in cybersecurity, DevSecOps, or ML Infra with people who don't understand how a jump register works, the difference between BPF and eBPF, or how to derive a restricted Boltzmann machine (for my ML researcher hires).
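For context on the last item: the "derivation" for a restricted Boltzmann machine usually starts from its standard energy function and shows that the hidden units factorize into per-unit sigmoids. This is textbook material, not anything specific to the parent's hiring bar:

```latex
% Standard RBM definitions: visible units v, hidden units h,
% biases a and b, weight matrix W, partition function Z.
E(v, h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i W_{ij} h_j,
\qquad
p(v, h) = \frac{e^{-E(v, h)}}{Z}

% Because hidden units are conditionally independent given v,
% the conditional distribution factorizes into sigmoids:
p(h_j = 1 \mid v) = \sigma\Big(b_j + \sum_i v_i W_{ij}\Big),
\qquad
\sigma(x) = \frac{1}{1 + e^{-x}}
```

The symmetric result, p(v_i = 1 | h), follows by the same argument, which is what makes block Gibbs sampling (and contrastive divergence training) tractable.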
Just take a look at the curriculum changes for the CSE major (course 6-3) at MIT in 2025 [0] versus 2017-22 [1] versus pre-2017 [2]: there is a steady decrease in table-stakes EE/CE content like circuits, signals, computer architecture, and OS development (all of which are building blocks for cybersecurity and ML) and an increased amount of math.
Nothing wrong with increasing the math content, but reducing the ECE content in a CSE major is bad given how tightly coupled software is with hardware. We are now at a point where an entire generation of CSE majors in America does not know what a series or parallel circuit is.
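For anyone who wants the one-line version of that particular building block: resistances add in series, and reciprocals add in parallel. A minimal sketch (the function names are mine, not from any curriculum):

```python
def series(*resistances):
    # Resistors in series carry the same current, so the
    # equivalent resistance is simply the sum.
    return sum(resistances)

def parallel(*resistances):
    # Resistors in parallel see the same voltage, so the
    # reciprocal of the equivalent resistance is the sum
    # of the reciprocals.
    return 1.0 / sum(1.0 / r for r in resistances)

# Two 100-ohm resistors: 200 ohms in series, 50 ohms in parallel.
print(series(100, 100))    # 200
print(parallel(100, 100))  # 50.0
```

It really is that small, which is the parent's point about it being table stakes.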
And this trend has been happening at every program in the US over the past 10 years.
[0] - https://eecsis.mit.edu/degree_requirements.html#6-3_2025
[1] - https://eecsis.mit.edu/degree_requirements.html#6-3_2017
[2] - https://www.scribd.com/document/555216170/6-3-roadmap