
Ask HN: What is it like being in a CS major program these days?

90•tathagatadg•1h ago
How has the curriculum changed? What are the professors telling their students to explain why the course they enrolled in deserves rigorous study? Are the students buying it, and is it matching reality at the end of the course? It's hard to get a feel from the continuous pendulum swing between "it's dead" and "it's better than ever". As much as I am scared about my own career, I am worried about my nephews'. What advice do I give them, when all their life I have advocated for CS as a fulfilling career choice? P.S. I have pivoted to "best time to be a solopreneur". "But what about uni then?"

Comments

dorianmariewo•1h ago
lots of chatgpt i assume
jtbetz22•1h ago
I am not in a CS program myself, but I guest lecture for CS students at CMU about 2x/year, and I'm in a regular happy hour that includes CS professors from other high-tier CS schools.

Two points of anecdata from that experience:

- The students believe that the path to a role in big tech has evaporated. They do not see Google, Meta, Amazon, etc, recruiting on campus. Jane Street and Two Sigma are sucking up all the talent.

- The professors do not know how to adapt their capstone / project-level courses. Core CS is obviously still the same, but for courses where the goal is to build a 'complex system', no one knows what qualifies as 'complex' anymore. The professors use AI themselves and expect their students to use it, but do not have a gauge for what kinds of problems make for an appropriately difficult assignment in the modern era. The capabilities are also advancing so quickly that any answer they arrive at today could be stale in a month.

FWIW.

nateburke•1h ago
Interesting that the algorithmic finance firms are still recruiting. Perhaps they still need a pipeline of rigorous thinkers, or are unwilling to cede significant influence over P&L to LLMs.
dzink•1h ago
Because the market is eternal competition. If one does something that works others have to figure it out and nobody puts their ideas in open source.
Imustaskforhelp•53m ago
How drastically would things change if these corporations did open source it? I like to think that markets are fairly efficient, so they are fighting tooth and nail for micro-percentage points, which granted can be billions. But what these companies really do is at times borderline fraud, which finance can even celebrate (Jane Street defrauding Indian investors).

My opinion is that they aren't worried about their competitors so much as governments patching the loopholes they exploit, because the only way they are a net-positive game (in my opinion) is that they make money from the losses of the average person, and at times in fraudulent ways.

Jane Street's $5 Billion Derivatives Scam Rocks SEBI: https://frontline.thehindu.com/columns/jane-street-sebi-scan...

jazz9k•1h ago
When I was in college in the early 2000s, it was the same. Most professors were at least a decade behind current technology.
Xelbair•55m ago
I wish it was a decade for me. In the early 2010s they were still teaching the 90s approach to handling complex projects: upfront design, with a custom DSL for each project, fully modelled by BAs without any contact with actual users, with domain experts siloed away, and all of that connected to 90s-era XML codegen tools.
super256•48m ago
I had to deal with Java codegens from UML specs in 2021. So, nothing has changed! :')
petterroea•41m ago
Something tells me it was always like that. My university professors were teaching things nobody wanted to learn, and people were practically begging to be taught more up-to-date hireable skills.

Every time there was project work, we would be recommended using Swing or similar because that is what professors knew, but everyone used React because nobody hires Swing developers.

Someone once said "Our SQL professor's SQL knowledge is 10 years out of date. Probably because he has been a professor for around 10 years at this point" and that kind of stuck with me.

Akuehne•36m ago
This is why I have always said that a degree in CS is useless without some degree of passion for it.

No professor can enable you for tomorrow, and a CS career is one of constant education.

I'm glad I learned some STM32 assembly, but with the resources available today, I wouldn't get anywhere near as deep as I did in the early 2000s.

I am building a local low-power RAG system for the programming languages I like, but I'll still include STM32 asm.

fm2606•6m ago
> This is why I have always said that a degree in CS is useless without some degree of passion for it.

I would add I don't know how anyone can do any degree and career without some sort of passion for it.

For me personally, not only do I need passion but I have to have some sort of belief in the product and/or company I'm working for. In the early 00s I worked at a company (not software related, nor was I working as a developer) and didn't like what I was doing, nor did I believe in the product; it was lacking in so many areas where they were trying to make it fit the market. I left after 3 years and did something completely different.

iso1631•23m ago
In the UK I did comp-sci from 2000 and did a couple of extra modules. One was from Engineering and covered communication theory (Nyquist etc.). Another was from the English Department, of all places, and covered XML and data.

There was very little coverage of TCP/IP in any of the courses. The language of choice in CompSci was Java at the time, which was reasonable as OOP was all the rage.

Some compsci lecturers were very much of the opinion that computers got in the way of teaching Computer Science.

dumb1224•8m ago
I did my CS undergrad in China but was already in the UK by the early 2000s. I was also a bit surprised there was so little mention of TCP/IP, which is considered a classic if anything in CS is. Java was definitely the new dominating force in industry and academia at that time.

However, it depends on the resources the university has. In some places there were other, less CS/software-engineering-focused degrees with a little content overlap (I guess for the financial benefit of enrolling more students), such as e-commerce / digital degrees. They shared some courses with CS but not all.

fergie•57m ago
> They do not see Google, Meta, Amazon, etc, recruiting on campus

Really? As in FAANG has stopped recruiting graduates?

karmakurtisaani•54m ago
They still probably do, but mainly in India.
compounding_it•27m ago
FAANG employees here are cheap to hire. They work very hard to remain rich or become rich from nothing (50-60LPA will basically make you rich in 5-6 years if you save and invest well). Leetcode grind and competitive problem solving is Indian childhood bread and butter these days. And given how much househelp exists in India this kind of model is perfectly suited to be outsourced to young and middle aged Indians who have virtually no life beyond CTC anymore.

I’m just surprised it took them this long to outsource.

The risk of course is people start their own companies learning from big tech and Indians get more UPI like tech.

someguyiguess•55m ago
To be fair, college CS programs have always been decades behind in my experience. Maybe schools like Stanford and MIT are different but the majority of CS programs are not teaching tech that is actually used in the business world.
alistairSH•49m ago
Maybe I'm an oddball, but I'd rather hire a new grad with sound fundamentals who learned on an older tech stack than somebody with all the buzzwords but no fundamentals.

And I've always found summer internships to be a good way to find out. Even better if the candidate is willing to work part-time through their senior year.

compounding_it•36m ago
The Pythagoras theorem doesn’t change even if you use an LLM. Fundamentals shouldn’t either. Don’t see why schools should see this any differently.
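To put the parent's point in code, a trivial sketch (nothing beyond the standard library assumed):

```python
import math

# The 3-4-5 right triangle: a^2 + b^2 = c^2 holds whether or not
# an LLM wrote the code that checks it.
a, b = 3.0, 4.0
c = math.sqrt(a**2 + b**2)

print(c)                      # 5.0
print(math.hypot(a, b) == c)  # True
```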
daymanstep•10m ago
I agree. That's why universities should never teach any practical real world programming languages. They should stick to Scheme and MMIX.
kelipso•34m ago
Yeah. I see a phrase like “hirable skills” and… it feels like “skills” that are probably going to be outdated every couple of months.
cmiles74•9m ago
I mean yeah, I agree, but is it that hard to keep relevant technology in the mix? I'm not saying everything has to be cutting edge!
rwmj•48m ago
Which is a good thing. They should be teaching the cornerstone principles, not offering vocational courses.
jchonphoenix•47m ago
This is CMU so they would be at the bleeding edge just like MIT/Stanford. But I think all the schools are behind today
werdnapk•44m ago
When I was in CS, we were taught theory. If you wanted to be caught up with the current tech, you'd teach yourself.
arethuza•33m ago
That was my experience in the 80s: we were taught theory, and we had to apply the theory in projects, so we spent lots of time programming and getting stuff working. But we were pretty much expected to pick up particular languages, operating systems, or tools by ourselves.

The CS theory (i.e. maths-based) side of it really has stuck with me. The only other thing is the vi controls hardwired into my brain, even though I went on to become more of an emacs fan...

c0balt•1h ago
The curriculum at my university mostly didn't change. Most CS topics weren't changed by ML research.

The main change was in testing/exams. There was a big effort towards regular testing assisted by online tools (replacing the single exam at the end with multiple smaller tests). This effort is slowly being wound down as students blatantly submit ChatGPT/Claude outputs for many tasks. Courses are now moving back to a single exam (oral/written); passing rates are down by 10-20% iirc.

Going into CS as a career will be interesting but the university studies/degree are still likely worth it (partly spoken from a perspective where uni fees are less than 500€ per semester). Having a CS degree also does not mean you become a programmer etc. but can be the springboard for many other careers afterwards.

Having a degree and going through the effort of learning the various fundamentals is valuable, regardless of everything being directly applicable. There is also the social aspects that can be very valuable for personal development.

welder•1h ago
The EU is way behind the US in AI and doesn't have the big tech jobs after graduation. Probably best to look at US schools to answer the OP's question.
Novosell•1h ago
I've been doing programming and sys admin as a hobby for a long time and only recently started my bachelors in compsci, and I'm sad to have waited so long as almost everything has been infested with ai to some degree.
Imustaskforhelp•58m ago
Why are people downvoting this? The reason I decided on compsci/STEM was also that, being completely honest, I couldn't imagine myself not having the hobby of using Linux and tinkering with scripts and everything. So I really get what you are talking about, and I think we are in similar states, although I haven't started my bachelors and I might be much younger than you.

Linux/the terminal truly feels like opening another dimension of thinking; it's too alluring sometimes.

pona-a•1h ago
Many universities used to teach basic skills without the rigorous academic culture of top universities. They're being completely decimated by AI: professors deskilling themselves by openly using it in the course, often even responding to questions with a suggestion to prompt it yourself. Some will prognosticate about how everything outside the tiniest subset of their subject will be replaced soon enough. Students themselves seem to either understand AI as academically dishonest or to have believed the propaganda, thinking they HAVE to "learn" it to have a chance at a career, even at the expense of actual subjects. If you remotely suspect that, don't rely on prior evidence, run.

Meanwhile other unis still have mostly high-class faculty holding the bar, but are suffering a decline in the quality of new students. You can absolutely learn in those places, but you're unlikely to find many capable peers.

I don't have data on what's going on at the global top CS programs; presumably it's much better than this. I do predict we're going to suffer a multi-generational loss of skilled talent, with three generations of mediocre programmers converted to AI zombies incapable of performing their job, with or without it.

jkbwdr•1h ago
currently in a cs masters program at an ivy: i think it's like thinking that pure math study evaporated when we made the calculator, or that we suddenly shouldn't have bothered with Riemann sums because of the FTC. ai to coding is much the same in the sense of moving to a higher layer of abstraction. i don't think cs curriculums have to change drastically to accommodate this; however, the onus on not getting it wrong increases, since ai produces probabilistic output. finally, you can have a chat bot do all the work for you, to your own detriment i suppose...
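The Riemann-sum/FTC analogy can be made concrete with a quick sketch (integrand chosen arbitrarily for illustration):

```python
# Brute-force layer: a left Riemann sum for the integral of x^2 over [0, 1].
# Abstraction layer: the Fundamental Theorem of Calculus gives 1/3 directly.

def riemann_sum(f, a, b, n):
    """Left Riemann sum of f over [a, b] with n subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000)
exact = 1.0 / 3.0  # from the antiderivative x^3 / 3

print(abs(approx - exact) < 1e-4)  # True: both layers agree
```

The higher abstraction doesn't make the lower one wrong; it just makes grinding it out by hand unnecessary, which is the commenter's point about AI and coding.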
titanomachy•1h ago
I have no reason to believe that you aren't motivated mostly by curiosity and interest, but the mass of CS undergrads are primarily driven by economic incentives.
jazz9k•59m ago
The ones I knew that were only driven by money all dropped out or changed majors.
titanomachy•55m ago
What did they change to? Pre-med?
palata•40m ago
Feels like CS used to be for nerds who wanted to understand how computers work, and then it became much more popular because there were good career opportunities.

Maybe with AI it will go back to "CS for nerds", and those nerds will be the ones landing the jobs that require actual understanding?

Genuinely wondering.

linesofcode•1h ago
I'm also interested in what CS curriculums look like right now and, furthermore, what students actually think of them. I suspect nothing has changed in terms of curriculum other than being more rigorous about "academic dishonesty", like detecting whether someone submitted ChatGPT-generated answers.

What I hope will change is fewer people going into the CS field because of the promise of a high-paying career. That sentiment alone has produced an army of CRUD monkeys who will, over time, be eaten by AI.

CS is not a fulfilling career choice if you don't enjoy it, and it's not even that high-paying a career unless you're well beyond average at it. None of that has changed with AI.

I think the right way to frame career advice is to encourage people to discover what they're actually curious about and interested in: skills that can be turned into a passion, not just a 9 to 5.

seethishat•1h ago
Large well-regarded CS schools still have 'systems' and other traditional CS specializations. I would encourage looking at those programs.

Experience is still needed too. You can't just blindly trust AI outputs. So my advice is to get experience in an old-fashioned CS program, and by writing your own side projects, contributing to open source projects, etc.

koakuma-chan•1h ago
What is "systems"? What do "systems engineer" people do?
bradley13•1h ago
"The professors use AI themselves and expect their students to use it, but do not have a gauge for what kinds of problems make for an appropriately difficult assignment in the modern era."

I'm a prof, recently retired but still teaching part-time. This is exactly the problem. AI is here, people use it, so it would be stupid (plus impossible) not to let students use it. However, you want your students to still learn something about CS, not just about how to prompt an AI.

The problem we are heading towards as an industry is obvious: AI is perfectly capable of doing most of the work of junior-level developers. However, we still need senior-level developers. Where are those senior devs going to come from, if we don't have roles for junior devs?

Kelteseth•55m ago
Not just that. As a 31-year-old developer, even I feel like acquiring new skills is now harder than ever. Having Claude come up with good solutions to problems feels fast, but I don't learn anything by doing so. It took me weeks to understand what good and bad CMake code looks like; that made me the CMake guy at work. The learning curve delayed the port from qmake to CMake quite a bit, but I learned a new skill.
thin_carapace•43m ago
what i find interesting about your perspective is your subjective perception of difficulty. nobody short of a savant is going to pick up a new language instantly. weeks (if not months) to learn a language is completely normal outside of this hyper exaggerated atmosphere we find ourselves in. that being said, language models do atrophy the brain when used in excess, and they do encourage surface level understanding, so i agree wholeheartedly with the idea of not learning anything at all by using them.
tclancy•37m ago
I have a block of code I will put in the CLAUDE.md file of any project where I want to get a better understanding of the tech in use where I ask for verbose explanations, forcing me to write some of the code, etc. Mixed results so far but I think it will get there. The one thing that I have decided: only one new thing per project!
heraldgeezer•34m ago
You are on the internet.

You can download every book or tutorial ever made in our history.

We have access to vast knowledge.

xavortm•54m ago
To me it seems that the path to seniority would shift. It is difficult to answer because we're looking at it from the lens of 'fundamental knowledge'. Instead, to me it seems that now this is less of a requirement compared to 'systems-level thinking'. A very simple example could be the language syntax vs the program structure/parts working together. And with this, a junior developer would still lack this experience and I don't think AI tools would be a problem in developing it.

All I say, though, is from the perspective of a self-taught dev, not a CS student. The current level of LLMs is still far from a proper replacement for fundamental skills in complex software, in my eyes. But today's version is the worst it will ever be.

block_dagger•52m ago
No human devs will be required (or useful except in extreme niches) within a few years. Ten, at the wild maximum, I suspect.
zdragnar•27m ago
> so it would be stupid (plus impossible) not to let students use it

It's been plenty of years since my college days, but even back then professors had to deal with plagiarism and cheating. The class was split into a lecture + a lab. In the lab, you used school computers with only intranet access (old solaris machines, iirc) and tests were all in-class, pen-and-paper.

Of course, they weren't really interested at all in training people to be "developers", they were training computer scientists. C++ was the most modern language to be taught because "web technologies" changed too quickly for a four-year degree to be possible, they argued.

Times have changed quite a bit.

Imustaskforhelp•1h ago
I am a teen who is hopefully going to go to college (preferably for CS). My reason is and was that I really love tinkering with computers and code-related automation/scripts (more importantly, thinking about scripts).

And to be honest, my intention with going to college is to cut out any use of AI, or at least to have more of a learning experience, because right now I am severely bounded by time even though my curiosity still exists, so I just build things to "prove" that they're possible. Within college, I would be able to give time to the thinking process and actually learn, and I do feel like I have this curiosity, which I am grateful for.

So to me, it gives me 4 years of doing something where I would still learn some immense concepts and meet people interested (hopefully) in the same things. One of the ideas I have for college is to open up a mini consultancy, in the sense of helping people/businesses migrate from proprietary solutions to open source self-hosted solutions on servers.

My opinion is that people need a guy they can talk to if any solution they use, for their personal projects for example, goes wrong. You wouldn't want to talk to AI if, for example, you use self-hosted Matrix/Revolt/Zulip (Slack alternatives). And I think these proprietary solutions are so rent-seeking/expensive that even with a modest fee for all of this, I could still charge less than what they might be paying others, and host it all on servers with more predictable pricing.

Solopreneurship has never been this easy, yet never this hard, because it's hard to stand out. There was a relevant thread on Hacker News about it yesterday, and the consensus there, from what I read, was that marketing might work but Product Hunt and similar directories are over-saturated.

Your best options are to stay within the community that you wish to help/your product helps and taking that as feedback.

That's my opinion, at least. Being honest, I am not worried about what happens within uni right now but rather about the sheer competition within my country to reach a decent CS uni, as people treat it as heaven, or just this race of watching what other people are doing. I feel stuck between these two spots at the moment, because to get into a CS uni you have to study non-CS subjects (CS doesn't even matter), but my interest in CS gets so encapsulating that it's hard to focus on the other subjects. Can't say if that's good or bad, but I really have to talk myself into studying to remember what I am studying for (even after which I can still slip up as I get too interested, but that's another matter).

Good reminder for me to study chemistry now... wish me luck :)

palata•32m ago
I think it is a good spirit :-). I believe there will always be a need for people who understand the code generated by AI, be it to review it or to actually make it work when the AI fails.

The thing is, to be useful next to an AI, you have to become really good at software (note that I said "software", not "coding"; it includes architecture). And to be optimistic: one advantage of students today is that AI can help them learn. Back in the day it was a lot harder to find help; then StackOverflow helped a lot, and I'm sure AI helps even more now.

criddell•1h ago
You probably should ask about a particular program because there are as many answers to your question as there are programs. Even in a single school there are often several tracks. Some are very theory and math heavy, others are more practical.

The part that hasn’t changed is being in a cohort of people like yourself and living in a community centered around a school (and again this varies from school-to-school). I had a lot of fun and met many interesting people who inspired and motivated me. It’s the fastest way to jumpstart your professional network.

I had moved from a small, boring town to a city and the semi-structured life of a student living on-campus made that transition easy and provided an instant social life.

My regret is that I didn’t take advantage of all the things I could have with respect to my electives. I wish I had taken art history or intro to film or visual arts 101 or modern literature or just about any other humanities course that was available to me.

If you want somebody to tell you to skip school, you'll probably get that advice here too. If all you are after is the piece of paper at the end, you probably should skip school or do it remotely. It's cheaper and more concentrated, but you miss the most valuable part of university life.

If entrepreneurship is your thing, you might be better off in a business program.

Falimonda•54m ago
Get them to learn the fundamentals and understand them deeply just like they should/might have in the past.

They can do so at an accelerated rate using AI on verifiable subject matter. Use something like SRS + copilot + nano (related: https://srs.voxos.ai) to really internalize concepts.
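For readers unfamiliar with SRS: the classic SM-2 scheduling rule behind many spaced-repetition tools can be sketched roughly as below (a simplification; the linked srs.voxos.ai tool may schedule reviews differently):

```python
# Minimal sketch of SM-2-style spaced-repetition scheduling.
# quality: self-graded recall, 0 (blackout) to 5 (perfect).

def sm2_next(interval_days, ease, quality):
    """Return (next_interval_days, new_ease) after one review."""
    if quality < 3:  # failed recall: restart the card
        return 1, ease
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if interval_days == 0:
        return 1, ease
    if interval_days == 1:
        return 6, ease
    return round(interval_days * ease), ease

interval, ease = 0, 2.5
for quality in (5, 5, 4):  # three successful reviews
    interval, ease = sm2_next(interval, ease, quality)
    print(interval)  # intervals grow: 1, 6, 16
```

The point for studying: successfully recalled material comes back at exponentially growing intervals, so time goes where recall is weakest.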

Go deep on a project while using AI. To what extreme can they take a program before AI can't offer a working solution? Professors should explore and guide their students to this boundary.

Obligatory reference to "The illustrated guide to a Ph.D." - https://matt.might.net/articles/phd-school-in-pictures/

yaaybabx•49m ago
I’m studying for an MSc in Architectural Computation at the Bartlett, UCL – essentially computer science for architects, with a focus on geometry, simulation and computer graphics. I’m very grateful for this question, because it gives me a chance to synthesise the ideas I’ve had since I started the programme.

Even though our professors are getting worried, the institution itself hasn't changed dramatically yet when it comes to generative AI. There is an openness from our professors to discuss the matter, but change is slow.

What does work in the current programme, and in my opinion is exactly what we need for the next generations, is that we are exposed to an astonishing number of techniques and are given the freedom to interpret and implement them. The only drawback is that some students simply paste LLM outputs as their scripts, while others spend time digging deeper into the methods to gain finer control over the models. This inevitably creates a large discrepancy in skill levels later on and can damage the institution's reputation by producing a highly non-homogeneous cohort.

I think the way forward is to develop a solid understanding of the architecture behind each technique, be able to write clear pseudocode, and prototype quickly. Being able to anticipate what goes in and what comes out has never been more important. Writing modular, well-segmented code is also crucial for maintainability. In my view, "vibe-coding" is only a phase; eventually students will hit a wall and will need to dig into the fundamentals. The question is: can we make them hit the wall during their studies, or will it happen later in their careers?

In my opinion, and the way I would love to be taught, would be to start with a complex piece of code and try to reverse‑engineer it: trace the data flow, map out the algorithm on paper, and then rebuild it step by step. This forces you to understand both the theory and the implementation, rather than relying on copy‑and‑paste shortcuts.

Hope that is of some use out there. And again, I think there has never been a more exciting (and easier!) time to climb on the shoulders of giants.

rdtsc•41m ago
CS may stop being a clear path to a high-paying job. The "learn to code and then Google will surely hire you and pay you $250k right off the bat" path may be gone. It may become something like physics or math, where only people genuinely motivated by or interested in the fundamentals, regardless of landing a MAANG job in the end, will apply.

So, why is your nephew in CS? Did he want to be there because he likes computing, or was he "encouraged" by family members ;-) because it was a path to "success"? Not unlike how families encourage kids to become doctors or lawyers.

AI is not the only headwind. Companies are starting to “tighten their belts” and outsourcing work away from US and laying people off. They like to blame AI but it’s a little hard to take them seriously when they turn around and immediately open 10k jobs in India or Eastern Europe. So I guess it depends where you are. If you’re in those countries, then maybe CS career would work out pretty well.

sdevonoes•37m ago
> Learn to code and then Google will surely hire you and pay you $250k right off the bat

Weird. In the EU, 99% of graduates didn't (and don't) have that in mind… A fresh CS graduate typically earns less than 40-50K (even less depending on the country).

So the USA is now like the EU?

melvinroest•22m ago
It has been for a while I suspect.
ForHackernews•21m ago
No, USA is not like the EU because everything still costs American prices.
nemo44x•35m ago
Maybe he was there because he wanted to make a better life for himself and his family. Why is learning to do something because it pays well a bad thing? It’s admirable that someone would do that.
rdtsc•19m ago
> It's admirable that someone would do that

I guess it could be that. It sounds like you are hinting at it being almost like a sacrifice: they'd rather be doing something else, but they forced themselves in to make a better life for their family. It's like being a doctor in the US used to be (or still is), when someone would rather not deal with blood and guts but forces themselves into it for a better life.

I suppose one difference here might be whether it's their family pushing this choice or they do it intrinsically. Will they be disappointed in themselves in the end, or in the person who pushed them onto that path, if it doesn't work out?

Andr2Andr•37m ago
What I see at a German university: no change for the undergraduate CS degree, which is still 50% maths and theoretical CS and is not affected by LLMs. But in the Master's degree they offer a lot of ML courses, from basics to CV to hardware-aware. Exams in those are written on paper without any aids.
pcblues•26m ago
I am not strictly entitled to answer this but I will just in case. (Language is a bit different in Australia.)

I completed a Bachelor CS degree in 1995. I think that's a "CS major program".

It was very theoretical, in that the languages we learnt were too old, too new, and not industry-led. So, Eiffel for OO, Cobol(!), and some proper maths thrown in.

It got me a solid 25 years of work.

After about a five year gap in software development as a job, I am now doing a Masters of Computer Science at the same place (by name alone, maybe) and the tech they teach is ten years old.

I'm not averse to this so far. I finish in a year, and I'll know if it was a waste of time to get back into the industry then.

However, I have done six of the twelve subjects and they ALL filled gaps in my understanding from both my original Bachelor and my work experience. I am a better programmer now.

I am currently in an interview process where I surprised myself with my own knowledge. YMMV of course.

SparklyCircuit•26m ago
I am currently on a CS major and I can definitely say that whether it differs from before heavily depends on the lecturers.

But nevertheless, the usage of LLMs to finish homework or be done with tests in a matter of minutes has spread widely. On the other hand, the idea of cheating and its drawbacks have stayed the same - (not an em dash, chill) that is, robbing yourself of applicable knowledge.

The current idea and motive behind CS majors is dragging us first through ANSI C so we can learn to program.

I have a suspicion that the methodology of assessing knowledge has become stricter in programming laboratories compared to before. We are required to create an initial program for a specific lesson, and we essentially have a sizable test every week, which consists of adding onto our code. The number of points we gain is heavily time-dependent, and in order to finish code quickly we need to understand it already.

Some claim they are able to use ChatGPT in those lessons, and in my opinion they are digging their own grave, because we have very strict rules on passing and rumors say not a lot passed the subject last year, a third supposedly.

Some people are already predicting our replacement, but you just have to know that's utter bullshit.

That's why I stopped using AI for exercises: I realized I might fail if I do the initial exercises with LLMs, because I will keep getting slower if I continue to do so.

To summarize, CS majors are starting to produce people with no real desire to learn programming, and to survive we need to repeat last year's exercises to get accustomed to reading poorly written exercises. A lot of tests can easily be cheated on, which negatively affects real-world experience.

gchallen•17m ago
I teach computing at the University of Illinois. I'm spending a lot of time thinking about how to adapt my own courses and our degree programs. I'm actually at a workshop about incorporating AI into computing education, so this was a timely post to find this morning.

We don't have a coherent message yet. Currently there's a significant mismatch between what we're teaching and the reality of the computing profession our students are entering. That's already true today. Now imagine 2030, when the students we admit today start graduating. We're having students spend far too much time practicing classical programming, which is increasingly unnecessary and impedes our ability to teach other concepts effectively. You learn something about resource allocation from banging out malloc by hand, but not as much as you could if you properly leveraged coding agents.

Degree programs also take time and energy to update, and universities just aren't designed to deal with the speed of the changes we're witnessing. Research about how to incorporate AI in computing education is outdated before the ink is dry. New AI degrees that are now coming online were designed several years ago and don't acknowledge the emergent behavior we've seen over the past year. Given the constraints faculty operate under, it's just hard to keep up. I'm not defending those constraints: We need to do better at adapting for the foreseeable future. Creating the freedom to innovate and experiment within our educational systems is a bigger and more fundamental challenge than people realize, and one that's not getting enough attention. We have a huge task ahead to update both how and what we teach. I'm incorporating coding agents into my introductory course (https://www.cs124.org/ai) and designing a new conversational programming course for non-technical students. And of course I'm using AI to accelerate all of this work.

Emotionally, most of my colleagues seem to be stuck somewhere on the Kübler-Ross progression: denial (coding agents don't work), anger (coding agents are bad), bargaining (but we still need to teach Python, right?), depression (computing education is over). We're scared and confused too: acceptance is hard when you don't know what's happening next. That makes it hard to effectively communicate with our students, even if there's a clear basis for connection. Also keep in mind that many computing faculty don't code, and so lack a first-hand perspective on what's changing. (One of the more popular posts about how to use AI effectively on our faculty Slack was about correcting LaTeX formatting for a paper submission. Sigh.)

Here's what I'm telling students. First, if you use AI to complete an assignment that wasn't designed to be completed with AI, you're not going to learn much: not much about the topic, or about how to use AI, since one-shotting homework is not good prompting practice. Second, you have to learn how to use these new tools and workflows. Most of that will need to be done outside of class. Start immediately. Finally, speak up! Pressure from students is the most effective driver of curricular change. Don't expect that the faculty teaching your courses understand what's happening.

Personally I've never been more excited to teach computing. I'm a computing educator: I've always wanted my students to be able to build their castles in the sky. It was so hard before! It's easier now. Cue frisson. That's going to invite all kinds of new people with new ideas into computing, and allow us to focus on the meaningful stuff: coming up with good ideas, improving them through iterative feedback, understanding other problem domains, and caring enough to create great things.

0xbeefcafe•16m ago
I teach courses in discrete mathematics, data structures and algorithms, machine learning, and programming at a mid-tier United States public university. I work with many students: those taking my courses, and teaching assistants. Here are the answers to your questions:

1. The curriculum has not really changed. Some faculty are attempting to use AI in their courses. Most of it is charlatanism, the faculty themselves sort of blundering about using the web interfaces (chatgpt.com, claude.ai). Realistically, most students are not proficient enough to use Claude Code yet.

2. Students are buying into AI behind the backs of faculty. There's something like a consensus among CS faculty that AI ought not be used in introductory courses, other than as a search engine replacement for Q&A. Nonetheless, homework averages now approach 100%, whereas exam averages are falling from B/B- (before AI) to C-/D (after AI). AI use is, for most, obviously undermining foundational learning.

3. The majority of liberal arts courses with substantial enrollments (40+) are referred to by students as "fake," as most of the work for these courses can be completed with AI. Seminars remain robust.

4. Exam cheating is widespread. A cell phone is held in the student's lap. The camera is used to photograph what page of the exam faces down. OCR and AI provide an answer. The student flips the page and copies the answer. I have caught students doing this and awarded them a trip to the dean's office and a course grade of F.

5. It is understood that Grade Point Average (GPA) is not a strong signal of achievement, because for many courses, AI use results in a higher grade (and less understanding). Those who understand more, due to ethical attention to their education, often have lower GPAs than those who engineer high GPAs by taking the easiest, AI-vulnerable courses.

6. Mathematics and theory courses that rely on exams for the overwhelming majority of the grade, and which proctor those exams, retain their rigor and retain their value.

7. Students still land FAANG jobs at a reasonable rate. This school never strongly fed FAANG, and the percentage that attains such a position remains about 10% of graduates. Many other graduates land reasonable positions with startups, financial, automotive, logistics, security, etc. firms.

8. Overall student engagement and give-a-damn is circling the drain. Students routinely perform theatrics, such as responding to in-person class discussions by reading the output of their LLM. Students hauled in to discuss AI use on homework often have scripts prepared; to reveal this, it is a simple matter of forcing the student to deviate from the script.

9. New grad interviews seem to take two flavors: the first flavor is one where the new grad is interviewed by a bot. This is regrettable. The second is whiteboarding how a data structure or algorithm is applied to a specific problem. This is laudable.

But what about uni then?

A. Your nephews should attend to their theory courses heavily and avoid leaning on AI. They will not learn faster with AI use. They will reap benefits from understanding the theory of discrete mathematics, data structures, and algorithms. Even if their future as engineers involves heavy use of AI to generate code, understanding that theory will set them apart from their "peers" rather substantially.

ayanmali•15m ago
I'm a CS undergrad at a mid-tier school. My main observations with respect to AI:

- most students use AI to do pretty much all their labs and assignments. Most also use AI tools to help with studying for exams. Students seem pretty dependent on these tools at this point and their writing and coding skills without them have deteriorated substantially imo.

- Curriculums haven't changed much. Professors still put an emphasis on understanding theory and not just letting the LLM think for you.

- Almost every professor is vocally against the use of AI, whether it's for writing reports or generating code. Some are ok with using AI as a studying tool or for verifying that your solution to a homework problem is correct.

- A friend of mine was taking a course that's meant to be about building software in a practical context, agile, etc. The professor for that course encouraged students to use AI as much as possible for their projects, so I guess the permissibility of AI depends on whether the course is meant to be theoretical or practical.

- A lot of professors don't bother to take the time to explain why the material is relevant in the AI era, but a few do. The argument is that even if AI can physically write our code, real engineers still need to be able to verify that it works and that sort of thing. I think in general, professors want us to keep holding onto hope even if the future seems bleak. I had one professor tell us that engineers will likely be the last group of people to be replaced by AI, so having a thorough understanding of technical domains will still be important for years to come.

- From my point of view, it doesn't seem like students give much respect to the course content. The sentiment I'm seeing is that since AI can solve a lot of math and programming problems, acquiring a deep understanding of these domains is irrelevant.

- A lot of students feel overwhelmed with how competitive the tech job market is and it's seemingly all they think about. Any time I'm in the engineering building all I hear people talking about are interviews, internships, or their lack thereof.

- Students seem pretty divided as to whether they should be optimistic or pessimistic. Some think software engineering is already dead, some think it'll be the last profession to be replaced by AI.

- Some students are more willing to do things like side projects now that they have AI at their disposal. Most students don't seem to be fully up to date on the latest tools though. As of last fall, ChatGPT and Gemini were pretty ubiquitous, but only about ~20% of CS students (as a rough estimate) were using Cursor, and even fewer using tools like Codex and Claude Code, definitely less than 5%. I haven't been in school for the last few months so these numbers are likely higher now.

- Building a startup is trendier now than it was a year or two ago. Granted, it's a very small minority at my school, but noteworthy nonetheless.

bhouston•7m ago
I feel that AI moves so fast that the gap between its capabilities at the start of a year and at the end is pretty drastic. Remember that Claude Code is just a year old, and the significantly more capable agentic models only came out a few months ago.

Hard to deal with, I would expect.

My recommendation: focus on fundamentals that are timeless and apply at any level of AI:

- What algorithms are
- Theory of databases
- P, NP, etc.
- Computer architecture
- O-notation
- Why not to use classes
- Type theory
- And adjacent fields: mathematics, engineering, etc.
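To make the O-notation point concrete, here's a minimal Python sketch (my own illustration, not anything standardized) of the kind of timeless fundamental meant here: the same lookup done in O(n) versus O(log n). The asymptotic difference holds no matter who or what writes the code.

```python
def linear_search(items, target):
    """O(n): examine elements one by one until found."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1


def binary_search(items, target):
    """O(log n): halve the search space each step (items must be sorted)."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


data = list(range(1_000_000))
# Both find the element; binary search does it in ~20 comparisons
# instead of ~1,000,000.
assert linear_search(data, 999_999) == 999_999
assert binary_search(data, 999_999) == 999_999
```

A model can regenerate either function on demand, but knowing which one you should be asking for is exactly the fundamental that doesn't go stale.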

It is sort of like teaching computer graphics at the start of the video card era, 1996 to 2001. For about five years there was really rapid change: rendering went from CPU-based, to texturing on the card, to vertex transformations, to assembly programming on the GPU, to high-level languages (Cg, HLSL, etc.). But the fundamentals didn't change from 1996 to 2001, so focus on those.