I did take one of the MIT intro CS MOOCs at one point for kicks. Very good. But it was more or less learn Python on your own if you don't already know it (or don't already know how to program more broadly). That expectation doesn't really exist in many other disciplines, outside of some areas of the arts.
Now, I'm sure some would argue "tough." What are you doing at MIT then? And certainly, there are SO many opportunities these days to get some grounding in a way that may not be as readily possible with chemistry much less nuclear engineering for example. But it is something I think about now and then.
I'm also a CS guy, so I can't directly challenge this on the whole, but my experiences in some classes outside CS in other domains didn't feel 'comfortably' paced at all. Without extensive out-of-class work I'd have been completely lost in no time. In fact, one electrical engineering course I took was ironically considered a weed-out course for computer science (it was required), and was probably the most brutal (and amazing) class I've ever taken in my life.
I had basically a machine shop course in mechanical engineering in college. OK, it was a bit more than that but I had no "shop" in high school.
Certainly didn't really do anything that would have prepared me for a civil engineering or chemical engineering degree.
I had actually done a little bit of fiddling around with electronics (and maybe should have majored in that). But certainly college would have been a whole different level. (With a whole lot more math which was never my strong suit.)
So, yeah, these days I think there's a different baseline assumption for CS/programming than many other majors.
Git, shell, the basics... even simple Python, if you have any programming experience at all, is not nearly as hard as what they're teaching in the class.
Most of the time, with something like that (learning LaTeX or Git basics), they'll just say you'll pick up what you need. They're not gonna spend 12 weeks on those subjects; they aren't hard enough.
Of course, you were struggling with fairly primitive tools at the time as well. Made a typo? Time to beg the grad students running the facility for some more compute cycles.
Although it's out of print, I don't immediately see a full copy online. https://www2.seas.gwu.edu/~kaufman1/FortranColoringBook/Colo...
I thought that was pretty strange at the time because only about 5% of the students end up going into research. So that was basically like him saying he's totally cool with our educational program being misaligned for 95 percent of our customers...
Maybe it makes sense for the big picture though. If all the breakthroughs come from those 5%, it might benefit everyone to optimize for them. (I don't expect they would have called the program particularly optimized either though ;)
But it's also the case that (only half-joking) a lot of faculty at research universities regard most undergrads as an inconvenience at best.
I think there are a number of ways in which financial incentives and University culture are misaligned with this reality.
A chemistry, physics, or even MechE BS is coming out only at the very beginning of their training, and will require lots of specific on-the-job training if they go into industry. School is about the principles of the field and how to think critically / experimentally. E.g. software debugging requires an understanding of hypothesis testing and isolation before the details of specific tech ever come into play. This is easy to take for granted because many people have that skill naturally, others need to be trained and still never quite get it.
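To make the hypothesis-testing point concrete, here's a hypothetical sketch (names and numbers are mine, not from the comment above): isolating a bug by binary-searching for the first failing version, the same logic `git bisect` automates.

```python
def first_bad(is_bad, lo, hi):
    """Binary-search for the first index in [lo, hi) where is_bad(i) is True.

    Assumes is_bad is monotone: all False up to some point, then all True.
    Each probe is a hypothesis ("the bug is in the first half") that the
    test either confirms or refutes, halving the search space every time.
    """
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(mid):
            hi = mid          # bug was introduced at mid or earlier
        else:
            lo = mid + 1      # bug was introduced after mid
    return lo

# Toy example: versions 0-6 pass, version 7 onward fail.
print(first_bad(lambda v: v >= 7, 0, 20))  # → 7
```

Twenty versions take about five probes instead of twenty, which is the "understanding of hypothesis testing and isolation" doing the work, independent of any specific tech stack.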
Edit: of course if only 5% of grads are going on to research then maybe the department is confused. A lot of prestigious schools market themselves as research institutions and advertise the undergrad research opportunities etc. If you choose to go there then you know what you're getting into.
This. I went to the University of Iowa in the aughts. My experience was that because they didn't cover a lot of the same material in this MIT Missing Semester 2026 list, a lot of the classes went poorly. They had trouble moving students through the material on the syllabus because most students would trip over these kinds of computing basics that are necessary to experiment with the DS+A theory via actual programming. And the department neither added a prereq covering these basics nor incorporated them into other courses' syllabi. Instead, they kept trying what wasn't working: having a huge gap between the nominal material and what the average student actually got (but somehow kept going on to the next course). I don't think it did any service to anyone. They could have taken time to actually help most students understand the basics, they could have proceeded at a quicker pace through the theoretical material for the students who actually did understand the basics, they could have ensured their degree actually was a mark of quality in the job market, etc.
It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.
You didn't take those majors at MIT or Stanford to learn to code COBOL or C or Java, although in upper division classes, you may have started learning how those languages worked and may have begun working on advanced-theory projects like compilers. Students in those programs were understood to either be future academics or auto-didactic enough to pick up the vocational skills on their own. (Admittedly: this did mean that a lot of top tier CS grads were pretty awful software engineers)
It's only in the last couple decades that these programs, and the less prestigious ones that try to emulate them, started reluctantly admitting that students and administrators saw them as vocational schools for getting high-paying jobs in tech the year after graduation. Many of their senior professors and alums were not thrilled about that.
I'm surprised that you found your professor's position unfamiliar or confounding, because it's still pretty widely held in departments/programs that specifically identify as "Computer Science" rather than Software Engineering or Computer Engineering or whatever.
But my general sense based on some level of connections is you're expected to figure out a lot of, for lack of a better term, practicalities on your own. I don't think there's a lot of hand-holding in many cases--probably more so in some domains than others.
Unfortunately I heard that class was retired and there was no direct replacement, which is a shame. It was an excellent crash course in shipping.
So the word on the street was that his was a good class to take if you wanted a chance to learn the programming language. (Because you have only so much time in the day to allocate to labs.)
And the rumor was also not to tell the professor that you wanted to learn that language, because word had gotten back to him about the off-label draw of his class, and he didn't like it.
- Universities being cost-prohibitive to 90 percent of all humans, as financially driven institutions rather than performance-driven ones.
- Before AI, 20+ years of Google data indexing/search fueling academia.
- Study groups before that, allowing group completion (or cheating, in your view).
- The textbook that costs 500 dollars, or the textbook software from Pearson that costs 500 dollars and comes with the homework answers.
I think it's a silly posit that students using AI is...anything to even think about. I use it at my Fortune 500 job every day, and have learned more about my field's practical day-to-day from it than from any textbook, homework assignment, practical, etc.
For similar reasons I think arts and humanities students should take marketing and business courses.
The lectures were primarily about algorithms, basic data structures, etc., and the extra "labs", taught by teaching assistants, were almost always for reviewing the lecture notes with a focus on answering questions.
At no point was there any discussion around "hey, here is a good way to design, write and test a program in a compiled language". My prior experience was with BASIC so just figuring out how to compile a program was a skill to pick up. I thankfully picked it up quickly but others struggled.
Another thing I saw often was people writing ENTIRE programs and then trying to compile them and getting "you have 500 compilation errors". I never wrote programs this way, I was more "write a couple lines, compile, see what happens etc" but it always struck me that even just suggesting that option in class would have helped a lot of folks.
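That incremental loop can be sketched even in Python, where the built-in `compile()` stands in for the compiler; the snippet and function name here are hypothetical, just to illustrate checking each small addition before writing more:

```python
# "Write a couple lines, compile, see what happens": check each small
# addition immediately so errors surface one at a time, not 500 at once.
snippet = (
    "def area(r):\n"
    "    return 3.14159 * r * r\n"
)
code = compile(snippet, "<draft>", "exec")  # fails fast on any syntax error
namespace = {}
exec(code, namespace)                       # run what was just written
print(namespace["area"](2))                 # quick smoke test before moving on
```

A typo in `snippet` would raise a `SyntaxError` right here, against two lines of code instead of two hundred, which is the whole point of the habit.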
(This being HN, I'm sure some people will say that students figuring this stuff out on their own helps weed out non-serious people but I still don't 100% buy that argument)