The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
I've never been taught anything more clearly than the lessons from that class.
Glad that someone actually did this.
Jenkins, Docker, Kubernetes - none of these sorts of things. And I don't even mean these specific technologies; nothing even in their ballpark.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn't even need to make a tool-heavy, project-style software engineering course - you could legitimately make a real computer science course that studies the algorithms and patterns involved.
Would make a great course.
It would be easy for me to agree with you. I hold a graduate degree in computer science, and I'm credited for my contributions proofreading and correcting a graduate text on algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Yes, I very much think a computer science degree should be as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before adding courses on CI/CD.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but a vast majority of work in that field can not properly constitute “theory”
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
If it were me, I'd get rid of statically defined 4-year programs, and/or fixed required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class that focused on building some simple games with tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, the theory of programming languages, and the philosophy behind all of it that got us here.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
a) Practical CI/CD requires an understanding of (and some practical experience with) many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience with these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore the CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second Java programming course. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this material, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its own success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: CPU speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
This seems to fundamentally underestimate the nature of most artforms.
True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
Why? What problem did it solve that we're suffering from in 2025?
At 85 he has earned the peace of staying away from anything and everything on the internet.
And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.
> Computation has overwhelming dependence on the performance of its physical substrate [...].
Computation theory does not.
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.
As opposed to ours, where we're fond of subjective regression. ;-P
Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon
>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.
it's Alan Kay running in circles
https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
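For anyone who hasn't seen that effect first-hand, here's a rough Python sketch (not the book's BASIC, and only my guess at the kind of comparison involved): the better algorithm in a "slow" language overtakes the brute-force approach at fairly modest sizes.

    # Hypothetical illustration: algorithmic choice beats raw machine speed.
    # Count how many values appear in both lists, two ways.
    import random
    import time

    def common_quadratic(xs, ys):
        # O(n*m): "fast machine, slow algorithm" - 'in' on a list is a linear scan
        return sum(1 for x in xs if x in ys)

    def common_hashed(xs, ys):
        # O(n+m): "slow language, better algorithm" - one pass over each list
        ys_set = set(ys)
        return sum(1 for x in xs if x in ys_set)

    n = 20_000  # a pretty pedestrian size by modern standards
    xs = [random.randrange(10 * n) for _ in range(n)]
    ys = [random.randrange(10 * n) for _ in range(n)]

    for fn in (common_quadratic, common_hashed):
        start = time.perf_counter()
        result = fn(xs, ys)
        print(f"{fn.__name__}: {result} matches in {time.perf_counter() - start:.3f}s")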
Unlearning Object-Oriented Programming: a course on specific software engineering techniques
Classical Software Studies: a course on the history of software tools
Writing Fast Code in Slow Languages: a course on specific engineering techniques
User Experience of Command Line Tools: an engineering design course
Obsessions of the Programmer Mind: a course on engineering conventions and tools
One day, the name of science will not be so besmirched.
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
> CSCI 4020: Writing Fast Code in Slow Languages. Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP; everything is an object in Python, for example.
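To make the "everything is an object" point concrete, here's a minimal demo of standard Python behavior (nothing specific to OP's post, just an illustration):

    # Standard Python: numbers, functions, and classes are all objects.
    def greet(name):
        return f"hello, {name}"

    print(isinstance(3, object))      # True - ints are objects
    print(isinstance(greet, object))  # True - functions are objects too
    print(type(3).__mro__)            # (<class 'int'>, <class 'object'>)
    print(greet.__name__)             # objects carry attributes: 'greet'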
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
> Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.
Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.
Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.
(However, CSCI 2100 shouldn't be necessary if you learn things other than OOP the first time around, even if you also learn OOP.)
but 90s fashion is all the rage these days!
I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.
This should exist, and the class should study OpenSSL.
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)
Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)
Although, to be fair, while it was helpful coding practice, I'm not a game designer, so the game itself was too terrible to play.
In first-year uni, though, I spent too many hours in the lab, competing with friends, recreating arcade games on the PC. Skipping the game design part was helpful. To be fair, by then we had a glorious 640K of RAM. Some Assembly required.