The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
I've never been taught anything more clearly than the lessons from that class.
Glad that someone actually did this.
Jenkins, Docker, Kubernetes - none of these sorts of things. And I don't just mean these specific technologies; nothing even in their ballpark.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn’t even need to make a tool heavy project style software engineering course - you could legitimately make a real life computer science course that studies the algorithms and patterns and things used.
Would make a great course.
It would be easy for me to agree with you. I hold a graduate degree in computer science and I’m named for my contributions proofreading/correcting a graduate text about algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Yes, I very much think a computer science degree should stay as close to the theoretical foundations as possible. And even so, learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before adding courses on CI/CD.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field can't properly be called “theory”.
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
If it were me, I'd get rid of statically defined four-year programs, and/or fixed required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class focused on building some simple games with tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, the theory of programming languages, and the philosophy behind all of it that got us here.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second course of Java programming. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its own success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: CPU speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
This seems to fundamentally underestimate the nature of most artforms.
True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
Why? What problem did it solve that we're suffering from in 2025?
At 85 he has earned the peace of staying away from anything and everything on the internet.
And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.
> Computation has overwhelming dependence on the performance of its physical substrate [...].
Computation theory does not.
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
This was clearly true in 01970, but it's mostly false today.
It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.
Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.
Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.
From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.
Most things people do with computers today, and in particular the most important things, are things people have been doing with computers in nearly the same way for 30 years (if in smaller numbers), when the physical substrate was very different: 300 times slower, 300 times smaller, with a much smaller network.
Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.
A different reason to study the history of computing, though, is the sense in which your claim is true.
Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.
Then ImageNet changed everything, and now we're writing production code with agentic LLMs.
Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.
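As a toy sketch of the "decreasing the learning rate over time" trick mentioned a couple of paragraphs up (my own illustration in Python, not anything from the USPS system):

```python
# Toy perceptron with a decaying learning rate. A sketch of the idea only,
# not a real OCR pipeline; the data and schedule are made up.
import random

def train_perceptron(data, epochs=50, lr0=1.0, decay=0.05):
    """data: list of (features, label) pairs with label in {-1, +1}."""
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    for epoch in range(epochs):
        lr = lr0 / (1.0 + decay * epoch)   # step size shrinks each epoch
        random.shuffle(data)
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:        # misclassified: nudge the weights
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Tiny linearly separable example: label is +1 iff x0 > x1.
data = [((x0, x1), 1 if x0 > x1 else -1)
        for x0 in range(5) for x1 in range(5) if x0 != x1]
print(train_perceptron(data))
```

Shrinking the step size lets the early, large updates find the general direction and the later, small ones settle down instead of bouncing between the most recently seen examples.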
That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.
Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.
As opposed to ours, where we're fond of subjective regression. ;-P
Micro-economics is much more approachable with experiments etc.
Btw, I didn't suggest completely disregarding history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel-gazing and re-hashing like a good chunk of the philosophers do.
Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon
>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.
it's Alan Kay running in circles
How far we have fallen, but so great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.
https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
It gave a good idea of how Python is even remotely useful for AI.
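If the takeaway is how an interpreted language can be "remotely useful" for numeric work at all, my (possibly off-base) reading is that the heavy arithmetic gets pushed into native code. A minimal sketch, assuming NumPy is installed:

```python
# Sketch: the hand-written loop pays interpreter overhead on every element,
# while np.dot runs the same arithmetic inside optimized native code.
import numpy as np

def dot_pure_python(a, b):
    total = 0.0
    for x, y in zip(a, b):      # one interpreter round-trip per element
        total += x * y
    return total

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

print(dot_pure_python(a, b))    # slow path: interpreted loop
print(float(np.dot(a, b)))      # fast path: vectorized native code
```

The two results agree up to floating-point rounding; only how much work runs in the interpreter differs.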
Unlearning Object-Oriented Programming: a course on specific software engineering techniques
Classical Software Studies: a course on the history of software tools
Writing Fast Code in Slow Languages: a course on specific engineering techniques
User Experience of Command Line Tools: an engineering design course
Obsessions of the Programmer Mind: course about engineering conventions and tools.
One day, the name of science will not be so besmirched.
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
> CSCI 4020: Writing Fast Code in Slow Languages. Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP; everything in Python is an object, for example.
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
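To make the quoted course description concrete (a toy sketch of my own, not the article's example): the "fast code in a slow language" win usually comes from a high-level choice of algorithm and data structure, and in Python those data structures are themselves objects.

```python
# Same task twice: find which items in xs also appear in ys.
def common_items_slow(xs, ys):
    # Rescans the whole list ys for every x: O(n*m) comparisons.
    return [x for x in xs if x in ys]

def common_items_fast(xs, ys):
    # One pass to build a hash set, one pass to filter: O(n+m) work.
    ys_set = set(ys)
    return [x for x in xs if x in ys_set]

xs = list(range(0, 10_000, 2))
ys = list(range(0, 10_000, 3))
assert common_items_slow(xs, ys) == common_items_fast(xs, ys)
```

No amount of micro-optimizing the first version's loop closes the gap the second version gets from a better data structure, which is the kind of high-level analysis the quoted course seems to be about.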
> Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.
Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.
Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.
(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)
Do you have a moment to talk about our saviour, Lord interactive debugging?
but 90s fashion is all the rage these days!
I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.
This should exist and the class should study openssl.
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)
Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)
CS103 Methodologies: Advanced Hack at it ‘till it Works
CS103 History: Fashion, Buzzwords and Reinvention
CS104 AI teaches Software Architecture (CS103 prerequisite)
Systems Engineering 101/201/301/401: How to design a computer system to be reliable
Security Engineering 101/201/301/401: How security flaws happen and how to prevent them
Conway's Law 101/201: Why the quality of the software you write is less important than your org chart
The Real DevOps 101/201/301: Why and how to simultaneously deliver software faster, with higher quality, and fewer bugs
Old And Busted 101/201: The antiquated patterns developers still use, why they're crap, what to use instead
Thinking Outside the Box 101: Stupid modern designs and why older ones are better
New Technology 101: The new designs that are actually superior and why
Project Management 101/201/301: History of project management trends, and how to manage any kind of work
Managing for Engineers 101/201/301: Why and how to stop trying to do everything, empowering your staff, data-driven continuous improvement
Quality Control 101/201: Improving and maintaining quality
Legal Bullshit 101/201: When you are legally responsible and how not to step in it

Is this a good place to complain about tools with long option names that only accept a single dash? I'm thinking of you, `java -jar`.
I think OOP became popular because it feels profound when you first grasp it. There is that euphoric moment when all the abstractions suddenly interlock, when inheritance, polymorphism, and encapsulation seem to dance together in perfect logic. It feels like you have entered a secret order of thinkers who understand something hidden. Each design pattern becomes a small enlightenment, a moment of realization that the system is clever in ways that ordinary code is not.
But if you step back far enough, the brilliance starts to look like ornament. Many of these patterns exist only to patch over the cracks in the paradigm itself. OOP is not a natural way of thinking, but a habit of thinking that bends reality into classes and hierarchies whether or not they belong there. It is not that OOP is wrong, but that it makes you mistake complexity for depth.
Then you encounter functional programming, and the same transformation begins again. It feels mind expanding at first, with the purity of immutable data, the beauty of composability, and the comfort of mathematical certainty. You trade one set of rituals for another: monads instead of patterns, recursion instead of loops, composition instead of inheritance. You feel that familiar rush of clarity, the sense that you have seen through the surface and reached the essence.
But this time the shift cuts deeper. The difference between the two paradigms is not just structural but philosophical. OOP organizes the world by binding behavior to state. A method belongs to an object, and that object carries with it an evolving identity. Once a method mutates state, it becomes tied to that state and to everything else that mutates it. The entire program becomes a web of hidden dependencies where touching one corner ripples through the whole. Over time you code yourself into a wall. Refactoring stops being a creative act and turns into damage control.
Functional programming severs that chain. It refuses to bind behavior to mutable state. Statelessness is its quiet revolution. It means that a function’s meaning depends only on its inputs and outputs. Nothing else. Such a function is predictable, transparent, and portable. It can be lifted out of one context and placed into another without consequence. The function becomes the fundamental atom of computation, the smallest truly modular unit in existence.
That changes everything. In functional programming, you stop thinking in terms of objects with responsibilities and start thinking in terms of transformations that can be freely composed. The program stops feeling like a fortress of interlocking rooms and begins to feel like a box of Lego bricks. Each function is a block, self-contained, perfectly shaped, designed to fit with others in infinitely many ways. You do not construct monoliths; you compose arrangements. When you need to change something, you do not tear down the wall. You simply reassemble the bricks into new forms.
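A toy sketch of that contrast (nothing more than a made-up price calculation, in Python):

```python
# Stateful style: the answer depends on what has happened to the object.
class Discounter:
    def __init__(self):
        self.rate = 0.0            # hidden state that other code can change

    def set_rate(self, rate):
        self.rate = rate

    def price(self, amount):
        return amount * (1 - self.rate)

# Pure style: the answer depends only on the arguments.
def discounted(amount, rate):
    return amount * (1 - rate)

def with_tax(amount, tax=0.08):
    return amount * (1 + tax)

# The pure pieces compose like bricks, and any order of assembly is explicit.
total = with_tax(discounted(100.0, 0.10))
print(total)   # 97.2, regardless of what ran before this line
```

With the class, price(100.0) means different things depending on who called set_rate last; with the functions, the same call always means the same thing, so the bricks can be rearranged without tracking history.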
This is the heart of functional nirvana: the dream of a codebase that can be reorganized endlessly without decay. Where every part is both independent and harmonious, where change feels like play instead of repair. Most programmers spend their careers trying to reach that state, that perfect organization where everything fits together, but OOP leads them into walls that cannot move. Functional programming leads them into open space, where everything can move.
Reality will always be mutable, but the beauty of functional programming is that it isolates that mutability at the edges. The pure core remains untouched, composed of functions that never lie and never change. Inside that core, every function is both a truth and a tool, as interchangeable as Lego bricks and as stable as mathematics.
So when we ask which paradigm handles complexity better, the answer becomes clear. OOP hides complexity behind walls. Functional programming dissolves it into parts so small and transparent that complexity itself becomes optional. The goal is not purity for its own sake, but freedom; the freedom to recompose, reorganize, and rethink without fear of collapse. That is the real enlightenment: when your code stops feeling like a structure you maintain and starts feeling like a universe you can endlessly reshape.
No, it's about creating something that does something useful and is easy to maintain. Plumbers have great ideas and approaches, but you just want plumbing that works and can be fixed.
It's time developers realised they are plumbers not **** artists.
If we're plumbers, why all the memory leaks?
Or... are we also bad plumbers?
Although, to be fair, while it was helpful practice at coding, I'm not a game designer, so it was a game too terrible to play.
First year Uni though I spent too many hours in the lab, competing with friends, to recreate arcade games on the PC. Skipping the game design part was helpful. To be fair by then we had a glorious 640k of ram. Some Assembly required.