The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.
It was a surprisingly effective course.
(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)
I've never been taught anything more clearly than the lessons from that class.
Glad that someone actually did this.
(I didn't believe him at the time, but in some ways he really didn't go far enough...)
Jenkins, Docker, Kubernetes - none of these sorts of things. And I don't even mean these specific technologies; nothing even in their ballpark.
Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.
There are enough interesting things here that you wouldn’t even need to make a tool heavy project style software engineering course - you could legitimately make a real life computer science course that studies the algorithms and patterns and things used.
Would make a great course.
It would be easy for me to agree with you. I hold a graduate degree in computer science, and I'm credited by name for my contributions proofreading and correcting a graduate text about algorithms in computability theory.
I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?
Yes, I very much think a computer science degree should stay as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even taking a general course on how to effectively push code, is still far from the things you listed.
There's so much computer science that isn't even covered that I'd include before adding courses on CI/CD.
AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field cannot properly constitute "theory".
I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.
If it were me, I'd get rid of statically defined 4-year programs, and/or fixed required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.
One of my favorite classes was a Python class that focused on building some simple games with tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.
On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, theory of programming languages, and the philosophy behind all of it that got us here.
The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.
a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.
b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.
¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.
It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!
I was a CS professor at a local college. My solution was to ignore CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second course of Java programming. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.
The course was a victim of its success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.
If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).
Huh. As a professor, I would not be able to grade this kind of volume in any serious capacity. Especially since proofs need to be scrutinized carefully for completeness and soundness. I wonder how their instructor manages.
Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.
The effective history of computing spans a lifetime or three.
There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.
Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessibly via digital to analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.
This seems to fundamentally underestimate the nature of most artforms.
True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.
My dictionary absolutely implies that; it even claims that all the sciences were split off from Philosophy and that a common modern topic of Philosophy is the theory of science. The point of Philosophy is to define truth in all aspects; how is that not science? It's even in the name: "friend of truth". Philosophy is even more fundamental and formal than mathematics. Mathematics asks what sound systems are, what properties they have, and how they can be generalized. Philosophy asks what something truly is, what it means to know, what it means to have a system, and whether it's real. The common trope of going ever more fundamental/abstract goes: "biology -> chemistry -> physics -> mathematics -> philosophy".
I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.
You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.
(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)
Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.
Why? What problem did it solve that we're suffering from in 2025?
At 85 he has earned the peace of staying away from anything and everything on the internet.
The easy example I used to use to really blow people's minds on what was possible was Mathematica.
That is to say, it isn't so much lack of knowledge of history. It is lack of knowledge of the present. And a seeming unwillingness to want to pay for some things from a lot of folks.
Isn't that pretty much how things like simulink and gnu radio flowgraphs work?
And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.
> Computation has overwhelming dependence on the performance of its physical substrate [...].
Computation theory does not.
Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).
Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!
This was clearly true in 01970, but it's mostly false today.
It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.
Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.
Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.
From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.
The same thing had happened with still 2-D monochrome graphics in the 01980s; that was the desktop publishing revolution. Before that, you had to learn to program to make graphics on a computer, and the graphics were strongly constrained by the physical substrate. But once the physical substrate was good enough, further improvements didn't open up any new possible expressions. You can print the same things on a LaserWriter from 01985 that you can print on the latest black-and-white laser printer. The dependence on the physical substrate has been severed.
For things you can do with ASCII text without an LLM, the cut happened even earlier. That's why we still format our mail with RFC-822, our equations with TeX, and in some cases our code with Emacs, all of whose original physical substrate was a PDP-10.
Most things people do with computers today, and in particular the most important things, are things people have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, a much smaller network.
Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.
A different reason to study the history of computing, though, is the sense in which your claim is true.
Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.
Then ImageNet changed everything, and now we're writing production code with agentic LLMs.
Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.
See https://longnow.org/ideas/long-now-years-five-digit-dates-an...
But I hate it. It makes most readers stumble over the dates, and it does so to grind an axe that is completely unrelated to the topic at hand.
That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.
But the rhyme and reason for who is known is not at all obvious. Outside of "who is getting marketed."
I will agree that the "greats" seem to tend to know all of this. Such that I think I'm agreeing with your parenthetical there. But most practitioners?
I owned at one time a wonderful two-volume anthology called Actors on Acting, which collected analysis and memoir and advice going back... gosh, to Roman theatre, at least. (The Greeks were more quasi-religious, and therefore mysterious - or maybe the texts just haven't survived. I can't remember reading anything first-hand, but there has been a good deal of experimental "original practice" work done exploring "how would this have worked?"). My graduate scholarship delved into Commedia dell'Arte, and classical Indian theatre, as well as 20th century performers and directors like Grotowski, and Michael Chekhov, and Joan Littlewood. Others, of course, have divergent interests, but anyone I've met who cares can geek out for hours about this stuff.
However, acting (or, really, any performance discipline), is ephemeral. It invokes a live experience, and even if you (and mostly you don't, even for the 20th c) have a filmed version of a seminal performance it's barely anything like actually being there. Nor, until very recently, did anyone really write anything about rehearsal and training practice, which is where the real work gets done.
Even for film, which coincidentally covers kinda the same time-period as "tech" as you mean it, styles of performance - and the camera technology which enables different filming techniques - have changed so much, that what's demanded in one generation isn't much like what's wanted in the next. (I think your invocation of film directors is more apt: there are more "universal" principles in composition and framing than there are in acting styles.)
Acting is a personal, experiential craft, which can't be learned from academic study. You've got to put in hours of failure in the studio, the rehearsal room, and the stage or screen to figure out how to do it well.
Now, here's where I'll pull this back to tech: I think programming is like that, too. Code is ephemeral, and writing it can only be learned by doing. Architecture is ephemeral. Tooling is ephemeral. So, yes: there's a lot to be learned (and should be remembered) from the lessons left by previous generations, but everything about the craft pulls its practitioners in the opposite direction. So, like, I could struggle through a chapter of Knuth, or I could dive into a project of my own, and bump up against those obstacles and solve them for myself. Will it be as efficient? No, but it'll be more immediately satisfying.
Here's another thing I think arts and tech have in common: being a serious practitioner is seldom what gets the prize (if by that you mean $$$). Knuth's not a billionaire, nor are any of my favorite actors Stars. Most people in both disciplines who put in the work for the work's sake get out-shined by folks lucky enough to be in the right place at the right time, or who optimize for hustle or politics or fame. (I've got no problem with the first category, to be clear: god bless their good fortune, and more power to them; the others make me sad about human nature, or capitalism, or something.) In tech, at least, pursuing one's interest is likely to lead to a livable wage - but let's see where our AI masters leave us all in a decade, eh?
Anyway, I've gone on much too much, but you provoked an interesting discussion, and what's the internet for if not for that?
Given the pace of CS (like you mentioned) 50 years might as well be centuries and so early computing devices and solutions are worth studying to understand how the technology has evolved and what lessons we can learn and what we can discard.
That's absolutely false. Do you know why MCM furniture is characterized by bent plywood? It's because we developed the glues that enabled this during World War II. In fashion you had a lot more colors beginning in the mid-1800s because of the development of synthetic dyes. Really odd that oil paints were perfected around Holland (a major place for flax and thus linseed oil), which is what the Dutch masters _did_. Architectural McMansions began because of the development of pre-fab roof trusses in the 70s and 80s.
How about philosophy? Well, the industrial revolution and its consequences have been a disaster for the human race. I could go on.
The issue is that engineers think they're smart and can design things from first principles. The problem is that they're really not, and they design things from first principles anyway.
Studying history is not just, or even often, a way to rediscover old ways of doing things.
Learning about the people, places, decisions, discussions, and other related context is of intrinsic value.
Also, what does "material substrate" have to do with history? It sounds here like you're using it literally, in which case you're thinking like an engineer and not like a historian. If you're using it metaphorically, well, art and philosophy are absolutely built on layers of what came before.
Consider transport. Millennia ago, before the domestication of the horse, the fastest a human could travel was by running. That's a peak of about 45 km/h, but around 20 km/h sustained over a long distance for the fastest modern humans; it was probably a bit less then. Now that's about 900 km/h for commercial airplanes (45x faster) or 3500 km/h for the fastest military aircraft ever put in service (178x faster). Space travel is faster still, but so rarely used for practical transport I think we can ignore it here.
My current laptop, made in 2022 is thousands of times faster than my first laptop, made in 1992. It has about 8000 times as much memory. Its network bandwidth is over 4000 times as much. There are few fields where the magnitude of human technology has shifted by such large amounts in any amount of time, much less a fraction of a human lifespan.
Dude, watch the original Star Trek from the 1960s; you will be surprised.
You might also be surprised that all the AI stuff that is so hyped nowadays was already invented in the 1960s; they just didn't have our hardware to run large models. Read up on neural networks.
"Computer" goes back to 1613 per https://en.wikipedia.org/wiki/Computer_(occupation)
https://en.wikipedia.org/wiki/Euclidean_algorithm was 300 BC.
https://en.wikipedia.org/wiki/Quadratic_equation has algorithms back to 2000 BC.
Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.
As opposed to ours, where we're fond of subjective regression. ;-P
Micro-economics is much more approachable with experiments etc.
Btw, I didn't suggest to completely disregard history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel gazing and re-hashing like a good chunk of the philosophers do.
Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon
>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.
it's Alan Kay running in circles
In my observation the problem rather is that many of the people who want to "learn" computer science actually just want to get a certification to get a cushy job at some MAGNA company, and then they complain about the "academic ivory tower" stuff that they learned at the university.
So, the big problem is not the lack of competent educators, but practitioners actively sabotaging the teaching of topics that they don't consider to be relevant for the job at a MAGNA company. The same holds for the bigwigs at such companies.
I sometimes even see the conspiracy theory that, if a lot of graduates saw that what their work at these MAGNA companies involves is often decades old in the history of computer science and has been repeated multiple times over the decades, this might demotivate the employees, who are supposed to believe that they work on the "most important, soon to be world changing" thing.
How far we have fallen, but so great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.
It's funny, my impression had been that Mansfield was legislating from a heavily right-wing standpoint, of "Why are we tax-and-spending into something that won't help protect the sanctity of my private property?" I was pleased to see the motivations were different. The fact that so few have articulated why they invoke Mansfield underscores why I was able to adopt such an orthogonal interpretation of Mansfield's amendment prior to this discussion. All the best AfterHIA. -ricksunny
Q: "Sooo... what does this do that Ansible doesn't?"
A: "I've never heard of Ansible until now."
Lots of people think they are the first to come across some concept or need. Like every generation when they listen to songs with references to drugs and sex.
I think software engineering has so many social problems, at a level that other fields just don't have. Dogmatism, superstition, toxicity... you name it.
You must not know any electricians. These behaviors are far from unique to one field.
Software development/programming is a field where the importance of planning and design lies somewhere between ignored and outright despised. The role of software architect is both ridiculed and vilified, whereas the role of the brave solo developer is elevated to the status of hero.
What you get from that cultural mix is a community that values ad-hoc solutions made up on the spot by inexperienced programmers who managed to get something up and running, and at the same time is hostile towards those who take the time to learn from history and evaluate tradeoffs.
See for example the cliche of clueless developers attacking even the most basic aspects of software architecture such as the existence of design patterns.
With that sort of community, how does anyone expect to build respect for prior work?
I see that more from devs in startup culture, or those shipping products at a software-only company.
It is a very different mindset when software is not the main business of a company, or in consulting.
I think one of the problems is that if someone uses a word, one still does not know what it means. A person can say 'design patterns' and what he is actually doing is a very good use of them that really helps to clarify the code. Another person can say 'design patterns' and is busy creating an overengineered mess that is not applicable to the actual situation where the program is supposed to work.
I think assessing a piece of hardware/software is much more difficult and time-consuming than assessing art. So there are no people with really broad experience.
https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...
The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
So we can estimate the Cray's scalar performance at 400× the TRS-80's. On that assumption, Quicksort on the TRS-80 beats the Cray somewhere between 10000 items and 100_000 items. This probably falsifies the claim—10000 items only fits in the TRS-80's 48KiB maximum memory if the items are 4 bytes or less, and although external sorting is certainly a thing, Quicksort in particular is not well-suited to it.
But wait, BASIC on the TRS-80 was interpreted. I haven't benchmarked it, but I think that's about another factor of 40 performance loss. In that case the crossover isn't until between 100_000 and 1_000_000 items.
So the claim is probably wrong, but close to correct. It would be correct if you replaced the TRS-80 with a slightly faster microcomputer with more RAM, like the Apple IIGS, the Commodore 128, or the IBM PC-AT.
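To make the arithmetic concrete, here is a rough sketch of that model in Python (the 400x Cray speedup and the 40x interpreter penalty are the assumptions above; the sorts' own constant factors are ignored). With the penalty included it puts the crossover between 100,000 and 1,000,000 items, as estimated:

    # Back-of-the-envelope model of the claim above. The 400x Cray speedup and
    # the 40x interpreted-BASIC penalty are assumptions from this comment;
    # the sorts' own constant factors are ignored.
    import math

    CRAY_SPEEDUP = 400      # Cray scalar ops per TRS-80 machine-code op
    BASIC_PENALTY = 40      # extra slowdown for interpreted BASIC

    def cray_bubble_cost(n):        # O(n^2) sort running on the Cray
        return n * n / CRAY_SPEEDUP

    def trs80_quicksort_cost(n):    # O(n log n) sort in BASIC on the TRS-80
        return BASIC_PENALTY * n * math.log2(n)

    for n in (10_000, 100_000, 1_000_000):
        winner = "TRS-80 wins" if trs80_quicksort_cost(n) < cray_bubble_cost(n) else "Cray wins"
        print(n, winner)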
Gave a good idea of how python is even remotely useful for AI.
The reason is very simple: Python takes longer for a few function calls than Java takes to do everything. There's nothing I can do to fix that.
I wrote a portion of code that just takes a list of 170-ish simple functions and runs them, and they are such that it should be parallelizable, but I was rushing and just slapped the boring serialized version into place to get things working. I'll fix it when we need to be faster, I thought.
The entire thing runs in a couple nanoseconds.
So much of our industry is writing godawful interpreted code and then having to do crazy engineering to get stupid interpreted languages to go a little faster.
Oh, and this was before I fixed it so the code didn't rebuild a constant regex pattern 100k times per task.
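As for that regex fix, it's the classic hoist-the-compile-out-of-the-loop change; a minimal sketch, with a made-up pattern and function names rather than my actual code:

    import re

    # Before: rebuilds/looks up the same constant pattern on every call.
    def count_words_slow(line):
        return len(re.findall(r"[A-Za-z]+", line))

    # After: compile the constant pattern once, reuse it everywhere.
    WORD_RE = re.compile(r"[A-Za-z]+")

    def count_words_fast(line):
        return len(WORD_RE.findall(line))

(CPython does cache compiled patterns internally, so the win is smaller than you might expect unless the pattern string is being rebuilt each time, which is roughly what was happening here.)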
But our computers are so stupidly fast. It's so refreshing to be able to just write code and it runs as fast as computers run. The naive, trivial to read and understand code just works. I don't need a PhD to write it, understand it, or come up with it.
There are many cases where O(n^2) will beat O(n).
Utilising the hardware can make a bigger difference than algorithmic complexity in many cases.
Vectorised code on linear memory vs unvectorised code on data scattered around the heap.
Premature optimisation is a massive issue: spending days working on finding a better algorithm is often not worth the time spent, since the worse algorithm was plenty good enough.
The real world beats algorithmic complexity many, many times, because you spent ages building a complex data structure with allocations scattered all over the heap to get O(n), while it's significantly faster to just do the stupid thing that sits in linear memory.
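A crude way to see it, even in Python where everything is boxed (the names below are made up for this sketch; in a compiled language with real vectorisation the gap is much wider):

    import array, time

    N = 1_000_000
    flat = array.array("q", range(N))          # contiguous buffer of 64-bit ints

    class Node:                                # heap-scattered linked list
        __slots__ = ("value", "next")
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    head = None
    for v in reversed(range(N)):
        head = Node(v, head)

    t0 = time.perf_counter(); total_flat = sum(flat); t1 = time.perf_counter()

    total_list, node = 0, head
    while node is not None:                    # pointer chasing through the heap
        total_list += node.value
        node = node.next
    t2 = time.perf_counter()

    assert total_flat == total_list
    print("contiguous:", t1 - t0, "linked:", t2 - t1)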
Unlearning Object-Oriented Programming: a course on specific software engineering techniques
Classical Software Studies: a course on the history of software tools
Writing Fast Code in Slow Languages: a course on specific engineering techniques
User Experience of Command Line Tools: an engineering design course
Obsessions of the Programmer Mind: course about engineering conventions and tools.
One day, the name of science will not be so besmirched.
This is just a common meme that often comes from ignorance, or a strawman of what OOP is.
> CSCI 4020: Writing Fast Code in Slow Languages. Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.
I like this one, but see?
Python is heavily OOP; everything in Python is an object, for example.
I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
I strongly disagree. How is everything being called an object in any way "heavily OOP"? OOP is not just "I organize my stuff into objects".
You can write OOP code with python but most python code I've seen is not organized around OOP principles.
Do I need to spell it out? The O in OOP stands for object. Everything is an object, therefore it is object oriented. It's not much more complex than that, man.
And I don't mean that it supports users writing OOP code; I mean that the language, interpreter, and standard library are themselves written with OOP. Inheritance? Check. Classes? Check. Objects? Check. Even classes are objects, instances of a metaclass.
Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.
Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.
I have seen a disconnect between what is covered in ethics classes and the types of scenarios students will encounter in the working world. My (one) ethics class was useless. But not political even with the redrawn ethical map of the Trump era.
I'm not really sure how you could totally separate politics (forming laws) from ethics anyway.
Other points to note: all AI will always have bias; a) because neural networks literally have a constant called 'bias' in them; and b) training a prediction algorithm means it is being trained to be biased, and running a clustering algorithm means lumping commonly themed attributes together.
If you mean "BIAS" as in "RACE/ETHNICITY/SOCIOECONOMIC_STATUS", then most groups already do this, state they do it, and still deal with the general public not believing them.
If you mean "BIAS" as in avoiding "RACE/ETHNICITY/SOCIOECONOMIC_STATUS", most groups state they do not use them for decision making and still deal with the "general public" not believing them because results do not support their opinions.
This isn't a "Trump Era"/"Right Wing Talking Point" amigo, its the truth; also it's always been this way, listen to some punk rock. Is AI allowed to generate child pornography? Because that is currently being argued as protected by Free Speech in courts because no child was actually involved [1]. Sorry your ethics class was useless, I don't believe mine is.
[1] https://www.usatoday.com/story/news/nation/2024/11/21/pennsy...
> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.
Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.
Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.
(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)
Which of course people do and why of course you have:
> PSYC 4410: Obsessions of the Programmer Mind
Do you have a moment to talk about our saviour, Lord interactive debugging?
It's old but reliable.
A similar course in CS would give each student a legacy codebase with a few dozen bugs and performance / scaling problems. When the code passes all unit and integration tests, the course is complete.
Repeat for 2 years. Then later on, my Systems Programming course would give an overview of GDB and Valgrind, and tease the class with GProf. It'd even warn us about the dangers of debugging hypnosis. But that was all the extent of formal debugging I got. The rest was on the job or during projects.
but 90s fashion is all the rage these days!
I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.
This should exist and the class should study openssl.
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)
Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)
Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)
CS103 Methodologies: Advanced Hack at it ‘till it Works
CS103 History: Fashion, Buzzwords and Reinvention
CS104 AI teaches Software Architecture (CS103 prerequisite)
Systems Engineering 101/201/301/401: How to design a computer system to be reliable
Security Engineering 101/201/301/401: How security flaws happen and how to prevent them
Conway's Law 101/201: Why the quality of the software you write is less important than your org chart
The Real DevOps 101/201/301: Why and how to simultaneously deliver software faster, with higher quality, and fewer bugs
Old And Busted 101/201: The antiquated patterns developers still use, why they're crap, what to use instead
Thinking Outside the Box 101: Stupid modern designs and why older ones are better
New Technology 101: The new designs that are actually superior and why
Project Management 101/201/301: History of project management trends, and how to manage any kind of work
Managing for Engineers 101/201/301: Why and how to stop trying to do everything, empowering your staff, data-driven continuous improvement
Quality Control 101/201: Improving and maintaining quality
Legal Bullshit 101/201: When you are legally responsible and how not to step in it

Is this a good place to complain about tools with long option names that only accept a single dash? I'm thinking of you, `java -jar`.
I think OOP became popular because it feels profound when you first grasp it. There is that euphoric moment when all the abstractions suddenly interlock, when inheritance, polymorphism, and encapsulation seem to dance together in perfect logic. It feels like you have entered a secret order of thinkers who understand something hidden. Each design pattern becomes a small enlightenment, a moment of realization that the system is clever in ways that ordinary code is not.
But if you step back far enough, the brilliance starts to look like ornament. Many of these patterns exist only to patch over the cracks in the paradigm itself. OOP is not a natural way of thinking, but a habit of thinking that bends reality into classes and hierarchies whether or not they belong there. It is not that OOP is wrong, but that it makes you mistake complexity for depth.
Then you encounter functional programming, and the same transformation begins again. It feels mind expanding at first, with the purity of immutable data, the beauty of composability, and the comfort of mathematical certainty. You trade one set of rituals for another: monads instead of patterns, recursion instead of loops, composition instead of inheritance. You feel that familiar rush of clarity, the sense that you have seen through the surface and reached the essence.
But this time the shift cuts deeper. The difference between the two paradigms is not just structural but philosophical. OOP organizes the world by binding behavior to state. A method belongs to an object, and that object carries with it an evolving identity. Once a method mutates state, it becomes tied to that state and to everything else that mutates it. The entire program becomes a web of hidden dependencies where touching one corner ripples through the whole. Over time you code yourself into a wall. Refactoring stops being a creative act and turns into damage control.
Functional programming severs that chain. It refuses to bind behavior to mutable state. Statelessness is its quiet revolution. It means that a function’s meaning depends only on its inputs and outputs. Nothing else. Such a function is predictable, transparent, and portable. It can be lifted out of one context and placed into another without consequence. The function becomes the fundamental atom of computation, the smallest truly modular unit in existence.
That changes everything. In functional programming, you stop thinking in terms of objects with responsibilities and start thinking in terms of transformations that can be freely composed. The program stops feeling like a fortress of interlocking rooms and begins to feel like a box of Lego bricks. Each function is a block, self-contained, perfectly shaped, designed to fit with others in infinitely many ways. You do not construct monoliths; you compose arrangements. When you need to change something, you do not tear down the wall. You simply reassemble the bricks into new forms.
This is the heart of functional nirvana: the dream of a codebase that can be reorganized endlessly without decay. Where every part is both independent and harmonious, where change feels like play instead of repair. Most programmers spend their careers trying to reach that state, that perfect organization where everything fits together, but OOP leads them into walls that cannot move. Functional programming leads them into open space, where everything can move.
Reality will always be mutable, but the beauty of functional programming is that it isolates that mutability at the edges. The pure core remains untouched, composed of functions that never lie and never change. Inside that core, every function is both a truth and a tool, as interchangeable as Lego bricks and as stable as mathematics.
So when we ask which paradigm handles complexity better, the answer becomes clear. OOP hides complexity behind walls. Functional programming dissolves it into parts so small and transparent that complexity itself becomes optional. The goal is not purity for its own sake, but freedom; the freedom to recompose, reorganize, and rethink without fear of collapse. That is the real enlightenment: when your code stops feeling like a structure you maintain and starts feeling like a universe you can endlessly reshape.
ps: closures are worth thinking about, we use them without even thinking about it
In essence OOP is just, "hey, if you have a struct and a bunch of operations that operate on that struct, let's put the name of the struct and a dot in front of the names of those operations, and then you don't need to pass the struct itself as an argument."
It beats me how either the high priests or its detractors get so worked up about it, even with the add-ons like inheritance, polymorphism, or patterns. (Which of course also exist in a more mathematically clean way in functional languages.)
Of course we know today that composition is better than inheritance, plain data structs are enough for most cases, and "parse, don't validate". But did people know that in the 1990s?
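In code, the whole idea fits in a few lines; a minimal sketch, with Point as a made-up example type:

    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

        def scaled(self, k):            # method: the struct is the implicit first argument
            return Point(self.x * k, self.y * k)

    def scaled(p, k):                   # the same operation as a free function
        return Point(p.x * k, p.y * k)

    p = Point(1.0, 2.0)
    assert p.scaled(3) == scaled(p, 3)  # identical behaviour, different spelling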
When state is mutable, every method that touches it becomes coupled to every other method that touches it. The object stops being a collection of independent behaviors and turns into a shared ecosystem of side effects. Once you mutate state, all the code that relies on that state is now bound together. The object becomes a single, indivisible unit. You cannot take one method and move it elsewhere without dragging the rest of its world along with it.
Functional programming avoids that trap. Functions are isolated. They take input and return output. They don’t secretly reach into a shared pile of state that everything else depends on. That separation is not aesthetic, it is structural. It’s what makes functions genuinely modular. You can pull them out, test them, recombine them, and nothing else breaks.
    # OOP version
    class Counter:
        def __init__(self):
            self.value = 0

        def increment(self, n):
            self.value += n

        def double(self):
            self.value *= 2

    c = Counter()
    c.increment(5)
    c.double()
    print(c.value)

Here, every method is bound to self.value. Change how one works and you risk breaking the others. They share a hidden dependency on mutable state. Now compare that to the functional version:
    def increment(value, n):
        return value + n

    def double(value):
        return value * 2

    increment_and_double = lambda x: double(increment(x, 5))
    print(increment_and_double(0))

In this version, increment and double are completely independent. You can test them, reuse them, and combine them however you like. They have no shared state, no implicit dependency, no hidden linkage.

People often think OOP and FP are complementary styles. They are not. They are oppositional at the core. OOP is built on mutation and shared context. FP is built on immutability and isolation. One binds everything together, the other separates everything cleanly.
Mutation is what breaks modularity. Every time you let a method change shared state, you weave a thread that ties the system tighter. Over time those threads form knots, and those knots are what make change painful. OOP is built around getters and setters, around mutating values inside hidden containers. That’s not structure. It’s coupling disguised as design.
Functional programming escapes that. It separates state from behavior and turns change into a controlled flow. It makes logic transparent and free. It’s not just another way to code. It’s the only way to make code truly modular.
But a lot of the time, yes, it makes sense to group a set of related methods and state.
You say this is not a natural way of thinking, but I strongly disagree; it lines up perfectly with how I think. You are you, the car dealership is a dealership. You buy a car from the car dealership: the dealership gets money, you lose money, the dealership loses a car and you gain a car. I want these states reflected in the objects they belong to, not passed around globally and tracked.
Or if I am writing an API library, yes, I very much want to 1. group all my calls together in a class, and 2. keep track of some state, like auth tokens, expirations, configuration for the HTTP client, etc. So you can just do api.login, api.likeX, etc.
Moreover, most methods you'd write in a large project are so limited in scope to their type and purpose that this idea of some great modularity is nonsense. It's not as if you can have a single delete function that works on deleting users, images from your S3, etc. You'd end up writing a bunch of functions like deleteUser(user), createUser(user), deleteImage(image), and wow, wouldn't it be great if we could just group these functions together and just do user.delete, user.create? We could even define an interface like Crudable and implement it differently based on what we're deleting. Wow.
I want to reuse the logic of increment(n) inside another object called childHeight to increment height values.
Can I import the method and reuse it inside another object? No. I can’t.
But if increment were a pure function, like increment(c, n), then I could move it anywhere.
That is what I mean by modularity. For oop lack of modularity is fundamental to its design. The grouping of methods around mutating state breaks modularity.
You’re talking about a semantic issue. Grouping methods by meaning. I’m talking about a logistical issue where the grouping by semantics cannot be broken even though the logic is the same. Incrementing height and incrementing ids are identical in logic even though the semantics are divergent.
That is the problem with OOP.
    class Number {
        static int inc(int n) {
            return n + 1;
        }
    }

    import static Number.inc;

    class Tracker {
        Integer height = 0;
        void incHeight() {
            this.height = inc(this.height);
        }
    }

Or

    class Tracker extends Number {
        Integer height = 0;
        void incHeight() {
            this.height = this.inc(this.height);
        }
    }

Or

    interface Number {
        default int inc(int n) {
            return n + 1;
        }
    }

    class Tracker implements Number {
        Integer height = 0;
        void incHeight() {
            this.height = this.inc(this.height);
        }
    }

Or my pick:

    class Number { // or extend the Integer class and add your methods
        int num = 0;
        void inc() {
            this.num += 1;
        }
    }

    class Tracker { Number trackingId; }
    class Building { Number height; }

    t = Tracker()
    t.trackingId.inc()
    b = Building()
    b.height.inc()
All OOP does is really give you a way to group state and methods, how you use it is up to you. There is a reason almost all big software is written this way and not in Lisp.
With functional you don’t rewrite. You recompose what you already have.
It means OOP is not modular. You didn't create modules that can be reused. Nothing could be reused, so you had to reconfigure everything.
It enables map, filter, stream, reduce, groupby, etc. on lists, sets, and so on,
or the Collections class, which gives sort, min, max, replaceAll, etc.
No, it's about creating something which does something useful and is easy to maintain. Plumbers have great ideas and approaches, but you just want plumbing which works and can be fixed.
It's time developers realised they are plumbers not **** artists.
[HN reduced my expletive to just four asterisks which seems a bit reductionist]
If we're plumbers, why all the memory leaks?
Or... are we also bad plumbers?
- COBOL on punch cards
- RPG II/III
- PDP/Vax Shared Memory Modules
- Hierarchical Data File Storage
- Recursive Expression Evaluator
- Batch Processing large datasets
- Compressed dates and numbers
So many of these teach you structural complexity that you’d never learn in today’s world.
You had to carefully think through how your code worked at a molecular level always banging up against a memory wall.
**************************************
* MEDICAL INSURANCE CLAIMS HIERARCHICAL FILE - CIRCA 1982 *
* RECORD TYPES: H=HOSPITAL(MASTER) P=PATIENT D=DIAGNOSIS C=CLAIM DETAIL *
**************************************
H001MEMORIAL GENERAL HOSPITAL 19820415NEW YORK NY10001212555010001520000
P001001JOHNSON ROBERT M19450312001234567819820102BCBS
D001001001HYPERTENSION 401.9 19820102
D001001002DIABETES TYPE 2 250.00 19820102
C00100100119820102OV0992001200120
C00100100219820102LAB8100305003050
C00100100319820102RX1150002500250
P001002SMITH MARY F19580624002345678919820115AETNA
D001002001PNEUMONIA 486 19820115
C00100200119820115ER0992505002500
C00100200219820115XR7100108001080
C00100200319820115RX1150012501250
That batch program was modular and would run several "passes" over the file (sequentially) to do varying types of work (create new data, create reports). These programs could also change a record if they needed to. A claims record might have a status indicator like "submitted" (S), "billed" (B), "denied" (D), "paid" (P).
Everyone that writes code today works on the shoulders of such systems, and some of these systems still exist (social security, hotel reservations, flight scheduling, banking, insurance), all originally designed by IBM. Most of these have Java and DB2 layers on top of them now, but the underlying data is in many cases still there.
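For a feel of what one of those sequential "passes" looked like, here is a minimal modern sketch in Python; the record-type letter in column 1 follows the sample layout above, but the position of the claim status byte is assumed purely for illustration:

    from collections import Counter

    def claims_status_pass(path, status_col=30):
        """One sequential pass: tally claim-detail (C) records by status indicator."""
        counts = Counter()
        with open(path, encoding="ascii") as f:
            for line in f:
                if line.startswith("C"):                     # claim detail record
                    counts[line[status_col:status_col + 1]] += 1  # assumed status column
        return counts

    # e.g. claims_status_pass("claims.dat") might give Counter({'S': 5, 'B': 2, 'P': 2})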
The modularization came about because of the 32mb that had to be split up to handle varying functionality (procedural BASIC or C). There would be a MAIN module, then any number of sub-modules, and you'd describe how they would swap in and out as needed. (See the PDP-11/RSTS-E Task Builder.)
Relational databases were magic when we started using them in the early nineties. NOSql databases are also magic. Graph databases are pure wizardry.
My large state university still has the same core required classes as it did 25 years ago. I don't think CS programs can veer too far away from teaching core computer science without losing accreditation.
For decades, the academia mafia, through impenetrable jargon and intimidating equations, have successfully prevented the masses from adopting this beautiful paradigm of computation. That changes now. Join us to learn why monads really are monoids in the category of endofunctors (oh my! sorry about that).
[0] Insert your favourite natural language
- telling clients that the proof-of-concept is non-conclusive so it's either bag it or try something different
- spending innovation tokens in something else than a new frontend framework and/or backend language
- understanding that project management methods are tools (not rites), and if your daily standup is 45 minutes then there's a problem

And always reimplement Perl in NodeJS. For Internet points.
CS is the study of computation, SE is the study of building computer programs.
Those overlap in the same way physics and chemistry do. Of course the two overlap, and chemists are also exposed to classical and quantum physics and know about Dirac spaces or the Born-Oppenheimer equations. But the bulk and core of a chemistry curriculum will involve few of these courses, and with a focus on what's relevant to the chemist, e.g. understanding how quantum physics makes water appear transparent in a glass but blue in a lake or deep pool.
The same goes for CS and SE. Of course they are related, but CS is much more focused on the theoretical and mathematical parts of computing, not the practical side of building systems.
One wants to know what can be computed and how and with what properties. The other wants to know how to build computer programs, but does not need to understand and be intimate with the mathematics of type inference or Hoare logic.
If you want to know how to build computer programs, then learn the type system of your chosen language, and learn how to reason about the behavior of sequences, loops, and conditionals—even if you do it informally or with small-step operational semantics instead of Hoare logic, and even if your language doesn't have type inference. Don't listen to the comforting lies of "Software Engineering" promising easy shortcuts. There is no royal road to Geometry, and there is no royal road to Google. Git gud.
But it is also true that there is a great deal that you could learn about computer science that you do not need to write working software, fast. Sequential search is often fast enough. Simple hash tables are usually better than fancy balanced trees. You will probably never use a computer that uses one's complement or the network stack the OSI model describes. If you have an array to sort, you should probably use a sorting function from the system library and definitely not implement bubble sort from scratch. Or even Quicksort. You can program in Erlang or Java for decades without having to understand how the garbage collector works.
There are some good posts on the blog that do a good job of explaining: https://prog21.dadgum.com/177.html https://prog21.dadgum.com/87.html
Software engineering is not an ideology, but the application of engineering practices to building computer programs, the same way civil engineering is the application of engineering practices to building bridges.
Your statement is odd: software engineering curricula do include theoretical and computational courses, but ultimately those are a limited part and not the focus of the curriculum.
In the same way, CS curricula do include a few engineering and application-focused exams, but again, they are not the focus.
It's absolutely fine for the two curricula to be different and they are indeed different in most of Europe.
E.g. at the university of Pisa the CS curriculum (obviously speaking about masters, arguing about bachelors is partially irrelevant, you just can't get in enough depth of any topic) has exams like parallel computing, category theory, models of computation, compilers and interpreters.
But the software engineering curriculum has: mobile and physical systems, machine learning, distributed computing, business process modeling, IT risk assessment, IT infrastructures, peer to peer systems, etc.
Of course many exams are shared (albeit they have slightly different focuses) such as: randomized algorithms, competitive programming, advanced programming and you can likely choose one of the other courses as your optionals.
But the focus is ultimately different. One focuses on the theory behind computation, one focuses on the practical aspect.
Or look at who's actually executing successfully on the practical aspect of building software. It isn't people who got a master's degree in IT risk assessment and business process modeling.
People in the tech industry seem to have no idea how the systems in the wild work. Enterprise Java runs the backbone of operations for all large business organisations such as banks. It is just as grounded as MS Office is. It is object-oriented software that is running the bulk of production environments of the world. Who is going to maintain these systems for the next few decades?
And in reality, there is nothing wrong with Java or object orientation. It has the best battle-tested and rich ecosystem for building enterprise systems. It mirrors the business entities and a natural hierarchy and evolution of things. It has a vast pool of skilled resources and is easy to maintain. Python is still a baby when it comes to operational readiness and integrations. You might get excited about Jupyter cells and a REPL, but that is all dev-play, not production.
The banks' real "product" is trust. You will work an entire month for a "bank transfer" (something you can't even hold, let alone eat or burn) because you believe your landlord will similarly accept a "bank transfer" in exchange for your rent (or, if you have a mortgage, you work an entire month because you believe this means you will be able to continue living in your house unchallenged). This has absolutely nothing to do with what programming languages or paradigms they have in place.
The parent is correct; I've been doing this my entire (profitable to the absolute limit) career and will most probably retire doing the same. You clearly seem to lack any expertise in the field discussed.
This is rather anti-recommendation. At this point I'm expecting from a bank only to reliably login, preview balance and transaction history, receive and send bank transfers... and they oftentimes fail at this basic feature set. I don't even need credit or interest rates from them.
Notice the prerequisite to unlearning something is learning it first. I don't think anyone proposes that the concept of an object is useless.
When I code in C, in the end, I usually miss the syntax for defining "objects/classes" (structs with functions and access controls), the syntax/notation that encapsulates/binds/groups the related state and its functions/API to define some specific concept/model == custom data type.
Of course OOP can be taken to extreme complexity and then lose its usefulness.
The type bike shedding that seems to inevitably occur in typescript discussions has completely turned me off of it.
Now that I'm self-employed, I just use JSDoc comments and an IDE that warns me when I break the definitions.
Much faster iteration, much easier to debug, and no wasted meetings moralizing over the use of `any`.
Methods are functions, with an implicit argument usually called "self". Unless they are static, in which case they are just regular functions. Classes are data structures, abstract methods are function pointers, and inheritance is adding data at the end of an existing data structure. In fact, inheritance is like a special case of composition.
Those who oppose object-oriented programming the most are typically the functional programming guys. But what is a function variable if not an object with a single abstract method? Add attributes and you have a closure.
It will all end up as machine code in the end, and understanding how all these fancy features end up on the bare metal help understanding how seemingly different concepts relate.
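A small sketch of that duality, with made-up Counter names:

    class CounterObj:                 # an object with a single method
        def __init__(self):
            self.n = 0
        def __call__(self):
            self.n += 1
            return self.n

    def make_counter():               # a closure capturing the same state
        n = 0
        def bump():
            nonlocal n
            n += 1
            return n
        return bump

    a, b = CounterObj(), make_counter()
    assert (a(), a(), a()) == (b(), b(), b()) == (1, 2, 3)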
The venerable master Qc Na was walking with his student, Anton. Hoping to prompt the master into a discussion, Anton said "Master, I have heard that objects are a very good thing - is this true?" Qc Na looked pityingly at his student and replied, "Foolish pupil - objects are merely a poor man's closures."
Chastised, Anton took his leave from his master and returned to his cell, intent on studying closures. He carefully read the entire "Lambda: The Ultimate..." series of papers and its cousins, and implemented a small Scheme interpreter with a closure-based object system. He learned much, and looked forward to informing his master of his progress.
On his next walk with Qc Na, Anton attempted to impress his master by saying "Master, I have diligently studied the matter, and now understand that objects are truly a poor man's closures." Qc Na responded by hitting Anton with his stick, saying "When will you learn? Closures are a poor man's object." At that moment, Anton became enlightened.
For those who don't know, Guy L. Steele (Anton's interlocutor here) is a co-designer, with Gerald Sussman, of the Scheme programming language, the author of the "Lambda: The Ultimate..." series of papers, and the second biggest contributor to the design of Java.
I've seen the entities you're describing from the inside and they resemble nothing natural besides perhaps a tumor. Hopefully we can just dispense with them rather than shackle the next generation with the burden of maintaining them.
But I'm not a data-oriented evangelist. Part of that paradigm is understanding the size and predictability of your data, and not everything in games meets this; input is the most obvious example. Knowing when and where to use such techniques is arguably what separates an engineer from a programmer, but schools can go a tad overboard focusing on one technique.
Unlearning can also help you appreciate OOP more. I wouldn't have dug into C++ vtables without the goal of understanding why polymorphism can be so slow. But I can also appreciate how clever the implementation is to get that far to begin with.
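For anyone curious, here is roughly the shape of what virtual dispatch compiles down to; this is a hand-written approximation in C, not actual compiler output, and real ABIs differ in the details:

    #include <stdio.h>

    typedef struct Animal Animal;

    typedef struct {                       /* one vtable per class */
        void (*speak)(const Animal *self);
    } AnimalVTable;

    struct Animal {
        const AnimalVTable *vptr;          /* one hidden pointer per object */
        /* ... object fields ... */
    };

    static void dog_speak(const Animal *self) { (void)self; puts("woof"); }
    static const AnimalVTable dog_vtable = { dog_speak };

    int main(void) {
        Animal dog = { &dog_vtable };
        /* A virtual call is: load vptr, load the slot, indirect call.
           The extra loads and the hard-to-predict call target (which
           also blocks inlining) are where the cost comes from. */
        dog.vptr->speak(&dog);
        return 0;
    }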
- a basic computer science course, teaching how to be at home in a FLOSS desktop system
- an intermediate course teaching how to properly automate this environment, from scripting to classic development
- a basic course in networking and system management to reach the level of being able to be a dummy sysadmin at home
All of these must be preparatory to CS, because without them it's like studying literature before knowing the basics of the language it's written in. So far it's assumed that students pick this up on their own, but experience shows that this is not the case.
PSYC 2230: Measuring Things - Gathering evidence, overcoming bias, and comparative analysis.
Most developers cannot measure things at any level, in any regard.
CSCI 5540: Transmission Control - Comparative analysis of existing data transfer protocols, including practical application, as well as authoring new and original protocols
Most developers don’t know how any transmission protocols work internally, except possibly HTTP
CSCI 3204: Tree Traversal - Data structure analysis applied to data model hierarchies and taxonomies
I have heard from so many developers who say they spent most of their education on data structure analysis but cannot apply it to tree models in the real world at any practical level. The people in library sciences figure this out on the job, but educated software developers often don't.
Learn Unicode and UTF-8.
Unlearn the 1 char = 1 byte concept.
Not only encoding and decoding; searching and sorting are different too. The course could also cover font rendering, Unicode modifiers, and emoji. They are so common and fundamental, yet very few people understand them.
Same for font rendering: there is a reason HarfBuzz is used everywhere. Getting an 80%-working renderer is easy, but the remaining 20% can take years.
Really, "handling text correctly" should be a master's degree, and I'd sign up in a heartbeat.
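A tiny C sketch of that first unlearning step, counting bytes versus code points in a UTF-8 string (the helper below is illustrative, not a full decoder, and the counts in the comments assume this exact literal):

    #include <stdio.h>
    #include <string.h>

    /* Count code points by skipping UTF-8 continuation bytes (10xxxxxx). */
    static size_t utf8_codepoints(const char *s) {
        size_t n = 0;
        for (; *s; s++)
            if (((unsigned char)*s & 0xC0) != 0x80)
                n++;
        return n;
    }

    int main(void) {
        const char *s = "caf\xC3\xA9 \xF0\x9F\x98\x80";   /* "café 😀" */
        printf("bytes:       %zu\n", strlen(s));           /* 10 */
        printf("code points: %zu\n", utf8_codepoints(s));  /*  6 */
        /* And code points still aren't "characters": é can be e plus a
           combining accent, emoji can be multi-code-point sequences. */
        return 0;
    }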
It's always frustrated me how little consideration AI courses give to the existing study of knowledge.
Also check out https://www.youtube.com/@BetterSoftwareConference
Even for the most passionate of developers, you're gonna have a hard time getting someone to commit even a fraction of that time towards educational content.
If you think there are any specific videos from the Handmade Hero series that are really worth watching, you should recommend them directly. But pointing someone to 1300 hours of content is an absurd suggestion.
There's a sizable community around Handmade Hero which can point you to more specific topics.
Self-taught Python, and PHP before that. The course I always wanted might be named, “Systems: static and flexible abstractions”
That name is an expression of the frustration I had trying to learn frameworks (I'd forgotten I attempted to learn Drupal until I read this question), where I found it nearly impossible to ask questions on SO (a lot of 'stay in your lane' remarks).
And of the frustration I feel today when I try to reconcile my understanding of the Python 'systems' I created as CLI scripts with the baffling brain fog I experience when trying to understand how someone else's code works.
I remember when I was 10 or 12 or so, hacking on my IBM 8086 in BASIC, I accidentally "invented" bubble sort. In fact, mine was extra slow and inefficient because both my outer and inner loops went from 1 to N and there was no early exit if no swaps were made. A true O(N^2) algorithm. I didn't know what O(N^2) meant, but I had some understanding that things quickly got slower.
Then later in CS101 I learned about big-O and all the theory around sorting, and it immediately clicked because I had a deep understanding of something I had experienced and could then tie it to real theory. The other way around - learning the theory before the experience - wouldn't have worked as well.
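For the curious, the gap between that version and the textbook one is just a shrinking inner range plus an early exit; a quick C sketch (function names invented):

    #include <stdbool.h>
    #include <stddef.h>

    /* The "reinvented" version: both loops always run the full range. */
    void bubble_sort_naive(int *a, size_t n) {
        for (size_t i = 0; i < n; i++)
            for (size_t j = 0; j + 1 < n; j++)
                if (a[j] > a[j + 1]) {
                    int tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                }
    }

    /* Textbook bubble sort: shrink the range each pass and stop as soon
       as a pass makes no swaps. Still O(N^2) worst case, but O(N) on
       already-sorted input. */
    void bubble_sort(int *a, size_t n) {
        bool swapped = true;
        while (swapped && n > 1) {
            swapped = false;
            for (size_t j = 0; j + 1 < n; j++)
                if (a[j] > a[j + 1]) {
                    int tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                    swapped = true;
                }
            n--;    /* the largest element has bubbled to the end */
        }
    }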
To tie it to your comment: you should have deep experience with your OS of choice, and then when you go to school you learn why things are the way they are.
When I say this I often get accused of gatekeeping, but I don't view it that way. I look at it as other types of majors that have existed longer than CS. I often make an analogy to music majors. I can't enroll as a freshman and say I'm going to be a music major without ever having played an instrument. People get accepted to a music department after they demonstrate the ability (usually through the equivalent of hacking while they were kids), and in their music classes they learn theory and how to play different instruments (just like learning different OSes or languages).
I kind of feel that CS should be the same way, you should show up to CS101 knowing how to do things from deep experience. You may not know any of the whys or theory, that's fine, but you should have experience in doing.
To tie it back to the parent: you should come to CS knowing how to run Linux, maybe because you copied configurations or scripts from the dark corners of the internet. And then the CS classes should be around why it's all that way. E.g., you know that to schedule something you use cron; and CS would be a discussion around how generic OSes need a way to schedule tasks.
Anyway, when I made the comment, I was thinking it should be an elective and intended for people who either aren’t that familiar with Linux or want to become even more comfortable with it. There are certainly plenty of such students in my experience, myself included when I was in college.
Also just to be clear, this shouldn’t be just about “being able to run Linux at home” level of material, but things like writing non trivial applications using Linux subsystems and being able to troubleshoot them.
Think about decompiling Java apps that don't even have their original version control anymore, changing them, and deploying them again.
Yep, I've been there.
- Learn how to design a graphical user interface that empowers the users instead of coercing or manipulating them.
- Prerequisites: Any course that used the term "funnel", "user journey", "A/B tests" or "engagement".
- Includes a historical bonus part about "Native UI Toolkits" (not part of the exam)
- How to ignore the latest platform/library/thing everyone is talking about on HN
CSCI 3120: Novelty Driven Development
- How to stay interested in your job by using the latest platform/library/thing everyone is talking about on HN even if it isn't actually necessary
NB: CSCI 3120 cannot be taken at the same time as CSCI 3240
PSYC 4870: Meeting Techniques
- mentioning problems before, during or after meetings - different techniques for different manager types
- small talk conventions in different cultures
- camera on or camera off - the modern split in corporate habits
PSYC 5630: Accepting Organisational Friction
- how to motivate yourself to write documentation that no-one will ever read
- managers are people too - understanding what makes someone think they want to be a manager
- Everything Will Take A Long Time And No-one Will Remember What Got Decided - working in large organisations
- Cake and Stare - how to handle a leaving do
If anything, we need the opposite of this class. Learn OO well and create tight apps with a small runtime footprint, well-isolated code boundaries, and clean interfaces.
I wish software engineering degrees were more common so I could've studied computer science like I intended to.
Architectures (centralized, distributed), implementations (diffs, patches, full state, etc.), branches, merges, etc.
All of which is to say: yes, this should totally be a course!
I actually did a PhD on the topic of software engineering, and I had to learn a lot after I started practicing what I preached. I realized this was the case while working on my thesis, and it was a big reason for me to get some hands-on experience.
Basically, academics tend to have somewhat naive notions about software engineering, which usually manifest in waffling about waterfall-style development or emphasizing formal methods, which, in 30 years of practice, I've rarely encountered in the wild.
That doesn't mean teaching that is a waste of time. But it does mean that there's more to software engineering than is taught in universities. You can't really learn most of that from someone that hasn't been exposed to real life software engineering. Most academics never leave university so they are not necessarily that up to speed with modern practices.
Teaching in most engineering disciplines boils down to a lot of theory followed by apprenticeships. The theory doesn't evolve nearly as fast as practice and tools. That's why it's fine using somewhat outdated or academic languages and tools in university. I don't think that's unique to computer science either.
Studying computer science gives you a theoretical basis and the ability to learn. Which is nice and relevant. But I actually know a lot of software engineers that studied completely different topics (theoretical physics, philosophy, mathematics, geology, etc.) that do fine without it. A few years of working can compensate for that. Having an academic background prepares people to wrap their heads around complex new stuff. Getting a degree basically means "you have a working brain". The most important skill you learn in university is using your brain.
I don't care what language people use in university. But I'd prefer people to have been exposed to more than just one language, and to know that there are multiple ways to do the same thing. I did logic, functional, and imperative programming in my first year. OO was kind of hot and newish but that was a second-year topic. I later studied aspect-oriented programming as well (there are several flavors of that) and a few offshoots of object orientation (prototype-based, role-based). Many JavaScript programmers may have never heard of the language Self. But that's one of the languages that inspired Brendan Eich; it's a prototype-based OO language (no classes, just prototype objects). That's the difference between a good engineer and one with a decent computer science background. You don't need to know that to use JavaScript. But it helps.
Being able to navigate not just a codebase but bugs/tickets attached to it, discussions in documents, old wiki pages that half work, extracting context clues from versioning history, tracing people by the team they worked on at the time...digital detective work is a serious part of the job sometimes.
That company develops most of the court software used in the US.
And it's very unlikely that they have improved their practices in the three years since I had to leave due to burnout.
Each class would "just" study a hackernews thread in depth.
Learn the group etiquette during meetings for task assignment and interacting with supervisor authority.
Emphasizes being respectful and polite, with lessons on manners like handshakes, greetings, and helping others.
Cotillion classes often culminate in a formal backlog grooming and lessons-learned where students display their learned skills.
First, have students make a modern UI for a sufficiently complex application.
Then have them improve it to be usable on a system with low memory, a slow CPU, high network latency, and a slow connection...
Although, to be fair, while it was helpful coding practice, I'm not a game designer, so the game itself was too terrible to play.
In first-year uni, though, I spent too many hours in the lab, competing with friends, recreating arcade games on the PC. Skipping the game-design part was helpful. To be fair, by then we had a glorious 640K of RAM. Some Assembly required.