
Roc Camera

https://roc.camera/
124•martialg•3h ago•116 comments

Betty White's shoulder bag is a time capsule of World War II (2023)

https://americanhistory.si.edu/explore/stories/betty-white-world-war-ii
182•thunderbong•1w ago•11 comments

Claude Memory

https://www.anthropic.com/news/memory
426•doppp•13h ago•245 comments

/dev/null is an ACID compliant database

https://jyu.dev/blog/why-dev-null-is-an-acid-compliant-database/
278•swills•9h ago•107 comments

Counter-Strike's player economy is in a multi-billion dollar freefall

https://www.polygon.com/counter-strike-cs-player-economy-multi-billion-dollar-freefall/
79•perihelions•6h ago•103 comments

Fast-DLLM: Training-Free Acceleration of Diffusion LLM

https://arxiv.org/abs/2505.22618
17•nathan-barry•4h ago•0 comments

Cheap DIY solar fence design

https://joeyh.name/blog/entry/cheap_DIY_solar_fence_design/
86•kamaraju•1w ago•33 comments

How memory maps (mmap) deliver faster file access in Go

https://info.varnish-software.com/blog/how-memory-maps-mmap-deliver-25x-faster-file-access-in-go
94•ingve•8h ago•68 comments

Computer science courses that don't exist, but should (2015)

https://prog21.dadgum.com/210.html
153•wonger_•4h ago•105 comments

Benchmarking Postgres 17 vs. 18

https://planetscale.com/blog/benchmarking-postgres-17-vs-18
67•bddicken•1w ago•7 comments

JupyterGIS breaks through to the next level

https://eo4society.esa.int/2025/10/16/jupytergis-breaks-through-to-the-next-level/
28•arjxn-py•2h ago•7 comments

Can “second life” EV batteries work as grid-scale energy storage?

https://www.volts.wtf/p/can-second-life-ev-batteries-work
136•davidw•12h ago•150 comments

When is it better to think without words?

https://www.henrikkarlsson.xyz/p/wordless-thought
117•Curiositry•9h ago•49 comments

React Flow, open source libraries for node-based UIs with React or Svelte

https://github.com/xyflow/xyflow
108•mountainview•7h ago•17 comments

PyTorch Monarch

https://pytorch.org/blog/introducing-pytorch-monarch/
332•jarbus•20h ago•41 comments

Summary of the Amazon DynamoDB Service Disruption in US-East-1 Region

https://aws.amazon.com/message/101925/
466•meetpateltech•1d ago•144 comments

Killing Charles Dickens (2023)

https://www.newyorker.com/magazine/2023/07/10/on-killing-charles-dickens
10•bryanrasmussen•6d ago•1 comments

Binmoji: A 64-bit emoji encoding

https://github.com/jb55/binmoji
13•jb55•1w ago•2 comments

I spent a year making an ASN.1 compiler in D

https://bradley.chatha.dev/blog/dlang-propaganda/asn1-compiler-in-d/
262•BradleyChatha•18h ago•181 comments

Automating Algorithm Discovery: A Case Study in MoE Load Balancing

https://adrs-ucb.notion.site/moe-load-balancing
111•melissapan•8h ago•54 comments

Date bug in Rust-based coreutils affects Ubuntu 25.10 automatic updates

https://lwn.net/Articles/1043103/
138•blueflow•10h ago•132 comments

Introduction to the concept of likelihood and its applications (2018)

https://journals.sagepub.com/doi/10.1177/2515245917744314
34•sebg•7h ago•3 comments

OpenAI acquires Sky.app

https://openai.com/index/openai-acquires-software-applications-incorporated
163•meetpateltech•13h ago•108 comments

Apple loses UK App Store monopoly case, penalty might near $2B

https://9to5mac.com/2025/10/23/apple-loses-uk-app-store-monopoly-case-penalty-might-near-2-billion/
241•thelastgallon•8h ago•231 comments

US probes Waymo robotaxis over school bus safety

https://www.yahoo.com/news/articles/us-investigates-waymo-robotaxis-over-102015308.html
89•gmays•18h ago•145 comments

FocusTube: A Chrome extension that hides YouTube Shorts

https://github.com/CaptainYouz/FocusTube
191•youz•9h ago•145 comments

Kaitai Struct: declarative binary format parsing language

https://kaitai.io/
111•djoldman•1w ago•37 comments

Antislop: A framework for eliminating repetitive patterns in language models

https://arxiv.org/abs/2510.15061
98•Der_Einzige•14h ago•95 comments

Zram Performance Analysis

https://notes.xeome.dev/notes/Zram
74•enz•10h ago•24 comments

Color-changing organogel stretches 46 times its size and self-heals

https://phys.org/news/2025-09-organogel-size.html
5•PaulHoule•1w ago•0 comments

Computer science courses that don't exist, but should (2015)

https://prog21.dadgum.com/210.html
153•wonger_•4h ago

Comments

fred_is_fred•3h ago
The Classical Software Studies course would be quite useful. Go write a game in 64KB of RAM in BASIC. It would really stretch your creativity and coding skills.
jasonthorsness•3h ago
Agreed, it would be very interesting to see some of the care taken for resource management that is lost now because every machine has “enough” RAM and cycles…
andoando•3h ago
There are generally courses on embedded systems.
corysama•3h ago
I’m e long thought the Gameboy Advance would make a great educational platform. Literally every aspect of the hardware is memory mapped. Just stuff values into structs at hard-coded addresses and stuff happens. No need for any OS or any API at all.
lmm•3h ago
I think working on that kind of system would be actively harmful for most programmers. It would give them a totally unbalanced intuition for what the appropriate tradeoff between memory consumption and other attributes (maintainability, defect rate, ...) is. If anything, programmers should learn on the kind of machine that will be typical for most of their career - which probably means starting with a giant supercomputing cluster to match what's going to be in everyone's pocket in 20 years' time.
bruce511•2h ago
Ha. You call it "history". I call it "childhood". I did that years before getting to Uni :)

Although, to be fair, while it was helpful practice at coding, I'm not a game designer, so the game was too terrible to play.

First year of Uni, though, I spent too many hours in the lab, competing with friends, to recreate arcade games on the PC. Skipping the game design part was helpful. To be fair, by then we had a glorious 640K of RAM. Some Assembly required.

tdeck•1h ago
Bonus points if it fits in 38911 bytes.
epalm•3h ago
Where’s the course on managing your reaction when the client starts moving the goal posts on a project that you didn’t specify well enough (or at all), because you’re a young eager developer without any scars yet?
dfex•3h ago
some scars need to be earned
ekidd•3h ago
Back in the 90s, this was actually a sneaky part of Dartmouth's CS23 Software Engineering course. At least half your grade came from a 5-person group software project which took half a semester. The groups were chosen for you, of course.

The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.

It was a surprisingly effective course.

(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)

collingreen•3h ago
In uni we had a semester-long group project with the stronger coders as project leaders. Group leaders controlled pass/fail for the group members and vice versa. After the leaders put together project plans for their teams and everyone was supposed to start working, WHOOPSIE, reorg: all the leaders got put on random teams and new people got "promoted" into leader (of groups whose plans they didn't create). If the project didn't work at the end, the entire group failed.

I've never been taught anything more clearly than the lessons from that class.

markus_zhang•3h ago
Shit, I was thinking about exactly the same thing: a professor deliberately changing requirements in the last week to mess with the students and give them a taste of real work.

Glad that someone actually did this.

hotstickyballs•3h ago
That's the job of a (good) manager.
h4ck_th3_pl4n3t•3h ago
It's called test driven development.
mnky9800n•3h ago
I am happy to sign up for all these classes. Tbh this is what coursera or whatever should be. Not yet another machine learning set of lectures with notebooks you click run on.
pluto_modadic•3h ago
"when you need shiny tech (you don't)" (aka, when to use postgres/sqlite)
LPisGood•3h ago
It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.

Jenkins, Docker, Kubernetes, none of these sorts of things - and I don't even mean these specific technologies; nothing even in their ballpark.

hotstickyballs•3h ago
Personally it felt quite natural once you start to work on real software projects.
LPisGood•3h ago
I have never felt that way, and I’ve worked on a variety of projects at a variety of companies.

Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.

There are enough interesting things here that you wouldn't even need to make a tool-heavy, project-style software engineering course - you could legitimately make a real computer science course that studies the algorithms and patterns and things used.

bckr•3h ago
Merging strategies, conflict resolution, bisect debugging, and version control in general are very computer sciencey.

Would make a great course.
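
Bisect debugging in particular is just binary search over project history. A rough Python sketch of the algorithm that git bisect automates (the commit list and the is_bad() test here are hypothetical stand-ins for "check out and test a revision"):

  def first_bad_commit(commits, is_bad):
      # Binary search for the first commit where is_bad(commit) is True.
      # Assumes commits are ordered oldest -> newest and that once a commit
      # is bad, every later one is bad too (the git-bisect invariant).
      lo, hi = 0, len(commits) - 1   # commits[hi] is known to be bad
      while lo < hi:
          mid = (lo + hi) // 2
          if is_bad(commits[mid]):
              hi = mid               # first bad commit is at mid or earlier
          else:
              lo = mid + 1           # first bad commit is after mid
      return commits[lo]

  # Hypothetical usage: ~log2(N) checkouts/tests instead of N.
  # first_bad_commit(["a1", "b2", "c3", "d4"], lambda c: c >= "c3")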

LPisGood•2h ago
That’s not even getting into the fact that you could basically teach multiple graduate level distributed computing courses with k8s as a case study
andoando•3h ago
Well this has nothing to do with computer science so there's that
LPisGood•3h ago
I could not disagree with you more.

It would be easy for me to agree with you. I hold a graduate degree in computer science and I’m named for my contributions proofreading/correcting a graduate text about algorithms in computability theory.

I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?

andoando•3h ago
Cryptography is all math, networking is largely math and algorithms (IMO yes, this should really be replaced with information theory; just understanding Shannon's paper would have been more valuable than learning about how routers work), AI is mostly statistics (and AI as a whole I'd argue is the essence of computer science), graphics is largely math and algorithms.

Yes, I very much think a computer science degree should be as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.

There's so much computer science that isn't even covered that I'd include before courses on CI/CD.

LPisGood•2h ago
Yeah, cryptography is mostly (but certainly not all) math, but it accounts for a negligible (pun intended) portion of interesting security work.

AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field cannot properly constitute "theory".

I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.

andoando•2h ago
A lot of it is a push for practicality and catering to students' interests. IMO it's a result of a really archaic education system. Universities were originally small and meant for theoretical study, not as a de facto path for everyone to enroll in to get a job.

If it were up to me I'd get rid of statically defined 4-year programs, and/or required courses for degrees, or just degrees in general. Just offer courses and let people come learn what they want.

One of my favorite classes was a Python class that focused on building some simple games with tkinter, making a chat client, hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.

On the other hand I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, theory of programming languages, and the philosophy behind all of it that got us here.

LPisGood•2h ago
Your point is well taken and to some extent I agree, but I think you have to recognize it's not just student interest, career preparation, and practicality.

The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.

dsr_•2h ago
You've just named most of the interesting stuff.
raphman•2h ago
I'm doing this in the software engineering¹ course I teach. However:

a) Practical CI/CD requires understanding (and some practical experience) of many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience in these concepts, I spend about half of the semester teaching such devops basics instead of actual software engineering.

b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.

¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.

wpollock•2h ago
> It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.

It's hard to fit everything a student needs to know into the curriculum. Someone else posted here they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!

I was a CS professor at a local college. My solution was to ignore the CS1 and CS2 curriculum (we were not ABET accredited, so that's okay) in the second Java programming course. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this stuff, but in the end I wrote my own learning materials and didn't use a book.

The course was a victim of its success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.

If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).

corysama•3h ago
> CSCI 3300: Classical Software Studies

Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.

PaulDavisThe1st•3h ago
The history of art or philosophy spans millennia.

The effective history of computing spans a lifetime or three.

There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.

Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: CPU speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

degamad•3h ago
> art [has] very limited or zero dependence on a material substrate

This seems to fundamentally underestimate the nature of most artforms.

degamad•3h ago
> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.

citizenpaul•2h ago
It's because CS is not cared about as a true science for the most part. Nearly all of the field is focused on consolidating power and money dynamics. No one cares to make a comprehensive history since it might give your competitors an edge.
rippeltippel•1h ago
Art and Philosophy are hardly regarded as science, either. Actually, less so. Yet...
andrewflnr•3h ago
Your first sentences already suggest one comparison between the histories of computing and philosophy: the history of computing ought to be much easier. Most of it is still in living memory. Yet somehow, the philosophy people manage it while we computing people rarely bother.
markus_zhang•3h ago
I always think it is great value to have a whole range of history of X courses.

I once thought about a series of PHYS classes that focus on historical ideas and experiments. Students are supposed to replicate the experiments. They have to read book chapters and papers.

andrewflnr•2h ago
That's not that far off the standard physics course, is it? Certainly lots of labs I took were directly based on historical experiments.
vintermann•34m ago
History of physics is another history where we have been extremely dependent on the "substrate". Better instruments and capacity to analyze results, obviously, but also advances in mathematics.
jancsika•2h ago
If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)

You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.

(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)

Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.

1: https://www.youtube.com/watch?v=QQhVQ1UG6aM

wredcoll•2h ago
> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025

Why? What problem did it solve that we're suffering from in 2025?

corysama•2h ago
https://news.ycombinator.com/user?id=alankay Has not been active on Hacker News for several years now.

At 85 he has earned the peace of staying away from anything and everything on the internet.

Exoristos•2h ago
> The history of art or philosophy spans millen[n]ia.

And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.

> Computation has overwhelming dependence on the performance of its physical substrate [...].

Computation theory does not.

chongli•1h ago
> Art and philosophy have very limited or zero dependence on a material substrate

Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).

Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!

kragen•39m ago
> Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: cpu speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution

This was clearly true in 01970, but it's mostly false today.

It's still true today for LLMs and, say, photorealistic VR. But what I'm doing right now is typing ASCII text into an HTML form that I will then submit, adding my comment to a persistent database where you and others can read it later. The main differences between this and a guestbook CGI 30 years ago or maybe even a dialup BBS 40 years ago have very little to do with the performance of the physical substrate. It has more in common with the People's Computer Company's Community Memory 55 years ago (?) using teletypes and an SDS 940 than with LLMs and GPU raytracing.

Sometime around 01990 the crucial limiting factor in computer usefulness went from being the performance of the physical substrate to being the programmer's imagination. This happened earlier for some applications than for others; livestreaming videogames probably requires a computer from 02010 or later, or special-purpose hardware to handle the video data.

Screensavers and demoscene prods used to be attempts to push the limits of what that physical substrate could do. When I saw Future Crew's "Unreal", on a 50MHz(?) 80486, around 01993, I had never seen a computer display anything like that before. I couldn't believe it was even possible. XScreensaver contains a museum of screensavers from this period, which displayed things normally beyond the computer's ability. But, in 01998, my office computer was a dual-processor 200MHz Pentium Pro, and it had a screensaver that displayed fullscreen high-resolution clips from a Star Trek movie.

From then on, a computer screen could display literally anything the human eye could see, as long as it was prerendered. The dependence on the physical substrate had been severed. As Zombocom says, the only limit was your imagination. The demoscene retreated into retrocomputing and sizecoding compos, replaced by Shockwave, Flash, and HTML, which freed nontechnical users to materialize their imaginings.

Most things people do with computers today, and in particular the most important things, are things fewer people have been doing with computers in nearly the same way for 30 years, when the physical substrate was very different: 300 times slower, 300 times smaller, a much smaller network.

Except, maybe, mass emotional manipulation, doomscrolling, LLMs, mass surveillance, and streaming video.

A different reason to study the history of computing, though, is the sense in which your claim is true.

Perceptrons were investigated in the 01950s and largely abandoned after Minsky & Papert's book, and experienced some revival as "neural networks" in the 80s. In the 90s the US Postal Service deployed them to recognize handwritten addresses on snailmail envelopes. (A friend of mine who worked on the project told me that they discovered by serendipity that decreasing the learning rate over time was critical.) Dr. Dobb's hosted a programming contest for handwriting recognition; one entry used a neural network, but was disqualified for running too slowly, though it did best on the test data they had the patience to run it on. But in the early 21st century connectionist theories of AI were far outside the mainstream; they were only a matter of the history of computation. Although a friend of mine in 02005 or so explained to me how ConvNets worked and that they were the state-of-the-art OCR algorithm at the time.

Then ImageNet changed everything, and now we're writing production code with agentic LLMs.

Many things that people have tried before that didn't work at the time, limited by the physical substrate, might work now.

teiferer•15m ago
> The effective history of computing spans a lifetime or three.

That argument actually strengthens the original point: Even though it's been that short, youngsters often still don't have a clue.

eru•3h ago
Computer scientists should treat history like civil engineering or physics treat their histories. These are subjects that make objective progress.

Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.

rkagerer•3h ago
"These are subjects that make objective progress."

As opposed to ours, where we're fond of subjective regression. ;-P

wnc3141•2h ago
At least for history of economics, I think it's harder to really grasp modern economic thinking without considering the layers it's built upon, the context ideas were developed within etc...
eru•1h ago
That's probably true for macro-economics. Alas that's also the part where people disagree about whether it made objective progress.

Micro-economics is much more approachable with experiments etc.

Btw, I didn't suggest to completely disregard history. Physics and civil engineering don't completely disregard their histories, either. But they also don't engage in constant navel gazing and re-hashing like a good chunk of the philosophers do.

lmm•3h ago
Maybe he managed to work them out and understand them in the '70s, if you believe him. But he has certainly never managed to convey that understanding to anyone else. Frankly I think if you fail to explain your discovery to even a fraction of the wider community, you haven't actually discovered it.
synack•3h ago
When I was at RIT (2006ish?) there was an elective History of Computing course that started with the abacus and worked up to mainframes and networking. I think the professor retired years ago, but the course notes are still online.

https://www.cs.rit.edu/~swm/history/index.html

ponco•2h ago
I reflect on university, and one of the most interesting projects I did was an 'essay on the history of <operating system of your choice>' as part of an OS course. I chose OS X (Snow Leopard), and digging into the history gave me fantastic insights into software development, Unix, and software commercialisation. I echo your Mr Kay's sentiments entirely.
fsckboy•2h ago
>Alan Kay, my favorite curmudgeon, spent decades trying to remind us

Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon

>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.

it's Alan Kay running in circles

pathseeker•1h ago
This has been going on forever. That just means the people that "worked it out" were shitty teachers and/or bad at spreading their work.
musebox35•1h ago
Sadly this naturally happens in any field that ends up expanding due to its success. Suddenly the number of new practitioners outnumbers the number of competent educators. I think it is a fundamental human resources problem with no easy fix. Maybe LLMs will help with this, but they seem to reinforce the convergence to the mean in many cases, as those to be educated are not in a position to ask the deeper questions.
AfterHIA•1h ago
I can't concur enough. We don't teach "how to design computers and better methods to interface with them"; we keep hashing over the same stuff over and over again. It gets worse over time, and the effect is that what Engelbart called "intelligence augmenters" become "super televisions that cause you political and social angst."

How far we have fallen, but so great the reward if we could "lift ourselves up again." I have hope in people like Bret Victor and Brenda Laurel.

d_silin•3h ago
"CSCI 4020: Writing Fast Code in Slow Languages" does exist, at least in the book form. Teach algorithmic complexity theory in slowest possible language like VB or Ruby. Then demonstrate how O(N) in Ruby trumps O(N^2) in C++.
d_silin•3h ago
The book in question:

https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...

LPisGood•3h ago
Python has come a long way. It's never gonna win for something like high-frequency trading, but it will be super competitive in areas you wouldn't expect.
liqilin1567•2h ago
Optimizing at the algorithmic and architectural level rather than relying on language speed
nine_k•2h ago
One of my childhood books compared bubble sort implemented in FORTRAN and running on a Cray-1 and quicksort implemented in BASIC and running on TRS-80.

The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.

aDyslecticCrow•20m ago
We had this as a lab in a learning systems course: converting Python loops into NumPy vector manipulation (map/reduce), and then into TensorFlow operations, and measuring the speed.

It gave a good idea of how Python is even remotely useful for AI.
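
A rough sketch of what that looks like in Python/NumPy (the dot-product task and sizes are illustrative, not the actual lab):

  import time
  import numpy as np

  def dot_loop(a, b):
      # Pure-Python loop: one interpreted iteration per element.
      total = 0.0
      for x, y in zip(a, b):
          total += x * y
      return total

  def dot_numpy(a, b):
      # Vectorized: the loop runs in optimized native code inside NumPy.
      return float(np.dot(a, b))

  n = 2_000_000
  a = np.random.rand(n)
  b = np.random.rand(n)

  for fn in (dot_loop, dot_numpy):
      start = time.perf_counter()
      fn(a, b)
      print(fn.__name__, f"{time.perf_counter() - start:.3f}s")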

ggm•3h ago
ls and not cat? Does the author have no memory of the cat -v shitstorm?
abdullahkhalids•3h ago
The TFA wants the following computer science courses:

Unlearning Object-Oriented Programming: a course on specific software engineering techniques

Classical Software Studies: a course on the history of software tools

Writing Fast Code in Slow Languages: a course on specific engineering techniques

User Experience of Command Line Tools: an engineering design course

Obsessions of the Programmer Mind: course about engineering conventions and tools.

One day, the name of science will not be so besmirched.

LPisGood•3h ago
People don’t get this up in arms when there is a math course about using math tools or math history or how math is used to solve actual problems, but for some reason they do about computer science.
abdullahkhalids•2h ago
If universities offered a major in which students learned to operate telescopes of various kinds and sizes, and then called it astrophysics, people would be mad too.
LPisGood•1h ago
Astronomy isn’t about telescopes yet astronomers spend lots of time studying and doing research regarding telescopes.
owlbite•35m ago
They just label such people as Applied Mathematicians, or worse: Physicists and Engineers; and then get back to sensible business such as algebraic geometry, complex analysis and group theory.
TZubiri•3h ago
>CSCI 2100: Unlearning Object-Oriented Programming Discover how to create and use variables that aren't inside of an object hierarchy. Learn about "functions," which are like methods but more generally useful. Prerequisite: Any course that used the term "abstract base class."

This is just a common meme that often comes from ignorance, or a strawman of what OOP is.

>CSCI 4020: Writing Fast Code in Slow Languages Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.

I like this one, but see?

Python is heavily OOP; everything in Python is an object, for example.

I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.
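
For what it's worth, the "everything is an object" claim is easy to check at the REPL; a tiny illustrative snippet:

  print(isinstance(1, object))    # True: ints are objects
  print(isinstance(len, object))  # True: functions are objects
  print((42).bit_length())        # even integer literals have methods
  print(type(int) is type)        # classes are objects too (instances of type)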


kelseyfrog•3h ago
CSCI 4810: The Refusal Lab

Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.

LPisGood•3h ago
CSCI 4812: The Career Lab

Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.

andrewflnr•3h ago
Conventional university degrees already contain practical examples of the principles of both these courses in the form of cheating and the social dynamics around it.
jayd16•1h ago
Actually a lot of degrees have a relevant ethics class.
kelseyfrog•54m ago
I don't disagree that such classes exist, but I have yet to see an ethics lab specifically with the intention of practicing refusal.
tdeck•1h ago
But universities want their graduates to be employable, is the thing.
username223•3h ago
I had forgotten about prog21, and I'm impressed how he wrapped up his blog:

> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.

Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.

markus_zhang•3h ago
> CSCI 3300: Classical Software Studies Discuss and dissect historically significant products, including VisiCalc, AppleWorks, Robot Odyssey, Zork, and MacPaint. Emphases are on user interface and creativity fostered by hardware limitations.

Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.

zzo38computer•3h ago
I think it would be a good idea, especially CSCI 3300. (Learning them in a course is not the only way to learn computers and other stuff, but it is (and should be) one way to do it.)

(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)

a1studmuffin•2h ago
I really don't understand the modern hate towards OOP. From my experience over the last few decades working with large C and C++ codebases, the former turns into a big ball of mud first.
zzo38computer•1h ago
I think that OOP can be good for some things, but that does not mean that all or most programs should use OOP for all or most things. I would say that for most things it is not helpful, even though sometimes it is helpful.
omosubi•3h ago
I would add debugging as a course. Maybe they already teach this, but learning how to dive deep into finding the root cause of defects, and the various tools for doing so, would have been enormously helpful for me. Perhaps this already exists.
9dev•1h ago
Yes please. Even senior engineers apply with their debugging abilities limited to sprinkling print-and-exit statements over the code.

Do you have a moment to talk about our saviour, Lord interactive debugging?
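
A minimal sketch of the alternative, using Python's built-in debugger (the function and the call are made up for illustration):

  def mean(values):
      total = 0
      for v in values:
          total += v
      # Instead of print(total); sys.exit(), drop into the interactive debugger:
      breakpoint()  # at the pdb prompt: p total, p values, where, continue
      return total / len(values)

  if __name__ == "__main__":
      mean([3, 5, 8])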

a-dub•3h ago
> CSCI 2100: Unlearning Object-Oriented Programming

but 90s fashion is all the rage these days!

ugh123•2h ago
Not really "computer science" topics, but useful nonetheless.
zdc1•2h ago
> PSYC 4410: Obsessions of the Programmer Mind

I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.

raldi•2h ago
CSCI 4083: Developing Software Alongside Other Human Beings
gonzo41•2h ago
CSCI 2170: User Experience of Command Line Tools.

This should exist and the class should study openssl.

journal•2h ago
We can take a further step back and say that no one really knows how to do pagination, and those who do are too sick of it to teach it to others.
tomhow•2h ago
Previously:

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)

Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)

casey2•2h ago
ls should have just been for listing files; giving it 10 flags is one of the big mistakes of early UNIX design.
aryehof•1h ago
CS102 Big Balls of Mud: Data buckets, functions, modules and namespaces

CS103 Methodologies: Advanced Hack at it ‘till it Works

CS103 History: Fashion, Buzzwords and Reinvention

CS104 AI teaches Software Architecture (CS103 prerequisite)

owlbite•40m ago
Introduction to PhD study: "How hard can it be, I'm sure I could write that in a week"
0xbadcafebee•1h ago
Jesus christ Yes to all of it. Also needed:

  Systems Engineering 101/201/301/401:   How to design a computer system to be reliable
  Security Engineering 101/201/301/401:  How security flaws happen and how to prevent them
  Conway's Law 101/201:                  Why the quality of the software you write is less important than your org chart
  The Real DevOps 101/201/301:           Why and how to simultaneously deliver software faster, with higher quality, and fewer bugs
  Old And Busted 101/201:                The antiquated patterns developers still use, why they're crap, what to use instead
  Thinking Outside the Box 101:          Stupid modern designs and why older ones are better
  New Technology 101:                    The new designs that are actually superior and why
  Project Management 101/201/301:        History of project management trends, and how to manage any kind of work
  Managing for Engineers 101/201/301:    Why and how to stop trying to do everything, empowering your staff, data-driven continuous improvement
  Quality Control 101/201:               Improving and maintaining quality
  Legal Bullshit 101/201:                When you are legally responsible and how not to step in it
stoneman24•58m ago
In addition, Team Dynamics 301: A course in Blame Management, handling the traditional "innocent punished, guilty escape/promoted" issue. With an explanation of the meme "Success has 100 fathers/mothers while failure is an orphan."
rippeltippel•1h ago
CSCI 4321: Unlearning Vibe Coding
tdeck•1h ago
> User Experience of Command Line Tools

Is this a good place to complain about tools with long option names that only accept a single dash? I'm thinking of you, `java -jar`.

DeathArrow•1h ago
I propose CSCI 3500: Software Engineering is Not About Writing Code
DeathArrow•1h ago
Also CSCI 3600: Making Optimal Compromises in Software Development
ninetyninenine•25m ago
Unlearning object oriented programming

I think OOP became popular because it feels profound when you first grasp it. There is that euphoric moment when all the abstractions suddenly interlock, when inheritance, polymorphism, and encapsulation seem to dance together in perfect logic. It feels like you have entered a secret order of thinkers who understand something hidden. Each design pattern becomes a small enlightenment, a moment of realization that the system is clever in ways that ordinary code is not.

But if you step back far enough, the brilliance starts to look like ornament. Many of these patterns exist only to patch over the cracks in the paradigm itself. OOP is not a natural way of thinking, but a habit of thinking that bends reality into classes and hierarchies whether or not they belong there. It is not that OOP is wrong, but that it makes you mistake complexity for depth.

Then you encounter functional programming, and the same transformation begins again. It feels mind expanding at first, with the purity of immutable data, the beauty of composability, and the comfort of mathematical certainty. You trade one set of rituals for another: monads instead of patterns, recursion instead of loops, composition instead of inheritance. You feel that familiar rush of clarity, the sense that you have seen through the surface and reached the essence.

But this time the shift cuts deeper. The difference between the two paradigms is not just structural but philosophical. OOP organizes the world by binding behavior to state. A method belongs to an object, and that object carries with it an evolving identity. Once a method mutates state, it becomes tied to that state and to everything else that mutates it. The entire program becomes a web of hidden dependencies where touching one corner ripples through the whole. Over time you code yourself into a wall. Refactoring stops being a creative act and turns into damage control.

Functional programming severs that chain. It refuses to bind behavior to mutable state. Statelessness is its quiet revolution. It means that a function’s meaning depends only on its inputs and outputs. Nothing else. Such a function is predictable, transparent, and portable. It can be lifted out of one context and placed into another without consequence. The function becomes the fundamental atom of computation, the smallest truly modular unit in existence.

That changes everything. In functional programming, you stop thinking in terms of objects with responsibilities and start thinking in terms of transformations that can be freely composed. The program stops feeling like a fortress of interlocking rooms and begins to feel like a box of Lego bricks. Each function is a block, self-contained, perfectly shaped, designed to fit with others in infinitely many ways. You do not construct monoliths; you compose arrangements. When you need to change something, you do not tear down the wall. You simply reassemble the bricks into new forms.
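
A tiny Python sketch of the contrast being described (example mine, not from the original text): behavior bound to mutable state versus pure transformations that compose like bricks.

  from functools import reduce

  # OOP style: behavior bound to mutable state; every caller now shares it.
  class Cart:
      def __init__(self):
          self.total = 0.0
      def add(self, price):
          self.total += price

  # Functional style: pure functions, composed freely.
  def add_tax(rate):
      return lambda price: price * (1 + rate)

  def apply_discount(amount):
      return lambda price: max(price - amount, 0.0)

  def compose(*fns):
      # Chain functions left to right: compose(f, g)(x) == g(f(x)).
      return lambda x: reduce(lambda acc, f: f(acc), fns, x)

  price_pipeline = compose(apply_discount(5.0), add_tax(0.08))
  print(price_pipeline(100.0))  # ~102.6; reorder or swap the bricks without touching the rest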

This is the heart of functional nirvana: the dream of a codebase that can be reorganized endlessly without decay. Where every part is both independent and harmonious, where change feels like play instead of repair. Most programmers spend their careers trying to reach that state, that perfect organization where everything fits together, but OOP leads them into walls that cannot move. Functional programming leads them into open space, where everything can move.

Reality will always be mutable, but the beauty of functional programming is that it isolates that mutability at the edges. The pure core remains untouched, composed of functions that never lie and never change. Inside that core, every function is both a truth and a tool, as interchangeable as Lego bricks and as stable as mathematics.

So when we ask which paradigm handles complexity better, the answer becomes clear. OOP hides complexity behind walls. Functional programming dissolves it into parts so small and transparent that complexity itself becomes optional. The goal is not purity for its own sake, but freedom; the freedom to recompose, reorganize, and rethink without fear of collapse. That is the real enlightenment: when your code stops feeling like a structure you maintain and starts feeling like a universe you can endlessly reshape.

EigenLord•13m ago
I wish more comp sci curricula would sprinkle in more general courses in logic and especially 20th century analytic philosophy. Analytic philosophy is insanely relevant to many computer science topics especially AI.
beardyw•9m ago
> It's about being able to implement your ideas.

No, it's about creating something which does something useful and is easy to maintain. Plumbers have great ideas and approaches, but you just want plumbing which works and can be fixed.

It's time developers realised they are plumbers not **** artists.

rkomorn•3m ago
> It's time developers realised they are plumbers not artists.

If we're plumbers, why all the memory leaks?

Or... are we also bad plumbers?

gt5050•2m ago
One other course that all undergrads should take is Unix 101. Just the basics of Unix (bash, cat, sed, awk): how to grep logs, tail files.