frontpage.

Overseas renminbi lending surges as China steps up campaign to de-dollarise

https://www.ft.com/content/4577100f-8b71-4647-8e7e-fead115d9552
1•petethomas•28s ago•0 comments

Play virtual tambola online with friends and colleagues

https://tambola.online/
1•remotemonk•2m ago•0 comments

Trump cancels trade negotiations with Canada over an ad

https://www.msn.com/en-ae/news/other/trump-abruptly-terminates-all-trade-negotiations-with-canada...
1•SanjayMehta•6m ago•0 comments

Pipe Logic

https://www.linusakesson.net/programming/pipelogic/index.php
1•KnuthIsGod•8m ago•0 comments

RFC 863 – Discard Protocol

https://datatracker.ietf.org/doc/html/rfc863
1•gurjeet•10m ago•0 comments

Trump says he is ending trade negotiations with Canada

https://www.ft.com/content/72c04f48-d221-487b-9858-1a35a577d880
1•zerosizedweasle•12m ago•0 comments

One List To Rule Them All – 100 CSS features from the past 5 years

https://nerdy.dev/cascading-secret-sauce
1•alwillis•14m ago•0 comments

Mechanical Principles (1930) by Ralph Steiner [4min selection] [video]

https://www.youtube.com/watch?v=mkQ2pXkYjRM
1•thunderbong•14m ago•0 comments

'Attention is all you need' coauthor says he's 'absolutely sick' of transformers

https://venturebeat.com/ai/sakana-ais-cto-says-hes-absolutely-sick-of-transformers-the-tech-that-...
1•achow•17m ago•0 comments

Redpoint Claims $1.8T Up for Grabs for AI Apps in New Report

https://www.upstartsmedia.com/p/redpoint-ai64-apps-cursor
1•gmays•23m ago•0 comments

Show HN: A Complete Dokploy Deployment Guide for Next.js Project

https://nexty.dev/docs/start-project/dokploy
1•weijunext•23m ago•1 comment

ChatGPT Usage Limits: What They Are and How to Get Rid of Them

https://www.bentoml.com/blog/chatgpt-usage-limits-explained-and-how-to-remove-them
1•bbzjk7•32m ago•0 comments

HN: Multi-Search Engine

https://instawarp.com/
1•moxscale•40m ago•1 comment

JupyterGIS breaks through to the next level

https://eo4society.esa.int/2025/10/16/jupytergis-breaks-through-to-the-next-level/
8•arjxn-py•44m ago•0 comments

Driven Down: Amazon delivery drivers and workplace technologies

https://www.dair-institute.org/projects/driven-down/
1•pmw•50m ago•1 comment

Show HN: Remember the last time you used Sticky Notes?

2•VatanaChhorn•1h ago•0 comments

Google's first carbon capture and storage project

https://blog.google/outreach-initiatives/sustainability/first-carbon-capture-storage-project/
1•radeeyate•1h ago•0 comments

Trump Imposes Sanctions on Russian Oil Companies

https://www.nytimes.com/2025/10/22/us/politics/trump-sanctions-russia-ukraine.html
3•Animats•1h ago•2 comments

US Tariff Negotiations with Canada Terminated over Advertisement

https://www.bbc.com/news/articles/cdjrlmd4pmeo
6•mlhpdx•1h ago•4 comments

I spent 200 hours building a Postgres chat client

https://elijahrogers.dev/2025/10/03/i-spent-200-hours-building-a-postgres-chat-client.html
1•_vaporwave_•1h ago•0 comments

Untapped Potential in the Java Build Tool Experience

https://javapro.io/2025/10/23/untapped-potential-in-the-java-build-tool-experience/
2•lihaoyi•1h ago•0 comments

IDEaS fictional intelligence contest: Polar paradigms 2045 – Defending Canada

https://www.canada.ca/en/department-national-defence/programs/defence-ideas/element/contests/chal...
2•neom•1h ago•0 comments

We are short DoorDash, Inc

https://img1.wsimg.com/blobby/go/cc91fda7-4669-4d1b-81ce-a0b8d77f25ab/downloads/b0a1f32a-9b7b-414...
3•tibbar•1h ago•0 comments

Tech PACs Are Closing in on the Almonds

https://www.astralcodexten.com/p/tech-pacs-are-closing-in-on-the-almonds
1•paulpauper•1h ago•0 comments

Purpose Designed for Scale: How We Built It – Lithic

https://www.lithic.com/blog/purpose-designed-for-scale-how-we-built-it
1•matthewbauer•1h ago•0 comments

To be replaced by AI is a choice

https://peter.demin.dev/12_articles/75-replaced-by-AI.html
2•peterdemin•1h ago•1 comment

Atlassian's status page is down – Links to itself for help

https://status.atlassian.com/
1•abe-101•1h ago•2 comments

Speed vs. Velocity: The Difference Between Moving Fast and Moving Forward

https://read.thecoder.cafe/p/speed-vs-velocity
1•thunderbong•2h ago•1 comment

Roc Camera

https://roc.camera/
40•martialg•2h ago•49 comments

Saturn Data – FPGA-accelerated server for high-memory high-bandwidth workloads [video]

https://www.youtube.com/watch?v=ENr8YdCH71E
5•justicz•2h ago•2 comments

Computer science courses that don't exist, but should (2015)

https://prog21.dadgum.com/210.html
113•wonger_•2h ago

Comments

fred_is_fred•2h ago
The Classical Software Studies course would be quite useful. Go write a game in 64KB of RAM in BASIC. It would really stretch your creativity and coding skills.
jasonthorsness•1h ago
Agreed, it would be very interesting to see some of the care taken for resource management that is lost now because every machine has “enough” RAM and cycles…
andoando•1h ago
There's generally courses on embedded
corysama•1h ago
I've long thought the Game Boy Advance would make a great educational platform. Literally every aspect of the hardware is memory mapped. Just stuff values into structs at hard-coded addresses and stuff happens. No need for any OS or any API at all.
lmm•1h ago
I think working on that kind of system would be actively harmful for most programmers. It would give them a totally unbalanced intuition for what the appropriate tradeoff between memory consumption and other attributes (maintainability, defect rate, ...) is. If anything, programmers should learn on the kind of machine that will be typical for most of their career - which probably means starting with a giant supercomputing cluster to match what's going to be in everyone's pocket in 20 years' time.
bruce511•40m ago
Ha. You call it "history". I call it "childhood". I did that years before getting to Uni :)

Although, to be fair, while it was helpful practice at coding, I'm not a game designer, so the game itself was too terrible to play.

First year of Uni, though, I spent too many hours in the lab competing with friends to recreate arcade games on the PC. Skipping the game design part was helpful. By then we had a glorious 640K of RAM. Some Assembly required.

epalm•2h ago
Where’s the course on managing your reaction when the client starts moving the goal posts on a project that you didn’t specify well enough (or at all), because you’re a young eager developer without any scars yet?
dfex•1h ago
some scars need to be earned
ekidd•1h ago
Back in the 90s, this was actually a sneaky part of Dartmouth's CS23 Software Engineering course. At least half your grade came from a 5-person group software project which took half a semester. The groups were chosen for you, of course.

The professors had a habit of sending out an email one week before the due date (right before finals week) which contained several updates to the spec.

It was a surprisingly effective course.

(Dartmouth also followed this up with a theory course that often required writing about 10 pages of proofs per week. I guess they wanted a balance of practice and theory, which isn't the worst way to teach CS.)

collingreen•1h ago
In uni we had a semester-long group project with the stronger coders as project leaders. Group leaders controlled pass/fail for the group members and vice versa. After the leaders put together project plans for their teams and everyone was supposed to start working - WHOOPSIE, reorg - all the leaders got put on random teams and new people got "promoted" into leader (of groups whose plans they didn't create). If the project didn't work at the end, the entire group failed.

I've never been taught anything more clearly than the lessons from that class.

markus_zhang•1h ago
Shit, I was thinking about exactly the same thing: a professor deliberately changing requirements in the last week to mess with the students and give them a bit of a taste of real work.

Glad that someone actually did this.

hotstickyballs•1h ago
That's the job of a (good) manager.
h4ck_th3_pl4n3t•1h ago
It's called test driven development.
mnky9800n•1h ago
I am happy to sign up for all these classes. Tbh this is what Coursera or whatever should be. Not yet another set of machine learning lectures with notebooks you click run on.
pluto_modadic•1h ago
"when you need shiny tech (you don't)" (aka, when to use postgres/sqlite)
LPisGood•1h ago
It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.

Jenkins, Docker, Kubernetes - none of these sorts of things. And I don't even mean these specific technologies; nothing even in their ballpark.

hotstickyballs•1h ago
Personally it felt quite natural once I started working on real software projects.
LPisGood•1h ago
I have never felt that way, and I’ve worked on a variety of projects at a variety of companies.

Everyone has a bespoke mishmash of nonsense pipelines, build tools, side cars, load balancers, Terragrunt, Terraform, Tofu, Serverless, Helm charts, etc.

There are enough interesting things here that you wouldn't even need to make a tool-heavy, project-style software engineering course - you could legitimately make a real-life computer science course that studies the algorithms and patterns involved.

bckr•1h ago
Merging strategies, conflict resolution, bisect debugging, and version control in general are very computer sciencey (see the bisect sketch below).

Would make a great course.
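
A toy sketch of the binary-search idea behind bisect debugging, in Python (my own illustration; is_bad stands in for whatever per-commit test you run, and it assumes the history starts good and ends bad - this is not git's actual implementation):

    def find_first_bad(commits, is_bad):
        # commits are ordered oldest -> newest; assume the defect is
        # monotonic: once a commit is bad, all later commits are bad.
        lo, hi = 0, len(commits) - 1  # invariant: commits[hi] is bad
        while lo < hi:
            mid = (lo + hi) // 2
            if is_bad(commits[mid]):
                hi = mid        # first bad commit is mid or earlier
            else:
                lo = mid + 1    # first bad commit is after mid
        return commits[lo]

    # e.g. find_first_bad(history, lambda c: not tests_pass(c))
    # pinpoints the culprit in O(log n) builds instead of O(n).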

LPisGood•58m ago
That’s not even getting into the fact that you could basically teach multiple graduate level distributed computing courses with k8s as a case study
andoando•1h ago
Well this has nothing to do with computer science so there's that
LPisGood•1h ago
I could not disagree with you more.

It would be easy for me to agree with you. I hold a graduate degree in computer science and I’m named for my contributions proofreading/correcting a graduate text about algorithms in computability theory.

I love abstraction and algorithms and pure theory, but this whole “computer science is a branch of mathematics, nothing more” idea has always struck me as ridiculous. Are you willing to throw out all of the study of operating systems, networking, embedded systems, security (hardware and software), a good chunk of AI, programming languages, UI/UX/human computer interaction, graphics, just to draw a line around algorithms and Turing Machines and say this is all there is to computer science?

andoando•1h ago
Cryptography is all math; networking is largely math and algorithms (IMO, yes, this should really be replaced with information theory - just understanding Shannon's paper would have been more valuable than learning how routers work; see the sketch at the end of this comment); AI is mostly statistics (and AI as a whole, I'd argue, is the essence of computer science); graphics is largely math and algorithms.

Yes, I very much think a computer science degree should stay as close to the foundations of theory as possible. And learning Jenkins and Kubernetes, or even a general course on how to effectively push code, is still far from the things you listed.

There's so much computer science that isn't even covered yet that I'd include it before adding courses on CI/CD.
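
The sketch: the central quantity of Shannon's paper fits in a few lines of Python (my own illustration, purely to show the flavor of the math):

    import math

    def entropy(probs):
        # Shannon entropy H = -sum(p * log2(p)), in bits per symbol:
        # the lower bound on average code length for a source that
        # emits symbols with these probabilities.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per flip
    print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits per flip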

LPisGood•1h ago
Yeah, cryptography is mostly (but certainly not all) math, but it accounts for a negligible (pun intended) portion of interesting security work.

AI is a lot of math, especially if you hang out with the quasiconvex optimization crowd, but the vast majority of work in that field cannot properly be called “theory”.

I think it’s clear in practice that computer science has officially strayed beyond whatever narrow bounds people originally wished to confine it to.

andoando•49m ago
A lot of it is a push for practicality / catering to students' interests. IMO it's the result of a really archaic education system. Universities were originally small and meant for theoretical study, not a de facto path everyone enrolls in to get a job.

If it were me, I'd get rid of statically defined 4-year programs, and/or fixed required courses for degrees, or degrees in general. Just offer courses and let people come learn what they want.

One of my favorite classes was a Python class focused on building simple games with tkinter, making a chat client, and hosting a server, because it was the first time I understood how actual software worked. I'm really glad I took that class.

On the other hand, I'd love to have learned information theory, lambda calculus, all the early AI, cognitive science, the theory of programming languages, and the philosophy behind all of it that got us here.

LPisGood•39m ago
Your point is well taken and to some extent I agree, but I think you have to recognize it's not just student interest, career preparation, and practicality.

The research done by professional academic computer scientists also reflects the broad scope I’m advocating for.

dsr_•1h ago
You've just named most of the interesting stuff.
raphman•57m ago
I'm doing this in the software engineering¹ course I teach. However:

a) Practical CI/CD requires understanding of (and some practical experience with) many other concepts like Linux shell scripting, version control, build automation, containers, server administration, etc. As few students (in our degree programme) have sufficient experience with these, I spend about half of the semester teaching such devops basics instead of actual software engineering.

b) Effectively, teaching CI/CD means teaching how GitHub or GitLab do CI/CD. I feel a little bit uncomfortable teaching people how to use tech stacks owned by a single company.

¹) Actually it's more a course on basic software craftsmanship for media informatics students because no such course exists in our curriculum and I find it more important that students learn this than that they understand the V model.

wpollock•41m ago
> It is interesting that no software engineering or computer science course I’ve seen has ever spent any time on CI/CD.

It's hard to fit everything a student needs to know into the curriculum. Someone else posted here that they had 10 pages of proofs per week, for one course. I would have been fired for assigning so much homework!

I was a CS professor at a local college. My solution was to ignore the CS1 and CS2 curriculum (we were not ABET accredited, so that was okay) in the second Java programming course. Instead, I taught students Maven/Gradle, Git and GitHub, workflows, CI/CD, regular expressions, basic networking, basic design patterns, Spring Boot, and in general everything I thought new programmers ought to know. I even found a book that covered much of this material, but in the end I wrote my own learning materials and didn't use a book.

The course was a victim of its own success. The school mandated the course for non-Java programmers too, resulting in a lot of push-back from the non-Java students.

If anyone is interested, I have the syllabus online still (I've since retired) at <https://wpollock.com/>. Look for COP2800 and COP2805C. I can also send the Java teaching materials as a PDF to anyone interested (book length, but sadly not publishable quality).

corysama•1h ago
> CSCI 3300: Classical Software Studies

Alan Kay, my favorite curmudgeon, spent decades trying to remind us we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since. He’s still disappointed because very few programmers are ever introduced to the history of computer science in the way that artists study the history of art or philosophers the history of philosophy.

PaulDavisThe1st•1h ago
The history of art or philosophy spans millenia.

The effective history of computing spans a lifetime or three.

There's no sense comparing the two. In the year 2500 it might make sense to be disappointed that people don't compare current computational practices with things done in 2100 or even 1970, but right now, to call what we have "history" does a disservice to the broad meaning of that term.

Another issue: art and philosophy have very limited or zero dependence on a material substrate. Computation has overwhelming dependence on the performance of its physical substrate (by various metrics, including but not limited to: CPU speed, memory size, persistent storage size, persistent storage speed, network bandwidth, network scope, display size, display resolution, input device characteristics, sensory modalities accessible via digital-to-analog conversion, ...). To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

degamad•1h ago
> art [has] very limited or zero dependence on a material substrate

This seems to fundamentally underestimate the nature of most artforms.

degamad•1h ago
> To assert that the way problems were solved in 1970 obviously has dramatic lessons for how to solve them in 2025 seems to me to completely miss what we're actually doing with computers.

True they might not all be "dramatic lessons" for us, but to ignore them and assume that they hold no lessons for us is also a tragic waste of resources and hard-won knowledge.

citizenpaul•1h ago
It's because CS is not cared about as a true science, for the most part. Nearly all of the field is focused on consolidating power and money dynamics. No one cares to write a comprehensive history, since it might give your competitors an edge.
andrewflnr•1h ago
Your first sentences already suggest one comparison between the histories of computing and philosophy: the history of computing ought to be much easier. Most of it is still in living memory. Yet somehow the philosophy people manage it, while we computing people rarely bother.
markus_zhang•1h ago
I always think it is great value to have a whole range of history of X courses.

I once thought about a series of PHYS classes focused on historical ideas and experiments. Students would replicate the experiments, reading the original book chapters and papers.

andrewflnr•28m ago
That's not that far off the standard physics course, is it? Certainly lots of labs I took were directly based on historical experiments.
jancsika•1h ago
If Alan Kay doesn't respond directly to this comment, what is Hacker News even for? :)

You're not wrong about history, but that only strengthens Kay's case. E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025. (Note the modeless GUI interaction.) Well, guess what? Such a thing definitely doesn't exist. And that can only point to programmers having a general lack of knowledge about their incredibly short history.

(In reality my hope is some slice of devs have achieved this and I've summoned links to their projects by claiming the opposite on the internet.)

Edit: just so I get the right incantation for summoning links-- I'm talking about the whole enchilada of a visual language that runs and rebuilds the user's flowchart program as the user modelessly edits it.

1: https://www.youtube.com/watch?v=QQhVQ1UG6aM

wredcoll•49m ago
> E.g., our gazillion-times better physical substrate should have led an array of hotshot devs to write web apps that run circles around GraIL[1] by 2025

Why? What problem did it solve that we're suffering from in 2025?

corysama•40m ago
https://news.ycombinator.com/user?id=alankay has not been active on Hacker News for several years now.

At 85 he has earned the peace of staying away from anything and everything on the internet.

Exoristos•39m ago
> The history of art or philosophy spans millen[n]ia.

And yet for the most part, philosophy and the humanities' seminal development took place within about a generation, viz., Plato and Aristotle.

> Computation has overwhelming dependence on the performance of its physical substrate [...].

Computation theory does not.

chongli•5m ago
> Art and philosophy have very limited or zero dependence on a material substrate

Not true for either. For centuries it was very expensive to paint with blue due to the cost of blue pigments (which were essentially crushed gemstones).

Philosophy has advanced considerably since the time of Plato and much of what it studies today is dependent on science and technology. Good luck studying philosophy of quantum mechanics back in the Greek city state era!

eru•1h ago
Computer scientists should treat history like civil engineering or physics treat their histories. These are subjects that make objective progress.

Art or philosophy might or might not make progress. No one can say for sure. They are bad role models.

rkagerer•1h ago
"These are subjects that make objective progress."

As opposed to ours, where we're fond of subjective regression. ;-P

wnc3141•29m ago
At least for history of economics, I think it's harder to really grasp modern economic thinking without considering the layers it's built upon, the context ideas were developed within etc...
lmm•1h ago
Maybe he managed to work them out and understand them in the '70s, if you believe him. But he has certainly never managed to convey that understanding to anyone else. Frankly I think if you fail to explain your discovery to even a fraction of the wider community, you haven't actually discovered it.
synack•1h ago
When I was at RIT (2006ish?) there was an elective History of Computing course that started with the abacus and worked up to mainframes and networking. I think the professor retired years ago, but the course notes are still online.

https://www.cs.rit.edu/~swm/history/index.html

ponco•1h ago
Reflecting on university, one of the most interesting projects I did was an 'essay on the history of <operating system of your choice>' as part of an OS course. I chose OS X (Snow Leopard), and digging into the history gave me fantastic insights into software development, Unix, and software commercialisation. I echo your Mr Kay's sentiments entirely.
fsckboy•33m ago
>Alan Kay, my favorite curmudgeon, spent decades trying to remind us

Alan Kay giving the same (unique, his own, not a bad) speech at every conference for 50 years is not Alan Kay being a curmudgeon

>we keep reinventing concepts that were worked out in the late 70s and he’s disappointed we’ve been running in circles ever since.

it's Alan Kay running in circles

d_silin•1h ago
"CSCI 4020: Writing Fast Code in Slow Languages" does exist, at least in the book form. Teach algorithmic complexity theory in slowest possible language like VB or Ruby. Then demonstrate how O(N) in Ruby trumps O(N^2) in C++.
d_silin•1h ago
The book in question:

https://www.amazon.ca/Visual-Basic-Algorithms-Ready-Run/dp/0...

LPisGood•1h ago
Python has come a long way. It's never gonna win for something like high-frequency trading, but it will be super competitive in areas you wouldn't expect.
liqilin1567•19m ago
The point is optimizing at the algorithmic and architectural level rather than relying on language speed.
nine_k•19m ago
One of my childhood books compared bubble sort implemented in FORTRAN and running on a Cray-1 with quicksort implemented in BASIC and running on a TRS-80.

The BASIC implementation started to outrun the supercomputer at some surprisingly pedestrian array sizes. I was properly impressed.
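
That comparison is easy to replicate today; a rough sketch in Python (timings are machine-dependent, so treat the numbers as illustrative):

    import random
    import time

    def bubble_sort(a):
        # O(N^2): the FORTRAN-on-a-Cray side of the story
        a = list(a)
        for i in range(len(a)):
            for j in range(len(a) - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
        return a

    data = [random.random() for _ in range(5000)]
    t0 = time.perf_counter()
    bubble_sort(data)
    t1 = time.perf_counter()
    sorted(data)  # Timsort, O(N log N): the BASIC-on-a-TRS-80 side
    t2 = time.perf_counter()
    print(f"bubble: {t1 - t0:.2f}s, built-in sort: {t2 - t1:.4f}s")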

ggm•1h ago
ls and not cat? Does the author have no memory of the cat -v shitstorm?
abdullahkhalids•1h ago
TFA wants the following computer science courses:

Unlearning Object-Oriented Programming: a course on specific software engineering techniques

Classical Software Studies: a course on the history of software tools

Writing Fast Code in Slow Languages: a course on specific engineering techniques

User Experience of Command Line Tools: an engineering design course

Obsessions of the Programmer Mind: a course about engineering conventions and tools.

One day, the name of science will not be so besmirched.

LPisGood•1h ago
People don’t get this up in arms when there is a math course about using math tools or math history or how math is used to solve actual problems, but for some reason they do about computer science.
abdullahkhalids•35m ago
If universities offered a major in which students learned to operate telescopes of various kinds and sizes, and then called it astrophysics, people would be mad too.
LPisGood•6m ago
Astronomy isn't about telescopes, yet astronomers spend lots of time studying and doing research on telescopes.
TZubiri•1h ago
>CSCI 2100: Unlearning Object-Oriented Programming Discover how to create and use variables that aren't inside of an object hierarchy. Learn about "functions," which are like methods but more generally useful. Prerequisite: Any course that used the term "abstract base class."

This is just a common meme that often comes from ignorance, or a strawman of what OOP is.

>CSCI 4020: Writing Fast Code in Slow Languages Analyze performance at a high level, writing interpreted Python that matches or beats typical C++ code while being less fragile and more fun to work with.

I like this one, but see?

Python is heavily OOP - everything in Python is an object, for example (see the snippet below).

I'm wondering if OP took a basic OOP course or would otherwise be interested in taking one? You can learn about a thing you are against, or even form your opinion after actually learning about it.

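The snippet, checkable from any REPL (standard Python semantics):

    # Everything in Python is an object: ints, functions, classes,
    # even modules. "Unlearning OOP" in Python is about style, not
    # about escaping objects.
    print(isinstance(1, object))    # True
    print(isinstance(len, object))  # True
    print(isinstance(int, object))  # True
    print((42).bit_length())        # even literals have methods: 6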

kelseyfrog•1h ago
CSCI 4810: The Refusal Lab

Simulate increasingly unethical product requests and deadlines. The only way to pass is to refuse and justify your refusal with professional standards.

LPisGood•1h ago
CSCI 4812: The Career Lab

Watch as peers increasingly overpromise and accept unethical product requests and deadlines and leave you high, mighty, and picking up the scraps as they move on to the next thing.

andrewflnr•1h ago
Conventional university degrees already contain practical examples of the principles of both these courses in the form of cheating and the social dynamics around it.
username223•1h ago
I had forgotten about prog21, and I'm impressed how he wrapped up his blog:

> I don't think of myself as a programmer. I write code, and I often enjoy it when I do, but that term programmer is both limiting and distracting. I don't want to program for its own sake, not being interested in the overall experience of what I'm creating. If I start thinking too much about programming as a distinct entity then I lose sight of that.

Programming is a useful skill, even in the age of large language models, but it should always be used to achieve some greater goal than just writing programs.

markus_zhang•1h ago
> CSCI 3300: Classical Software Studies Discuss and dissect historically significant products, including VisiCalc, AppleWorks, Robot Odyssey, Zork, and MacPaint. Emphases are on user interface and creativity fostered by hardware limitations.

Definitely would love that. Reading source code is pretty hard for newbies like me. Some guidance is appreciated.

zzo38computer•1h ago
I think it would be a good idea, especially CSCI 3300. (Learning this in a course is not the only way to learn computing and other stuff, but it is (and should be) one way to do so.)

(However, CSCI 2100 shouldn't be necessary if you learn stuff other than OOP the first time around, even if you also learn OOP.)

a1studmuffin•32m ago
I really don't understand the modern hate towards OOP. From my experience over the last few decades working with large C and C++ codebases, the former turns into a big ball of mud first.
omosubi•1h ago
I would add debugging as a course. Maybe they teach this somewhere, but learning how to dig into the root cause of defects, and the various tools for doing so, would have been enormously helpful for me.
a-dub•1h ago
> CSCI 2100: Unlearning Object-Oriented Programming

but 90s fashion is all the rage these days!

ugh123•1h ago
Not really "computer science" topics, but useful nonetheless.
zdc1•1h ago
> PSYC 4410: Obsessions of the Programmer Mind

I was definitely guilty of this in my last role. Some of my refactorings were good and needed, but also a distraction from saying the codebase was "good enough" and focusing on the broader people/team/process problems around me.

raldi•57m ago
CSCI 4083: Developing Software Alongside Other Human Beings
gonzo41•53m ago
CSCI 2170: User Experience of Command Line Tools.

This should exist and the class should study openssl.

journal•48m ago
We can take a further step back and say that no one really knows how to do pagination, and those who do are too sick of it to teach it to others.
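
For what it's worth, the usual crux is OFFSET vs. keyset ("cursor") pagination; a minimal sketch in Python with SQLite (hypothetical posts table, my own illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
    conn.executemany("INSERT INTO posts (title) VALUES (?)",
                     [(f"post {i}",) for i in range(100)])

    def page_after(cursor_id, limit=20):
        # Keyset pagination: seek past the last-seen id instead of
        # using OFFSET, so each page is an index seek (O(limit)) and
        # rows aren't skipped or duplicated when new rows arrive.
        rows = conn.execute(
            "SELECT id, title FROM posts WHERE id > ? ORDER BY id LIMIT ?",
            (cursor_id, limit)).fetchall()
        next_cursor = rows[-1][0] if rows else None
        return rows, next_cursor

    rows, cur = page_after(0)    # first page
    rows, cur = page_after(cur)  # next page resumes where we left off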
tomhow•41m ago
Previously:

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=16127508 - Jan 2018 (4 comments)

Computer science courses that don't exist, but should (2015) - https://news.ycombinator.com/item?id=13424320 - Jan 2017 (1 comment)

Computer science courses that don't exist, but should - https://news.ycombinator.com/item?id=10201611 - Sept 2015 (247 comments)

casey2•12m ago
ls should have just been for listing files; giving it 10 flags is one of the big mistakes of early UNIX design.