frontpage.
SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
97•valyala•4h ago•16 comments

The F Word

http://muratbuffalo.blogspot.com/2026/02/friction.html
43•zdw•3d ago•8 comments

Brookhaven Lab's RHIC concludes 25-year run with final collisions

https://www.hpcwire.com/off-the-wire/brookhaven-labs-rhic-concludes-25-year-run-with-final-collis...
23•gnufx•2h ago•19 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
55•surprisetalk•3h ago•54 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
97•mellosouls•6h ago•175 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
100•vinhnx•7h ago•13 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
143•AlexeyBrin•9h ago•26 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
850•klaussilveira•1d ago•258 comments

I write games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
138•valyala•4h ago•109 comments

First Proof

https://arxiv.org/abs/2602.05192
68•samasblack•6h ago•52 comments

Show HN: A luma dependent chroma compression algorithm (image compression)

https://www.bitsnbites.eu/a-spatial-domain-variable-block-size-luma-dependent-chroma-compression-...
7•mbitsnbites•3d ago•0 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
1093•xnx•1d ago•618 comments

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
64•thelok•6h ago•10 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
235•jesperordrup•14h ago•80 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
519•theblazehen•3d ago•191 comments

Reinforcement Learning from Human Feedback

https://rlhfbook.com/
94•onurkanbkrc•9h ago•5 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
31•momciloo•4h ago•5 comments

Selection Rather Than Prediction

https://voratiq.com/blog/selection-rather-than-prediction/
13•languid-photic•3d ago•4 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
259•alainrk•8h ago•425 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
186•1vuio0pswjnm7•10h ago•266 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
48•rbanffy•4d ago•9 comments

France's homegrown open source online office suite

https://github.com/suitenumerique
615•nar001•8h ago•272 comments

72M Points of Interest

https://tech.marksblogg.com/overture-places-pois.html
36•marklit•5d ago•6 comments

We mourn our craft

https://nolanlawson.com/2026/02/07/we-mourn-our-craft/
348•ColinWright•3h ago•414 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
124•videotopia•4d ago•39 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
99•speckx•4d ago•115 comments

Show HN: Kappal – CLI to Run Docker Compose YML on Kubernetes for Local Dev

https://github.com/sandys/kappal
33•sandGorgon•2d ago•15 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
211•limoce•4d ago•119 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
288•isitcontent•1d ago•38 comments

History and Timeline of the Proco Rat Pedal (2021)

https://web.archive.org/web/20211030011207/https://thejhsshow.com/articles/history-and-timeline-o...
20•brudgers•5d ago•5 comments

Derivation and Intuition behind Poisson distribution

https://antaripasaha.notion.site/Derivation-and-Intuition-behind-Poisson-distribution-1255314a56398062bf9dd9049fb1c396
105•sebg•9mo ago

Comments

meatmanek•9mo ago
Poisson distributions are sort of like the normal distribution of queuing theory, for two main reasons:

1. They're often a pretty good approximation for how web requests (or whatever task your queuing system deals with) arrive into your system, as long as your traffic is predominantly driven by many users who each act independently. (If your traffic is mostly coming from a bot scraping your site that sends exactly N requests per second, or holds exactly K connections open at a time, the Poisson distribution won't hold.) Sort of like how the normal distribution shows up any time you sum up enough random variables (central limit theorem), the Poisson arrival process shows up whenever you superimpose enough uncorrelated arrival processes together: https://en.wikipedia.org/wiki/Palm%E2%80%93Khintchine_theore...

2. They make the math tractable -- you can come up with closed-form solutions for e.g. the probability distribution of the number of users in the system, the average waiting time, average number of users queuing, etc: https://en.wikipedia.org/wiki/M/M/c_queue#Stationary_analysi... https://en.wikipedia.org/wiki/Erlang_(unit)#Erlang_B_formula
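The tractability in point 2 can be made concrete with the Erlang B formula from the linked article, which gives the blocking probability of an M/M/c/c system in closed form. A minimal sketch using the standard recurrence (the load and server figures below are made up for illustration):

```python
def erlang_b(offered_load, servers):
    """Blocking probability for an M/M/c/c system (Erlang B),
    computed with the standard numerically stable recurrence
    B(0) = 1, B(c) = a*B(c-1) / (c + a*B(c-1))."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# 2 erlangs of traffic offered to 2 lines -> 40% of arrivals blocked
print(round(erlang_b(2.0, 2), 3))  # 0.4
```

None of this has a closed form for general (non-Poisson) arrival processes, which is exactly why the Poisson assumption is so popular.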

emmelaich•9mo ago
Useful for understanding load on machines. One case I had was: N machines randomly updating a central database. The database can only handle M queries in one second. What's the chance of exceeding M?

Also related to the Birthday Problem and hash bucket collisions. Though with those you're only interested in low collision counts; with some queues (e.g. the database above) you might be interested in the case when collisions hit a high number.
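A sketch of that overload calculation, assuming each machine's updates are independent so the combined stream is approximately Poisson; the machine count and rates below are hypothetical:

```python
import math

def poisson_tail(lam, m):
    """P(X > m) when X ~ Poisson(lam): the chance that more than m
    queries arrive in one second, via the complement of the CDF."""
    pmf, cdf = math.exp(-lam), 0.0
    for k in range(m + 1):
        cdf += pmf
        pmf *= lam / (k + 1)
    return 1.0 - cdf

# e.g. 50 machines each averaging 0.1 queries/sec -> lam = 5,
# against a database that tops out at M = 10 queries/sec
print(f"{poisson_tail(5.0, 10):.4f}")  # roughly 1.4% of seconds overload
```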

PessimalDecimal•9mo ago
There is another extremely important way in which they are like the normal distribution: both are maximum entropy distributions, i.e. each is the "most generic" within their respective families of distributions.

[1] https://en.wikipedia.org/wiki/Poisson_distribution#Maximum_e...

[2] https://en.wikipedia.org/wiki/Normal_distribution#Maximum_en...

srean•9mo ago
So are the Gamma, binomial, Bernoulli, negative binomial, exponential, and many, many more. Maxent distribution types are very common. In fact, every distribution in the exponential family is a maxent distribution.
DAGdug•9mo ago
What’s special about this treatment? It’s the 101 part of a 101 probability course.
quirino•9mo ago
I really like the Poisson Distribution. A very interesting question I've come across once is:

A given event happens at an average rate of once every 10 minutes. We can see that:

- The expected length of the interval between events is 10 minutes.

- At a random moment in time the expected wait until the next event is 10 minutes.

- At the same moment, the expected time passed since the last event is also 10 minutes.

But then we would expect the interval between two consecutive events to be 10+10 = 20 minutes long. But we know intervals are 10 minutes on average. What happened here?

The key is that by picking a random moment in time, you're more likely to fall into a bigger interval. Sampling a random point in time, the average interval you land in really is 20 minutes long; sampling a random interval, it is 10.

Apparently this is called the Waiting Time Paradox.
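A quick simulation illustrates the paradox, assuming exponential inter-arrival times with mean 10 (as for a Poisson process); sample sizes and seed are arbitrary:

```python
import bisect
import random

random.seed(42)
MEAN = 10.0  # minutes between events, on average

# Inter-event gaps of a Poisson process are exponential with mean 10.
gaps = [random.expovariate(1 / MEAN) for _ in range(200_000)]
arrivals = []
t = 0.0
for g in gaps:
    t += g
    arrivals.append(t)

# A randomly chosen interval averages ~10 minutes...
print(sum(gaps) / len(gaps))

# ...but the interval containing a uniformly random moment averages ~20,
# because longer intervals are proportionally more likely to be hit.
hits = []
for _ in range(50_000):
    moment = random.uniform(0, arrivals[-1])
    i = min(bisect.bisect(arrivals, moment), len(gaps) - 1)
    hits.append(gaps[i])
print(sum(hits) / len(hits))
```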

fc417fc802•9mo ago
> What happened here?

You went astray when you declared the expected wait and the expected time passed.

Draw a number line. Mark it at intervals of 10. Uniformly randomly select a point on that line. The expected wait and the expected time passed (i.e. the forward and reverse distances) are both 5, not 10. The range is 0 to 10.

When you randomize the event occurrences but keep the interval as an average, you change the range maximum and the overall distribution across the range, but not the expected average values.

pfedak•9mo ago
If it wasn't clear, their statements are all true when the events follow a Poisson process, i.e. have exponentially distributed waiting times.
yorwba•9mo ago
When you randomize the event occurrences, you create intervals that are shorter and longer than average, so a random point is more likely to be in a longer interval, and thus the expected length of the interval containing a random point is greater than the expected length of a random interval.

To see this, consider just two intervals of length x and 2-x, i.e. 1 on average. A random point is in the first interval x/2 of the time and in the second one the other 1-x/2 of the time, so the expected length of the interval containing a random point is x/2 * x + (1-x/2) * (2-x) = x² - 2x + 2, which is 1 for x = 1 but larger everywhere else, reaching 2 for x = 0 or 2.
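The algebra in that two-interval example can be checked numerically with a direct transcription:

```python
def expected_containing_length(x):
    """Two intervals of length x and 2 - x (average length 1).
    A random point lands in the first with probability x/2, so the
    expected length of the interval containing it is as below."""
    return (x / 2) * x + (1 - x / 2) * (2 - x)

print(expected_containing_length(1.0))  # 1.0: equal intervals, no bias
print(expected_containing_length(0.5))  # 1.25: unequal intervals read long
```

Both values match x² - 2x + 2, which is minimized at x = 1.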

fc417fc802•9mo ago
I think I understand my mistake. As the variance of the intervals widens the average event interval remains the same but the expected average distances for a sample point change. (For some reason I thought that average distances wouldn't change. I'm not sure why.)

Your example illustrates it nicely. A more intuitive way of illustrating the math might be to suppose 1 event per 10 minutes but they always happen in pairs simultaneously (20 minute gap), or in triplets simultaneously (30 minute gap), or etc.

So effectively the earlier example that I replied to is the birthday paradox, with N people, sampling a day at random, and asking how far from a birthday you expect to be on either side.

If that counts as a paradox then so does the number of upvotes my reply received.

jwarden•9mo ago
The way I understand it is that with a Poisson process, at every small moment in time there's a small chance of the event happening. This leads to, on average, lambda events occurring during every (larger) unit of time.

But this process has no “memory”, so no matter how much time has passed since the last event, the number of events expected during the next unit of time is still lambda.
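That memorylessness can be checked by simulation; the rate and the 10/30-minute thresholds below are arbitrary choices:

```python
import random

random.seed(1)
LAM = 0.1  # events per minute -> mean wait of 10 minutes

waits = [random.expovariate(LAM) for _ in range(500_000)]

# Unconditional chance of waiting more than 10 minutes...
p_base = sum(w > 10 for w in waits) / len(waits)

# ...vs the same chance given that 30 minutes have already passed
# with no event: P(T > 40 | T > 30).
survivors = [w for w in waits if w > 30]
p_cond = sum(w > 40 for w in survivors) / len(survivors)

print(round(p_base, 3), round(p_cond, 3))  # both ~ e^-1 ~ 0.368
```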

me3meme•9mo ago
From the last event to this event = 10, and from this event to the next event = 10, so the time between the first and the third event is 20. Where is the surprise in the Waiting Time Paradox? Surely I must be missing some key ingredient here.
quirino•9mo ago
The random moment we picked in time is not necessarily an event. The expected time between the event to your left and the one to your right (they're consecutive) is 20 minutes.
me3meme•9mo ago
I think we must use conditional probability, that is, the integral of p(X|A)P(A): for example, the probability that the prior event was 5 minutes ago times the probability that the next one is 10 minutes from the previous one (that is, 1/2). This is like a Markov chain: the probability of the next state depends on the current state.
hammock•9mo ago
Poisson, Pareto/power/Zipf and normal distributions are really important. The top 3 for me. (What am I missing?) And often misused (most often the normal). It's really good to know which to use when.
klysm•9mo ago
Normal is overused, sometimes for sensible reasons though. The CLT is really handy when you have to consider sums.
FilosofumRex•9mo ago
It's surprising that so few people bother to use non-parametric probability distributions. With today's computational resources, there is no need for parametric closed-form models (maybe with the exception of the Normal, for historical reasons); each dataset contains its own distribution.
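In practice, "each dataset contains its own distribution" usually means working with the empirical distribution directly, e.g. by resampling the data with replacement (the bootstrap). A sketch with made-up latency data:

```python
import random

random.seed(3)
data = [2.1, 3.5, 3.7, 4.0, 4.4, 5.2, 6.8, 9.9]  # hypothetical latencies, ms

# The non-parametric "model" is just the data itself: draw from the
# empirical distribution by sampling the observations with replacement.
draws = random.choices(data, k=100_000)

# Resampled statistics converge to the sample statistics (mean = 4.95 here),
# with no parametric family assumed anywhere.
print(sum(draws) / len(draws))
```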
klysm•9mo ago
It’s easier to do MCMC when the distributions at hand have nice analytic properties so you can take derivatives etc. You should also have a very good understanding of the standards distributions and how they all relate to each other
hyperbovine•9mo ago
How hard is it to estimate that distribution for modern high-dimensional data?
jwarden•9mo ago
> What am I missing?

Beta

hammock•9mo ago
What are the common understandable use cases for beta distribution, in everyday life?
jwarden•9mo ago
I don’t use probability distributions in everyday life ;)

But it is the right distribution to represent uncertainty about the probability of binary events (eg a website user clicking some button). For example, if I have absolutely no idea the probability then I use the uniform distribution, Beta(1,1), which is the maximum entropy distribution. Then if I observe one user and they happen to click, I have Beta(2,1), and at a glance I known the mean of that (2/3) which is a useful point estimate.
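That update is just incrementing one of the two Beta parameters per observation; a minimal sketch (`beta_update` is a hypothetical helper name, not a library function):

```python
def beta_update(a, b, clicked):
    """Bayesian update of a Beta(a, b) belief about a click
    probability after observing one user: a counts clicks + 1,
    b counts non-clicks + 1."""
    return (a + 1, b) if clicked else (a, b + 1)

a, b = 1, 1                            # Beta(1,1): uniform prior, total ignorance
a, b = beta_update(a, b, clicked=True)
print(a, b, a / (a + b))               # Beta(2,1), posterior mean 2/3
```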

klysm•9mo ago
Proportions of things frequently follow beta distributions. I think of it as the normal distribution for the domain 0 to 1.
cwmoore•9mo ago
Lightbulbs burn out, but when?
klysm•9mo ago
Later
digger495•9mo ago
Steve, le
joe_the_user•9mo ago
I can understand a message that javascript needs to be enabled for your ** site.

But permanently redirecting so I can't see it even after I enable javascript is just uncool, and might not endear one on a site like HN where lots of folks disable JS initially.

Edit: and anonymizing, disabling and reloading... It's just text with formatted math. Sooo many other solutions to this, jeesh guys.

_0ffh•9mo ago
It's Notion; I don't know why people use this service.
Zecc•9mo ago
It breaks scrolling with the arrow keys or PgDn/PgUp as well.
Rant423•9mo ago
An application of the Poisson distribution (1946)

https://garcialab.berkeley.edu/courses/papers/Clarke1946.pdf

tatrajim•9mo ago
Famously used by Thomas Pynchon in Gravity's Rainbow. The notion of obtaining a distribution of random rocket attacks blew my young mind and prompted a life-long interest in the study of statistics.
mmorse1217•9mo ago
This site is pretty helpful for me with this sort of thing. The style is more technical though.

https://www.acsu.buffalo.edu/~adamcunn/probability/probabili...

laichzeit0•9mo ago
But this just gives the definition of the distribution. There's no intuition about where it might have come from; it just appears magically out of thin air, along with some properties it has in the limit.
firesteelrain•9mo ago
At work we use Arena to model various systems, and Poisson is our go-to.