Why do we need dithering?

https://typefully.com/DanHollick/why-do-we-need-dithering-Ut7oD4k
147•ibobev•3mo ago

Comments

abstractspoon•3mo ago
They answered the question in the first two sentences: We don't need it, it's just an aesthetic nowadays.
debugnik•2mo ago
It's not just an aesthetic; I keep seeing games with color banding because they don't bother to dither before quantizing.
amelius•2mo ago
From the article:

> We don't really need dithering anymore because we have high bit-depth colors so its largely just a retro aesthetic now.

By the way, dithering in video creates additional problems because you want some kind of stability between successive frames.

dTal•2mo ago
Yeah, the article is wrong about that.
amelius•2mo ago
It would be nice if you had some examples.
nofriend•2mo ago
see eg https://xcancel.com/theo/status/1978161273214058786?s=46
slabity•2mo ago
Acerola recently made a video about how Silksong has banding with dark colors due to poor dithering (and how to fix it): https://www.youtube.com/watch?v=au9pce-xg5s

Highly recommended for any graphics programmer who might think dithering is unnecessary or simply an "aesthetic choice".

knollimar•2mo ago
To lend more credibility, the devs added more dithering in the next patch.
dTal•2mo ago
A great many can be found here: https://en.wikipedia.org/wiki/Dither

(also a very nice explanation of why dithering is a fundamental signal processing step applicable to many fields, not just an "aesthetic".)

amelius•2mo ago
Interesting!
Dylan16807•2mo ago
The average desktop computer is running with 8 bit color depth the vast majority of the time, so find or generate basically any wide basic gradient and you'll see it.
doormatt•2mo ago
I think you mean 24 bit. 8 bit would only be 256 colors total.
recursive•2mo ago
8 bits for each of R, G, and B. So a grey-scale gradient indeed has only 256 colors available. Any gradient will also have about that many at most.
kragen•2mo ago
In most gradients, the transitions in R, G, and B are at different places.
recursive•2mo ago
True. Also, in most gradients, the full range of R G and B is not used.

In a gradient from rgb(50, 60, 70) to rgb(150, 130, 120), there are only 220 total transitions (100 in R, 70 in G, 50 in B).

kragen•2mo ago
True!
mrandish•2mo ago
In terms of color spaces, sRGB (the typical baseline default RGB of desktop computing) is quite naive and inefficient. Pretty much its only upsides are its conceptual and mathematical simplicity. There are much more efficient color spaces which use dynamic non-linear curves and are based on how the rods and cones in human eyes sense color.

The current hotness for wide color gamuts and High Dynamic Range is ICtCp (https://en.wikipedia.org/wiki/ICtCp), which is conceptually similar to the LMS color space (https://en.wikipedia.org/wiki/LMS_color_space).

recursive•2mo ago
The same logic applies to any other color space with 24 total bits of resolution.
electroly•2mo ago
A very simple black-to-white gradient can only be, at most, 256 pixels wide before it starts banding on the majority of computers that use SDR displays. HDR only gives you a couple extra bits where each bit doubles how wide the gradient can be before it starts running out of unique color values. If the two color endpoints of the gradient are closer together, you get banding sooner. Dithering completely solves gradient banding.
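
A quick numpy sketch of that arithmetic (the ramp width and noise here are illustrative, not from the comment):

    import numpy as np

    # A 1024-pixel black-to-white ramp has only 256 distinct 8-bit values,
    # so each value must stretch across ~4 adjacent pixels: a visible band.
    width = 1024
    ramp = np.linspace(0.0, 1.0, width)
    quantized = np.round(ramp * 255).astype(np.uint8)
    print(len(np.unique(quantized)))  # 256 -> bands ~4 pixels wide

    # The simplest dither: add +-0.5 LSB of noise before rounding,
    # trading the bands for fine, unstructured noise.
    rng = np.random.default_rng(0)
    dithered = np.round(ramp * 255 + rng.uniform(-0.5, 0.5, width)).astype(np.uint8)
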
Sesse__•2mo ago
You can make do with a static dither pattern (I've done it, and it works well). It's a bit of a trade-off between banding and noise, but at least static stuff stays static and thus easily compressible.
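
A static pattern can be as small as the classic 4x4 Bayer matrix. A minimal numpy sketch of that approach (illustrative, not the poster's actual code), thresholding a grayscale image to 1 bit:

    import numpy as np

    # Classic 4x4 Bayer ordered-dither matrix, normalized to [0, 1).
    BAYER4 = np.array([[ 0,  8,  2, 10],
                       [12,  4, 14,  6],
                       [ 3, 11,  1,  9],
                       [15,  7, 13,  5]]) / 16.0

    def ordered_dither_1bit(gray):
        # gray: float array in [0, 1]; returns 0/1 pixels. The threshold
        # pattern is fixed per screen position, so static content stays
        # static from frame to frame and compresses well.
        h, w = gray.shape
        thresholds = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
        return (gray > thresholds).astype(np.uint8)
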
TinkersW•2mo ago
The article is simply wrong: dithering is still widely used, and no, we do not have enough color depth to avoid it. Go render a blue-sky gradient without dithering and you will see obvious bands.
mrandish•2mo ago
Yep, even high-quality 24-bit uncompressed imagery often benefits from dithering, especially if it's synthetically generated. Even natural imagery that's been processed or manipulated, even mildly, will probably benefit. And if it's a digital photograph, it was probably already dithered during the de-Bayering process.
jchw•2mo ago
Dithering can be for aesthetic reasons, I presume especially old-school dithering that is especially pronounced. However, dithering is actually still useful in all sorts of signal processing, particularly when there are perceptible artifacts of quantization. This occurs all the time: you can trivially observe it by making gradients that go between close looking colors, something you can see on the web right now. There are many techniques to avoid banding like this, but dithering lets you hide banding without needing increased bit depth or choosing strategic stop colors by trading off spatial resolution for (perceived) color resolution, which works excellently for gradients because it's all low frequency.

And frankly, it turns out 256 colors is quite a lot of colors especially for a small image, so with a very good quantization algorithm and a very good dithering algorithm, you can seriously crunch a lot of things down to PNG8 with no obvious loss in quality. I have done this at many of my employers, armed with other tricks, to dramatically reduce page load sizes.
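
A sketch of that kind of crunch (assuming a recent Pillow; the Dither enum and the dither argument to quantize are newer additions, and the filenames are placeholders):

    from PIL import Image

    img = Image.open("photo.png").convert("RGB")

    # Adaptive 256-color palette plus Floyd-Steinberg error diffusion:
    # spatial noise is traded for perceived color resolution.
    png8 = img.quantize(colors=256, dither=Image.Dither.FLOYDSTEINBERG)
    png8.save("photo-png8.png", optimize=True)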

matja•2mo ago
Dithering isn't only applied to 2D graphics, it can be applied in any type of spatial or temporal data to reduce the noise floor, or tune aliasing distortion noise to other parts of the frequency spectrum. Also common in audio.
tinkelenberg•2mo ago
This is the best explanation I’ve come across. I enjoy dithering as a playful way to compress file size when it makes sense.
dcrazy•2mo ago
Slightly frustrating that the author started out with color images and then switched to grayscale.
raajg•2mo ago
This was recently shared on HN: https://visualrambling.space/dithering-part-1/

For anyone interested in seeing how dithering can be pushed to the limits, play 'Return of the Obra Dinn'. Dithering will always remind you of this game after that.

- https://visualrambling.space/dithering-part-1

- https://store.steampowered.com/app/653530/Return_of_the_Obra...

amiga386•2mo ago
On Return of the Obra Dinn's dithering specifically, here is the original developer blog on its dithering: https://forums.tigsource.com/index.php?topic=40832.msg136374...

It's intended, aesthetically, to remind you of Atkinson dithering (https://en.wikipedia.org/wiki/Atkinson_dithering), a variant of Floyd-Steinberg dithering often used in graphics for the black-and-white Macintosh.

susam•2mo ago
This is going to be an odd comment, but I immediately recognised the parrot in the test images. It's the scarlet macaw from 2004 that is used in many Wikipedia articles about colour graphics.

I think this is the original, photographed and contributed by Adrian Pingstone: https://commons.wikimedia.org/wiki/File:Parrot.red.macaw.1.a...

But this particular derivative is the one that appears most often in the Wikipedia articles: https://commons.wikimedia.org/wiki/File:RGB_24bits_palette_s...

This parrot has occurred in several articles on the web. For example, here's one article from a decade or so ago: https://retroshowcase.gr/index.php?p=palette

Parrots are often used in articles and research papers about computer graphics and I think I know almost all the parrots that have ever appeared in computing literature. This particular one must be the oldest computing literature parrot I know!

By the way, I've always been fascinated by dithering ever since I first noticed it in newspapers as a child. Here was a clever human invention that could produce rich images with so little, something I could see every day and instinctively understand as creating the optical illusion of smooth gradients, long before I knew what it was called.

malfist•2mo ago
This also used to be a really common test image: https://en.wikipedia.org/wiki/Lenna

But it's apparently a cropped centerfold from Playboy.

tux3•2mo ago
The original Lenna is controversial, but I'm delighted to share the "ethically sourced Lenna": https://mortenhannemose.github.io/lena/
ziml77•2mo ago
This feels better than the original anyway. I never liked the yellow color that one had. Maybe it was an artistic choice, but to me it just looked degraded, like when white plastic is left exposed to the sun.
cyclotron3k•2mo ago
Agreed. Seemed like a particularly poor choice to show off the capabilities of an image compression algorithm
malfist•2mo ago
Oh my goodness that is delightful
kelseyfrog•2mo ago
What's the impetus behind replacing the image with something even sexier?
DonHopkins•2mo ago
Cute, but still not as sexy as the original Mandrill test image!

https://www.researchgate.net/figure/Original-standard-test-i...

pezezin•2mo ago
Now I regret leaving the machine vision field; I would love to use this picture in a paper xD
lelanthran•2mo ago
> The original Lenna is controversial, but I'm delighted to share the "ethically sourced Lenna": https://mortenhannemose.github.io/lena/

How is this ethically better than the original Lena? The model in that one also expressly approved the usage of the photo for the purposes it was being used for.

mark-r•2mo ago
How did she expressly approve the original, when she didn't know for decades that it was so widely used?
lelanthran•2mo ago
> How did she expressly approve the original, when she didn't know for decades that it was so widely used?

Maybe I read the wrong interview with her, but when she found out about it she expressed happiness about it.

Since this replacement image was created after her interview, how is it ethically better in any way?

dinkelberg•2mo ago
Lena Söderberg expressed her wish for her image to be "retired from tech" in 2019 (see the end of this clip, https://vimeo.com/372265771), when the above alternative image was published.
TacticalCoder•2mo ago
And a poster of Lenna is on the wall of the Richard Hendricks character in the Silicon Valley series. Which makes sense as he's working on a compression algorithm.
adwn•2mo ago
Just a heads-up: you seem to be shadow-banned, all your comments are auto-dead.
msephton•2mo ago
I can see them?
adwn•2mo ago
I just vouched for a few of them, maybe that's why.
jdougan•2mo ago
do you have showdead turned on?
kazinator•2mo ago
It was shot by an actual Hooker, too.
DonHopkins•2mo ago
https://news.ycombinator.com/item?id=42571652

DonHopkins 10 months ago | parent | context | favorite | on: ASCII porn predates the Internet but it's still ev...

EBCDIC porn really punched my cards. ;)

I had to carefully select just the characters that would punch low resolution monochrome pornographic images into the holes of the punch card.

Just joking, I'm not that old -- I started with ASCII line printer porn, like "MC:HUMOR;VICKI BODY", over the government sponsored ARPANET, at 300 baud, so it was like a nice long strip tease on taxpayer dollars. Vicki took almost 4 and a half minutes to finish at that rate, longer during busy weekday business hours. If I recall, the good stuff was all UPPER CASE, which made it much more intense.

https://web.archive.org/web/20210512025608/http://its.svenss...

Decades later, somebody on HN with a sharper eye than I noticed that Vicki's nipples were clearly labeled "A" and "B". Go figure!

HN: Should computer scientists keep the Lena picture? (lemire.me)

https://news.ycombinator.com/item?id=15671629

DonHopkins on Nov 10, 2017 | parent | context | favorite | on: Should computer scientists keep the Lena picture?

Does "AI:HUMOR;VICKI BODY" get grandfathered in, too?

NSFW: MS C0LLINS - 0UI - FEBRUARY 1973:

https://web.archive.org/web/20210512025608/http://its.svenss...

https://en.wikipedia.org/wiki/Grandfather_clause

mercer on Nov 11, 2017 [–]

Is the nipples being marked 'A' and 'B' part of the joke?

DonHopkins on Nov 11, 2017 | parent [–]

As far as I know, those were not the points of the joke. I noticed them for the first time yesterday too, after not noticing them for decades!

As a teen, I'd printed it out, pinned it up on my wall next to the Cray-1 centerfold, and scribbled a bunch of modem phone numbers, user names and passwords all over it, and never even noticed.

I did a quick search for other A's and B's and found that it used those characters as much as any other character for shading, but that sure seems like something some mischievous student, lab member, turist or sentient TECO script at the MIT-AI Lab might have done.

There was no file security so anyone could have edited them in.

Maybe one of Minsky's grad students was performing some A/B testing or eye tracking experiments.

Somebody should ask RMS if EMACS had some special mode for editing line printer porn.

bdamm•2mo ago
I got a big wave of nostalgia for my CorelDRAW experience. Thanks!
yzydserd•2mo ago
Unsurprisingly, macaw test images go way back. There is one in this old Kodak test image dataset that is often used in CG tests.

https://r0k.us/graphics/kodak/kodim23.html

It seems to have been uploaded in 1999 from an old slide dataset.

This seems to be the Photo CD from 1993. I suppose the source goes back earlier.

https://www.math.purdue.edu/~lucier/PHOTO_CD/

ajb•2mo ago
The predecessor of dithering was the art of wood engraving, which reproduced the texture of illustrations in wood blocks used for printing in newspapers and books, necessarily in black and white. It's difficult to imagine now, but there was an entire industry of engravers, many of whom solely practiced engraving others' designs (often supplied drawn directly on the block to be engraved, to save time). These engravers were highly skilled, and the artistry with which a piece was engraved could make a huge difference.

For example see "A treatise on wood engravings : historical and practical", by John Jackson and William Chatto, 1839[1]; here is a quote (p585 of the linked edition):

"With respect to the direction of lines, it ought at all times to be borne in mind by the wood engraver, — and more especially when the lines are not laid in by the designer, — that they should be disposed so as to denote the peculiar form of the object they are intended to represent. For instance, in the limb of a figure they ought not to run horizontally or vertically, — conveying the idea of either a flat surface or of a hard cylindrical form, — but with a gentle curvature suitable to the shape and the degree of rotundity required. A well chosen line makes a great difference in properly representing an object, when compared with one less appropriate, though more delicate. The proper disposition of lines will not only express the form required, but also produce more colour as they approach each other in approximating curves, as in the following example, and thus represent a variety of light and shade, without the necessity of introducing other lines crossing them, which ought always to be avoided in small subjects : if, however, the figures be large, it is necessary to break the hard appearance of a series of such single lines by crossing them with others more delicate."

There was even a period of a few decades after the invention of photography during which it was not known how to mass-produce photographs, and so they were manually engraved just like artworks. Eventually, however, the entire profession became extinct.

[1] https://archive.org/details/treatiseonwooden00chat/page/585/.... (This is the 1881 edition)

yvdriess•2mo ago
This is the kind of nerd food I keep hoping to find on HN comments. Thanks!
christophilus•2mo ago
Malcolm Guite's new epic poem is illustrated that way: https://www.rabbitroom.com/merlinsisle

I love the look of it.

quitit•2mo ago
When teaching print (halftones, LPI, CMYK, etc.) I also use a parrot. I've used rainbows and chameleons too, but settled on the parrot as being the most appropriate. But now I begin to wonder if I was just parroting (har har) a paradigm.
andai•2mo ago
Also by the author: https://www.makingsoftware.com/

Recent discussions:

Making Software - https://news.ycombinator.com/item?id=43678144

How does a screen work? - https://news.ycombinator.com/item?id=44550572

What is a color space? - https://news.ycombinator.com/item?id=45013154

grep_it•2mo ago
I love the author's style!
01HNNWZ0MV43FF•2mo ago
Playdead Games did a really nice presentation about dithering for games; it gets passed around and I'm sure it's been on HN already: https://loopit.dk/banding_in_games.pdf
Dylan16807•2mo ago
> Before we all mute the word 'dithering'

Is this a reply to something?

tshaddox•2mo ago
Yes, it's referencing a tweet which briefly made the rounds a few weeks ago:

https://x.com/TukiFromKL/status/1981024017390731293

Many people believed that the author was claiming to have invented a particular illustration style which involved dithering.

Cockbrand•2mo ago
The cringe is strong with this one.

(Thanks for the heads up, I hadn't seen that)

hcs•2mo ago
You still see dithering from time to time as a cheap transparency. It's been a few years since Mario Odyssey, but that's the last time I recall it really standing out: https://xcancel.com/chriswade__/status/924071608976924673
BigJono•2mo ago
This is what I'm doing for my game, I didn't know it was actually a thing in some big titles too, that's reassuring. I landed on it because it was a huge code simplification compared to every other method of handling transparency, and it doesn't look completely shit in the era of high resolutions and frame rates.
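
The technique is usually called screen-door or stipple transparency; a minimal numpy sketch of the mask (illustrative, not from any of the titles mentioned here):

    import numpy as np

    # 4x4 Bayer thresholds in [0, 1).
    BAYER4 = np.array([[ 0,  8,  2, 10],
                       [12,  4, 14,  6],
                       [ 3, 11,  1,  9],
                       [15,  7, 13,  5]]) / 16.0

    def screen_door_mask(alpha, h, w):
        # Keep a pixel wherever its alpha beats the tiled Bayer threshold.
        # Everything stays in the opaque pass: no sorting, no blending,
        # and fading an object out just lowers alpha so progressively
        # more of its pixels fail the test.
        thresholds = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
        return alpha > thresholds
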
AuryGlenz•2mo ago
I just implemented it for a VR app I’ve been working on where the semi-transparent objects can appear any which way, intersecting, etc. I didn’t realize how much of an issue that’d be…or how hard it’d be to come up with a shader for dithering in VR that doesn’t look awful. I’m still not super happy with what I have - it moves along with the player’s eyes - but every other solution I could come up with didn’t interact well with two screens, especially at far distances from the object. Moire for days.
lepicz•2mo ago
Mafia 2 used that as well (at least for cars appearing in your bubble).
magicalhippo•2mo ago
First time I saw it was in the original Unreal game (1998) when using the software renderer. It had this very distinct asymmetric dithering pattern.

Can't find a screenshot of it at short notice; it seems most screenshots are either of the unrelated newer Unreal Engine or use hardware rendering, which doesn't show this dithering.

mrguyorama•2mo ago
I don't know why but I loathe this feature.

It's pretty much the norm now and I think late UE4 in AAA games is what really pushed it?

It's cheap and simple to set up, and most games rely on TAA to make it less annoying.

But TAA sucks! And TAA encourages all sorts of extremely lazy workflows and graphical effects because it will clean them up a little, but they look so bad without it.

I hate it. Raytracing is part of this. It's all just really big companies with billions of dollars cheaping out on authoring good visuals.

Microsoft Flight Simulator has weird sparkly shadows because it's tracing a few hundred rays and expects you to use TAA to cover it up, and it's so bad. Same exact story for reflections.

So now companies expect you to buy $800 GPUs that chew through half a kilowatt of power so that they can be lazy and not care how poorly they've authored their assets and don't really have to consider anything about their visuals.

It makes me sad.

mrandish•2mo ago
A related bit of tech trivia is that digital audio also often involves dithering, and not just decimated or compressed audio. Even very high-quality studio mastered audio benefits from an audio specific kind of dithering called noise shaping. Depending on the content, studio mixing engineers may choose different noise shaping algorithms.

https://en.wikipedia.org/wiki/Noise_shaping
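
A toy numpy sketch of the idea (plain TPDF dither plus a first-order error-feedback loop; real mastering-grade noise shapers use higher-order, psychoacoustically weighted filters):

    import numpy as np

    def quantize_16bit(x, seed=0):
        # x: float samples in [-1, 1]. TPDF dither (sum of two uniforms)
        # plus first-order noise shaping: each sample's quantization error
        # is subtracted from the next, pushing the noise floor toward
        # high frequencies where hearing is less sensitive.
        rng = np.random.default_rng(seed)
        out = np.empty(len(x), dtype=np.int16)
        err = 0.0
        for i, s in enumerate(x * 32767.0):
            d = rng.uniform(-0.5, 0.5) + rng.uniform(-0.5, 0.5)
            q = round(s - err + d)
            err = q - (s - err)
            out[i] = min(max(q, -32768), 32767)
        return out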

olivia-banks•2mo ago
The figures in this article are really great. How were they made? If I were to try to recreate them, I might render things individually and then lay them out in Illustrator to get that 3D isometric look, but I assume there's a better way.
kragen•2mo ago
We've had a couple of other recent discussions on dithering: https://news.ycombinator.com/item?id=45750954 and https://news.ycombinator.com/item?id=45698323. I commented specifically about the history of blue-noise dithering at https://news.ycombinator.com/item?id=45728231.

The article points out that, historically, RAM limitations were a major incentive for dithering on computer hardware. (It's the reason Heckbert discussed in his dissertation, too.) Palettizing your framebuffer is clearly one solution to this problem, but I wonder if chroma subsampling hardware might have been a better idea?

The ZX Spectrum did something vaguely like this: the screen was 256×192 pixels, and you could set the pixels independently to foreground and background colors, but the colors were provided by "attribute bytes" which each provided the color pairs for an 8×8 region http://www.breakintoprogram.co.uk/hardware/computers/zx-spec.... This gave you a pretty decent simulation of a 16-color gaming experience while using only 1.125 bits per pixel instead of the 4 you would need on an EGA. So you got a near-EGA-color experience on half the RAM budget of a CGA, and you could move things around the screen much faster than on even the CGA. (The EGA, however, had a customizable palette, so the ZX Spectrum game colors tend to be a lot more garish. The EGA also had 4.6× as many pixels.)

Occasionally in ZX Spectrum game videos like https://www.youtube.com/watch?v=Nx_RJLpWu98 you will see color-bleeding artifacts where two sprites overlap or a sprite crosses a boundary between two background colors. For applications like CAD the problem would have been significantly worse, and for reproducing photos it would have been awful.

The Nintendo did something similar, but I think it had four colors per tile instead of two.

So, suppose it was 01987 and your hardware budget permitted 8 bits per pixel. The common approach at the time was to set a palette and dither to it. But suppose that, instead, you statically allocated five of those bits to brightness (a Y channel providing 32 levels of grayscale before dithering) and the other three to a 4:2:0 subsampled chroma (https://www.rtings.com/tv/learn/chroma-subsampling has nice illustrations). Each 2×2 4-pixel block on the display would have one sample of chroma, which could be a 12-bit sample: 6 bits of U and 6 bits of V. Moreover, you can interpolate the U and V values from one 2×2 block to the next. As long as you're careful to avoid drawing text on backgrounds that differ only in chroma (as in the examples in that web page) you'd get full resolution for antialiased text and near-photo-quality images.

That wouldn't liberate you completely from the need for dithering, but I think you could have produced much higher quality images that way than we in fact did with MCGA and VGA GIFs.
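
The budget arithmetic for that hypothetical layout, as a tiny sketch (the numbers follow the comment; nothing else is implied):

    def bits_per_pixel(h, w):
        # 5 bits of Y per pixel, one 6+6-bit UV pair per 2x2 block.
        y_bits = h * w * 5
        uv_bits = (h // 2) * (w // 2) * (6 + 6)
        return (y_bits + uv_bits) / (h * w)

    print(bits_per_pixel(480, 640))  # 5 + 12/4 = 8.0 bits per pixel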

kragen•2mo ago
I just learned that the Yamaha V9958 video chip used in the MSX2+ implemented something very much like this: https://en.wikipedia.org/wiki/YJK
firebot•2mo ago
We really don't anymore.

Back in the late 90s, maybe, when GIFs and other paletted image formats were popular.

I even experimented with them. I designed various formats for The Palace. The most popular was 20-bit (6,6,6,2 RGBA; also 5,5,5,5, but the lack of color was intense: 15 bits versus 18 is quite a difference). This allowed fairly high color with anti-aliasing, i.e. edges that were semi-transparent.

aidenn0•2mo ago
We absolutely still need dithering; 24-bit sRGB is not nearly enough for a large monochromatic gradient to not have visible banding without dithering.
mordae•2mo ago
https://en.wikipedia.org/wiki/Frame_rate_control

Your screen likely uses dithering to produce 1-2 LSBs of each color channel of this piece of graphics right now.

cmovq•2mo ago
Dithering is still very common in rendering pipelines. 8 bits per channel is not enough to capture subtle gradients, and you’ll get tons of banding. Particularly in mostly monochrome gradients produced by light sources. So you render everything to a floating point buffer and apply dithering.

Unlike the examples in this post, this dithering is basically invisible at high resolutions, but it’s still very much in use.
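
The final pass is tiny; a numpy sketch of it (illustrative, not any particular engine's code):

    import numpy as np

    def to_8bit_with_dither(hdr, seed=0):
        # hdr: tonemapped float image in [0, 1]. Adding +-0.5 LSB of
        # noise before rounding turns banding into near-invisible noise.
        noise = np.random.default_rng(seed).uniform(-0.5, 0.5, hdr.shape)
        return np.clip(np.round(hdr * 255.0 + noise), 0, 255).astype(np.uint8)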

black_knight•2mo ago
Another place where dithering is useful in graphics is when you can't take enough samples at every point to get a good estimate of some value. Add jitter to each sample and then blur, and suddenly each point is influenced by the samples made around it, giving higher fidelity.

I recently learned the slogan “Add jitter as close to the quantisation step as possible.” I realised that “quantisation step” is not just when clamping to a bit depth, but basically any time there is an if-test on a continuous value! This opens my mind to a lot of possible places to add dithering!
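
The slogan is easy to demo (my sketch, not from the comment): jitter an input before a hard threshold and the pass rate becomes proportional to the value, so a local average recovers the ramp instead of a step.

    import numpy as np

    rng = np.random.default_rng(0)
    ramp = np.linspace(0.0, 1.0, 100_000)

    hard = ramp > 0.5                                       # a single step edge
    jittered = (ramp + rng.uniform(-0.5, 0.5, ramp.shape)) > 0.5

    # Over the first 60% of the ramp the true mean is 0.3. The hard
    # threshold reports ~0.17; the jittered one preserves ~0.30, because
    # with +-0.5 jitter a value v passes with probability exactly v.
    print(hard[:60_000].mean(), jittered[:60_000].mean())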

01HNNWZ0MV43FF•2mo ago
Hell, one could dither vertex positions and normals
zozbot234•2mo ago
A lot of display hardware uses a combination of spatial and temporal dithering these days. You can see it sometimes if you look up close, it appears as very faint flickering "snow" (the kind you'd see on old analog TV). Ironically, making this kind of dithering even less perceivable may turn out to be the foremost benefit of high pixel resolutions (beyond 1080p) and refresh rates (beyond 120Hz) since it seems that raising those specs is easier than directly improving color depth in hardware.
quitit•2mo ago
Adobe Illustrator 2026 has only -just- added a dithering option to their gradient tool.
kettlecorn•2mo ago
Dithering is super useful in dark scenes in games and movies.

Adding random noise to the screen makes bands of color with harsh transitions imperceptible, and the dithering itself isn't perceptible either.

I'm sure there are better approaches nowadays but in some of my game projects I've used the screen space dither approach used in Portal 2 that was detailed in this talk: https://media.steampowered.com/apps/valve/2015/Alex_Vlachos_...

It's only a 3 line function but the jump in visual quality in dark scenes was dramatic. It always makes me sad when I see streamed content or games with bad banding, because the fix is so simple and cheap!

One thing that's important to note is that it's a bit tricky to make dithering on/off comparisons, because resizing a screenshot of a scene with dithering makes the dithering no longer work unless one pixel in the image ends up exactly corresponding to one pixel on your screen.
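
For reference, a numpy transcription in the spirit of that function (the constants are as I recall them from the slides; see the linked PDF for Valve's exact shader):

    import numpy as np

    def screen_space_dither(h, w):
        # Cheap per-pixel gradient-noise pattern, one value per channel,
        # scaled to +-0.5 LSB of an 8-bit target; add it to the final
        # color just before quantization.
        y, x = np.mgrid[0:h, 0:w].astype(np.float64)
        v = 171.0 * x + 231.0 * y
        d = np.stack([v / 103.0, v / 71.0, v / 97.0], axis=-1)
        return (d - np.floor(d) - 0.5) / 255.0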

mordae•2mo ago
It's not cheap for streaming. It's harder to compress and is lost in the process. The video codec is a smart low-pass filter.
kalleboo•2mo ago
The AV1 codec has support to tell the decoder to generate fake film grain, so you can add back all the noise lost in compression.

Although I don't think it's very widely used, I dunno if that's due to the compressors or decompressors.

kettlecorn•2mo ago
It needs to be done by the client and not be part of the actual video stream, otherwise it doesn't even work. When done by the client it's cheap.
jasomill•2mo ago
Not so cheap if your hardware decoder only supports 8-bit color, which is a common limitation of H.264 decoders in particular.
WheatMillington•2mo ago
What an insanely beautiful website. Reminds me of the golden days of the internet, remastered tastefully.
efilife•2mo ago
What's with the dithering trend? Why do I keep hearing about it everywhere at least once a week? Where did this originate from?
mordae•2mo ago
Physics.
Gigachad•2mo ago
Even though it looks less accurate, I prefer the look of the Ordered Bayer image. It looks artistically low-fi while the others look more like a highly compressed image to me. Considering we are able to just represent images with full colour today, the only reason I'd dither is for the aesthetic.
OCTAGRAM•2mo ago
I have noticed the author uses values 0-255 for shades of grey. When the 0-255 range is used, it is usually a nonlinear scale with an average gamma of 2.2. De facto there is a standard transfer function (sRGB) that maps values 0-31 to a linear segment and the rest to a power of 2.4, for an average power of about 2.2. A checkerboard of black 0 and white 255 is equivalent to a uniform grey shade of 185 or 186, as opposed to 127 or 128. Proper calculations should be done in linear space, and at least 16 bits per channel is desirable.
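
Concretely, the arithmetic in a few lines of Python (standard gamma and sRGB formulas; the exact grey value depends on which curve you assume):

    # A 50/50 checkerboard of black and white averages to 0.5 in linear
    # light. The matching uniform grey after encoding is near 186, not 128.
    print(round(255 * 0.5 ** (1 / 2.2)))   # 186 with a pure 2.2 gamma

    def srgb_encode(c):
        # Piecewise sRGB transfer function: linear toe, then power 2.4.
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    print(round(255 * srgb_encode(0.5)))   # 188 with the exact sRGB curve
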
vismit2000•2mo ago
Earlier submission: https://news.ycombinator.com/item?id=45743067
chromehearts•2mo ago
Can somebody explain to me how dithering is an aesthetic (as mentioned in the article)? I feel I'm too young to understand that
harperlee•2mo ago
It's related to nostalgia. If you lived at the time when dithering was widely used, you will have an emotional response to it. For example, if you played dithered games as a child, then playing Return of the Obra Dinn, which was mentioned in another thread, will take you back to a happy place.

And even if you did not live at that time, exposure to that distinct visual style will also start having meaning to you. Like how an exposed brick interior wall has a distinct aesthetic, and carries connotations of an industrial space.

badlibrarian•2mo ago
https://en.wikipedia.org/wiki/Atkinson_dithering
imtringued•2mo ago
https://shared.fastly.steamstatic.com/store_item_assets/stea...

https://shared.fastly.steamstatic.com/store_item_assets/stea...

https://store.steampowered.com/app/410970/Master_of_Orion_1/

mytailorisrich•2mo ago
We don't really need dithering anymore because we have high bit-depth colors so its largely just a retro aesthetic now.

I think "retro aesthetic" is quite plain as to what it means. You needed to read on ;)

kqr•2mo ago
I haven't written about this yet but I don't often see it mentioned: dithering has applications outside of image processing. Any time one needs to create a sequence sampled from a distribution, but would like to do so "evenly" without creating lumps, Floyd–Steinberg is a decent candidate.
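
In one dimension the idea collapses to plain error diffusion; a small sketch (mine, not the poster's) that spaces events of rate p as evenly as possible:

    def spread_evenly(p, n):
        # Emit n booleans, True a fraction p of the time, with the
        # accumulated error diffused forward so events never clump.
        acc, out = 0.0, []
        for _ in range(n):
            acc += p
            if acc >= 1.0:
                acc -= 1.0
                out.append(True)
            else:
                out.append(False)
        return out

    print(spread_evenly(0.3, 10))
    # [False, False, False, True, False, False, True, False, False, True]
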
yeasku•2mo ago
Also in audio to downsample to 16 bits.
mordae•2mo ago
Quantize to 16 bits.
a-french-anon•2mo ago
Something that really blew my mind, as someone who didn't study much signal processing: https://www.audiosciencereview.com/forum/index.php?threads/d...

The tl;dr is that dither isn't just for the eyes, it's mathematically needed to preserve information when undergoing quantization.

ant6n•2mo ago
So, uh, why do we need dithering?

I thought the era of 4 bit color had passed.

vardump•2mo ago
Dithering is usually used up to 18-bit color (6 bits each for the red, green, and blue components). But it's useful even past that to reduce banding.
stavros•2mo ago
Is zooming disabled for anyone else? The article depends on me looking at the photos, but they're the size of a MicroSD card on my screen, and zooming in is disabled on this site!
charlie-83•2mo ago
For some good examples, check out what people have achieved on the PlayDate handheld with dithering on its 1-bit screen.
smougel•2mo ago
Just vibe-coded this: https://x.com/smougel/status/1989309185423532120?s=20

Subpixel dithering! 1 bit per channel, which means that what you see is only a 0 or 1 for each channel (R, G, B). By using a Gaussian blur, the result is perceptually very good! X compresses the image a lot, but this is truly 1-bit subpixel on/off.

Akronymus•2mo ago
One thing with dithering writeups that always bothered me: There never seems to be coverage of how to calculate colour similarity when there are multiple different colours involved, rather than just black and white gradients.

I am trying to implement it for myself, but really struggling to find any proper literature on it that I am actually able to understand.

mark-r•2mo ago
The first step is to convert from the RGB color space to something more perceptual, like Lab. I'm sure there's a standard for comparing the color similarity of two images but I'm having trouble remembering the name - too early in the morning I guess.
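
A sketch of that first step (assuming sRGB input, the D65 white point, and plain Euclidean distance in Lab, i.e. the old Delta E 1976 metric):

    import numpy as np

    def srgb_to_lab(rgb):
        # rgb: float array [..., 3] in [0, 1]. Returns CIE Lab (D65).
        rgb = np.asarray(rgb, dtype=np.float64)
        lin = np.where(rgb <= 0.04045, rgb / 12.92,
                       ((rgb + 0.055) / 1.055) ** 2.4)    # undo sRGB curve
        m = np.array([[0.4124, 0.3576, 0.1805],           # linear RGB -> XYZ
                      [0.2126, 0.7152, 0.0722],
                      [0.0193, 0.1192, 0.9505]])
        t = (lin @ m.T) / np.array([0.9505, 1.0, 1.089])  # D65 white
        f = np.where(t > (6 / 29) ** 3, np.cbrt(t),
                     t / (3 * (6 / 29) ** 2) + 4 / 29)
        L = 116 * f[..., 1] - 16
        a = 500 * (f[..., 0] - f[..., 1])
        b = 200 * (f[..., 1] - f[..., 2])
        return np.stack([L, a, b], axis=-1)

    def nearest_palette_index(pixel, palette):
        # Index of the perceptually closest palette entry.
        d = srgb_to_lab(palette) - srgb_to_lab(pixel)
        return int(np.argmin(np.sum(d * d, axis=-1)))
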
mark-r•2mo ago
That was disappointing. I expected from the title that it would cover modern use cases. When's the last time you needed to use a web-safe palette?

It's still useful if you're trying to display a 10-bit-per-channel image on an 8-bit-per-channel display, but the gain isn't nearly as dramatic. And the need doesn't come up very often.

corysama•2mo ago
It's useful in all 8-bit scenarios where you have overlapping transparency. Especially with smooth, dark images.

https://loopit.dk/rendering_inside.pdf

The new Silksong apparently shipped with really bad banding that could have benefited from the techniques in that doc.

magicalhippo•2mo ago
Inside was pretty darn rad, very nice breakdown.

They link to the following thesis:

Optimal Dither and Noise Shaping in Image Processing

http://hdl.handle.net/10012/3867

nixpulvis•2mo ago
With high enough resolution does color depth become less important when you also dither?
corysama•2mo ago
Yep. You are trading spatial resolution for color resolution.