frontpage.

We Mourn Our Craft

https://nolanlawson.com/2026/02/07/we-mourn-our-craft/
119•ColinWright•1h ago•87 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
22•surprisetalk•1h ago•24 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
121•AlexeyBrin•7h ago•24 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
62•vinhnx•5h ago•7 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
828•klaussilveira•21h ago•249 comments

U.S. Jobs Disappear at Fastest January Pace Since Great Recession

https://www.forbes.com/sites/mikestunson/2026/02/05/us-jobs-disappear-at-fastest-january-pace-sin...
119•alephnerd•2h ago•78 comments

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
55•thelok•3h ago•7 comments

Brookhaven Lab's RHIC Concludes 25-Year Run with Final Collisions

https://www.hpcwire.com/off-the-wire/brookhaven-labs-rhic-concludes-25-year-run-with-final-collis...
4•gnufx•39m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
108•1vuio0pswjnm7•8h ago•138 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
1059•xnx•1d ago•611 comments

Reinforcement Learning from Human Feedback

https://rlhfbook.com/
76•onurkanbkrc•6h ago•5 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
484•theblazehen•2d ago•175 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
8•valyala•2h ago•1 comment

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
9•valyala•2h ago•0 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
209•jesperordrup•12h ago•70 comments

France's homegrown open source online office suite

https://github.com/suitenumerique
558•nar001•6h ago•256 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
222•alainrk•6h ago•343 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
36•rbanffy•4d ago•7 comments

Selection Rather Than Prediction

https://voratiq.com/blog/selection-rather-than-prediction/
8•languid-photic•3d ago•1 comment

History and Timeline of the Proco Rat Pedal (2021)

https://web.archive.org/web/20211030011207/https://thejhsshow.com/articles/history-and-timeline-o...
19•brudgers•5d ago•4 comments

72M Points of Interest

https://tech.marksblogg.com/overture-places-pois.html
29•marklit•5d ago•2 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
114•videotopia•4d ago•31 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
76•speckx•4d ago•75 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
6•momciloo•2h ago•0 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
273•isitcontent•22h ago•38 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
201•limoce•4d ago•111 comments

Show HN: Kappal – CLI to Run Docker Compose YML on Kubernetes for Local Dev

https://github.com/sandys/kappal
22•sandGorgon•2d ago•11 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
286•dmpetrov•22h ago•153 comments

Making geo joins faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
155•matheusalmeida•2d ago•48 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
71•mellosouls•4h ago•75 comments

Dithering – Part 2: The Ordered Dithering

https://visualrambling.space/dithering-part-2/
256•ChrisArchitect•1w ago

Comments

ChrisArchitect•1w ago
Related:

Dithering - Part 1

https://news.ycombinator.com/item?id=45750954

VinLucero•1w ago
Thank you
csressel•1w ago
first post was great, this should be interesting!
subprotocol•1w ago
In Chrome it says "Loading assets, please wait..." and hangs, but it works for me in Firefox.
jonahx•1w ago
This is really nice work, as are the other posts.

If the author stops by, I'd be interested to hear about the tech used.

ggambetta•1w ago
I used ordered dithering in my ZX Spectrum raytracer (https://gabrielgambetta.com/zx-raytracer.html#fourth-iterati...). In this case it's applied to a color image, but since every 8x8-pixel block can only have one of two colors (one of these fun limitations of the Spectrum), it's effectively monochrome dithering.
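For readers who want the shape of the technique being described, here is a minimal ordered-dithering sketch: build a Bayer threshold matrix recursively and compare each pixel against the tiled map. This is not the raytracer's code; the function names and structure are mine.

```python
# Minimal ordered-dithering sketch (illustrative, not from the raytracer).
import numpy as np

def bayer_matrix(n: int) -> np.ndarray:
    """Return an n x n Bayer matrix (n must be a power of two)."""
    m = np.array([[0, 2],
                  [3, 1]])
    while m.shape[0] < n:
        # Standard recursive construction: each level quadruples the values
        # and offsets the four quadrants.
        m = np.block([[4 * m,     4 * m + 2],
                      [4 * m + 3, 4 * m + 1]])
    return m

def ordered_dither(gray: np.ndarray, n: int = 8) -> np.ndarray:
    """gray: float image in [0, 1]. Returns a 0/1 monochrome image."""
    thresholds = (bayer_matrix(n) + 0.5) / (n * n)
    h, w = gray.shape
    tiled = np.tile(thresholds, (h // n + 1, w // n + 1))[:h, :w]
    return (gray > tiled).astype(np.uint8)
```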
onion2k•1w ago
Spectrum Basic was my first programming language, so that gives me all sorts of nostalgia feels. Your work is awesome.
a_shovel•1w ago
Bayer dithering in particular is part of the signature look of Flipnote Studio animations, which you may recognize from animators like kekeflipnote (e.g. https://youtu.be/Ut-fJCc0zS4)
spicyjpeg•1w ago
Bayer dithering was also employed heavily on the original PlayStation. The PS1's GPU was capable of Gouraud shading with 24-bit color precision, but the limited capacity (1 MB) and bandwidth of VRAM made it preferable to use 16-bit framebuffers and textures. In an attempt to make the resulting color bands less noticeable, Sony thus added the ability to dither pixels written to the framebuffer on-the-fly using a 4x4 Bayer matrix hardcoded in the GPU [1]. On a period-accurate CRT TV using a cheap composite video cable, the picture would get blurred enough to hide away the dithering artifacts; obviously an emulator or a modern LCD TV will quickly reveal them, resulting in a distinct grainy look that is often replicated in modern "PS1-style" indie games.

Interestingly enough, despite the GPU being completely incapable of "true" 24-bit rendering, Sony decided to ship the PS1 with a 24-bit video DAC and the ability to display 24-bit framebuffers regardless. This ended up being used mainly for title screens and video playback, as the PS1's hardware MJPEG decoder retained support for 24-bit output.

[1]: https://psx-spx.consoledev.net/graphicsprocessingunitgpu/#24...
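A rough sketch of what's described above, assuming the signed 4x4 offset table documented at psx-spx (the exact hardcoded values live in the linked page, so treat the constants here as illustrative): add the offset to each 8-bit channel, clamp, then truncate to 5 bits per channel.

```python
# Sketch of PS1-style dithering (not actual GPU code): add a signed 4x4
# offset to each 8-bit channel, clamp, then keep the top 5 bits (RGB555).
import numpy as np

# Offset table as documented at psx-spx; values here are an assumption.
PS1_DITHER = np.array([[-4,  0, -3,  1],
                       [ 2, -2,  3, -1],
                       [-3,  1, -4,  0],
                       [ 3, -1,  2, -2]])

def dither_to_rgb555(rgb888: np.ndarray) -> np.ndarray:
    """rgb888: (h, w, 3) uint8 image. Returns (h, w, 3) 5-bit channel values."""
    h, w, _ = rgb888.shape
    offsets = np.tile(PS1_DITHER, (h // 4 + 1, w // 4 + 1))[:h, :w]
    dithered = np.clip(rgb888.astype(int) + offsets[..., None], 0, 255)
    return (dithered >> 3).astype(np.uint8)  # truncate to 5 bits per channel
```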

PMunch•1w ago
Just did a bit of a deep dive into dithering myself, for my project of creating an epaper laptop: https://peterme.net/building-an-epaper-laptop-dithering.html It compares error-diffusion algorithms as well as Bayer, blue noise, and some more novel approaches. Just in case anyone wants to read a lot more about dithering!
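For contrast with the ordered approach in the article, a minimal sketch of one classic error-diffusion kernel, Floyd–Steinberg. This is illustrative only, not code from the linked post; a real implementation would add serpentine scanning and other refinements.

```python
# Minimal Floyd–Steinberg error diffusion (illustrative).
import numpy as np

def floyd_steinberg(gray: np.ndarray) -> np.ndarray:
    """gray: float image in [0, 1]. Returns a 0/1 monochrome image."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = int(new)
            err = old - new
            # Push the quantization error onto unvisited neighbors.
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h and x > 0:
                img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:
                img[y + 1, x] += err * 5 / 16
            if y + 1 < h and x + 1 < w:
                img[y + 1, x + 1] += err * 1 / 16
    return out
```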
quag•1w ago
After implementing a number of dithering approaches, including blue noise and the three line approach used in modern games, I’ve found that quasi random sequences give the best results. Have you tried them out?

https://extremelearning.com.au/unreasonable-effectiveness-of...
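My reading of the linked article is that the quasirandom dither uses the R2 low-discrepancy sequence as a per-pixel threshold map; here is a sketch under that assumption, with the plastic-number constant as given there. This is a paraphrase for illustration, not necessarily the exact method referred to above.

```python
# Sketch of an R2-sequence threshold map (my reading of the linked article).
import numpy as np

# Plastic number: the real root of x^3 = x + 1.
G = 1.32471795724474602596
A1, A2 = 1.0 / G, 1.0 / (G * G)

def r2_threshold(h: int, w: int) -> np.ndarray:
    """Per-pixel thresholds in [0, 1) derived from the R2 sequence."""
    ys, xs = np.mgrid[0:h, 0:w]
    return (0.5 + A1 * xs + A2 * ys) % 1.0

def r2_dither(gray: np.ndarray) -> np.ndarray:
    """gray: float image in [0, 1]. Returns a 0/1 monochrome image."""
    return (gray > r2_threshold(*gray.shape)).astype(np.uint8)
```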

leguminous•1w ago
What is the advantage over blue noise? I've had very good results with a 64x64 blue noise texture and it's pretty fast on a modern GPU. Are quasirandom sequences faster or better quality?

(There's no TAA in my use case, so there's no advantage for interleaved gradient noise there.)

EDIT: Actually, I remember trying R2 sequences for dither. I didn't think it looked much better than interleaved gradient noise, but my bigger problem was figuring out how to add a temporal component. I tried generalizing it to 3 dimensions, but the result wasn't great. I also tried shifting it around, but I thought animated interleaved gradient noise still looked better. This was my shadertoy: https://www.shadertoy.com/view/33cXzM
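For reference, interleaved gradient noise is usually written as Jorge Jimenez's one-line formula; below is a sketch with the commonly quoted constants (double-check against the original presentation), plus one simple per-frame offset for the temporal part. The offset constant is one I have seen used elsewhere and is an assumption, not something taken from this thread.

```python
# Interleaved gradient noise as a threshold map (constants as commonly quoted).
import numpy as np

def interleaved_gradient_noise(h: int, w: int, frame: int = 0) -> np.ndarray:
    """Per-pixel thresholds in [0, 1); shift the pattern each frame to animate."""
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xs = xs + 5.588238 * (frame % 64)  # simple temporal decorrelation (assumption)
    return (52.9829189 * ((0.06711056 * xs + 0.00583715 * ys) % 1.0)) % 1.0
```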

PMunch•1w ago
Ooh, I haven't actually! I'll need to implement and test this for sure. Looking at the results, though, it does remind me of "a dither" (https://pippin.gimp.org/a_dither/), which I guess makes sense since they are created in a broadly similar way.
PMunch•1w ago
Just had a look at this and here is the result for the test image: https://uploads.peterme.net/test-image_qr.png.

Looks pretty good! It looks a bit like "a dither", but with fewer artifacts. Definitely a "sharper" look than blue noise, but in places like the transitions between the text boxes you can see a few more artifacts (it almost looks like the boxes have staggered edges).

Thanks for bringing this to my attention!

storystarling•1w ago
Nice writeup. I've been looking at this for a print-on-demand project and found that physical ink bleed changes the constraints quite a bit compared to e-paper. In my experience error diffusion often gets muddy due to dot gain, whereas ordered dithering seems to handle the physical expansion of the ink better.
robinsonb5•1w ago
> In my experience error diffusion often gets muddy due to dot gain

Absolutely - there's a reason why traditional litho printing uses a clustered dot screen (dots at a constant pitch with varying size).

I've spent some time tinkering with FPGAs and been interested by the parallels between two-dimensional halftoning of graphics and the various approaches to doing audio output with a 1-bit IO pin: pulse width modulation (largely analogous to the traditional printer's dot screen) seems to cope better with imperfections in filters and asymmetries in output drivers than pulse density modulation (analogous to error diffusion dithers).
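The analogy maps onto a toy comparison like the following (pure illustration, nothing FPGA-specific): PWM emits fixed-period pulses of varying width, much like a clustered dot screen, while a first-order sigma-delta modulator carries the quantization error forward, much like an error-diffusion dither.

```python
# Toy 1-bit output comparison: PWM vs first-order sigma-delta (PDM).
# Both turn a level in [0, 1] into a bitstream whose average matches the level.

def pwm(level: float, period: int, n: int) -> list[int]:
    """Fixed-period pulses of varying width (the 'clustered dot screen' of audio)."""
    high = round(level * period)
    return [1 if (i % period) < high else 0 for i in range(n)]

def sigma_delta(level: float, n: int) -> list[int]:
    """First-order sigma-delta: carry the quantization error forward each sample."""
    acc, out = 0.0, []
    for _ in range(n):
        acc += level
        bit = 1 if acc >= 1.0 else 0
        acc -= bit
        out.append(bit)
    return out

print(pwm(0.3, 10, 20))      # ones clustered at the start of each period
print(sigma_delta(0.3, 20))  # ones spread as evenly as possible
```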

zozbot234•1w ago
Traditional litho actually uses either lines in curved crosshatch patterns or irregular stippling. Might be doable using an altered error-diffusion approach that rewards tracing a clearly defined line as opposed to placing individual dots or blots.
PMunch•1w ago
Thanks! I would imagine printing on paper is a completely different ball game. I considered scanning the actual epaper display to show each of the dithering techniques in its intended environment, as it does change the look quite a bit. From the little I know about typography and things like ink wells, I can definitely see how certain algorithms could change the result quite significantly. The original post here has a pattern which looks similar to old newspapers; maybe that's worth looking into?
shultays•1w ago
I had a project with those 7 colour e-paper displays and used dithering and it looked amazing. Crazy how much you could fake with just 7 colours and dithering
PMunch•1w ago
Definitely. I've been trying out a lot of dithering algorithms, and while they differ a lot with only black and white, as soon as you start adding more shades of grey they all look pretty much exactly the same as the input image. I'd imagine good dithering with colours would look amazing.
mblode•1w ago
I built a blue noise generator and dithering library in Rust and TypeScript. It generates blue noise textures and applies blue noise dithering to images. There’s a small web demo to try it out [1]. The code is open source [2] [3]

[1] https://blue-noise.blode.co [2] https://github.com/mblode/blue-noise-rust [3] https://github.com/mblode/blue-noise-typescript

ivanjermakov•1w ago
There is something very satisfying about viewing media at 100% of your screen's resolution. Every pixel is crisp and plays a role. It's a joy you don't get from watching videos or viewing scaled images.
Fraterkes•1w ago
Half the posts here are people promoting their own projects without even mentioning the (really impressive) OP. Bit weird
treavorpasan•1w ago
When you look at something like Pietà by Michelangelo or Lolita by Vladimir Nabokov, you realise that some humans are given abilities that far exceed your own and that you will never reach their level.

When this happens, you need to stop and appreciate the sheer genius of the creator.

This is one of those posts.

Fraterkes•1w ago
I don’t know about all that, I’m just saying I thought people were being a bit rude
jasonjmcghee•1w ago
Is it self-promotion or just "hey cool I care enough about this I built something too"

It's ok for people to get excited about shared passions

augusteo•1w ago
Bookmarking this. Clear explanations of graphics algorithms are surprisingly rare.
AndrewStephens•1w ago
Normally I am not a fan of gimmicky page formats but this series really hits it out of the park with well-considered presentation.

I can't wait until the next installment on error diffusion. I still think Atkinson dithering looks great, so much so that I made a web component to dither images.
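Atkinson's kernel diffuses only 6/8 of the error, to six neighbors, which is what gives it that bright, high-contrast look. A quick sketch for anyone curious; this is illustrative Python, not the web component mentioned above.

```python
# Atkinson dithering sketch (illustrative): only 6/8 of the error is diffused,
# to six neighbors, which lightens the output compared to Floyd–Steinberg.
import numpy as np

ATKINSON_NEIGHBORS = [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)]  # (dx, dy)

def atkinson(gray: np.ndarray) -> np.ndarray:
    """gray: float image in [0, 1]. Returns a 0/1 monochrome image."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            out[y, x] = int(new)
            err = (img[y, x] - new) / 8.0
            for dx, dy in ATKINSON_NEIGHBORS:
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    img[ny, nx] += err
    return out
```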

ginko•1w ago
I find these sites that try to feed you stuff at a bite-sized pace extremely disrespectful.
Sunspark•1w ago
I agree many meetings could be an email.

Look at it this way though, this site is low-key a CV portfolio piece because he isn't just writing about dithering, he's demonstrating that he can research, analyze and then both code and create a site at a level most vibers cannot.

haritha-j•1w ago
No better quantum for education than bite-sized, I think.
ivanesmantovich•1w ago
I’ve created a VS Code theme inspired by dithering/halftone techniques, maybe you’ll like it! I’d really appreciate any feedback:

https://github.com/ivanesmantovich/halftone-theme-vsc

kleiba•1w ago
Another interesting read: how Lucas Pope did dithering for moving game scenes in his indie game "Return of the Obra Dinn": https://forums.tigsource.com/index.php?topic=40832.msg136374...
austinthetaco•1w ago
Outside of being informative in a really fun way (I learned far more in a couple of minutes than I thought I would), that website is stunning. I've been a web dev for over 10 years and I'm still baffled at how people make sites like this. Does anyone have any info or resources on how to go about making these sorts of transitional 3D sites, beyond just "learn threejs"?