
HyAB k-means for color quantization

https://30fps.net/pages/hyab-kmeans/
47•ibobev•7mo ago

Comments

refulgentis•7mo ago
Highly recommend Celebi's k-means variant, WSM (weighted sort-means).

It feeds the results from a box cutting quantizer (Wu) into K-Means, giving you deterministic initial clusters and deterministic results. It leverages CIELAB distance to avoid a bunch of computation. I used it for Material 3's dynamic color and it was awesome as it enabled higher cluster counts.
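As a sketch of that seeding scheme: plain Lloyd's k-means becomes fully deterministic once the initial clusters are fixed, so feeding it seeds from a box-cut quantizer gives repeatable palettes. This is a minimal illustration, not Celebi's actual implementation; the Wu quantizer itself is not reproduced here, and `init_centers` stands in for its output:

```python
import numpy as np

def kmeans_quantize(pixels_lab, init_centers, iters=10):
    """Lloyd's k-means over Lab pixels with fixed initial centers.

    pixels_lab: (N, 3) array of Lab coordinates.
    init_centers: (K, 3) seeds, e.g. from a Wu box-cut quantizer;
    fixed seeds make repeated runs produce identical palettes.
    """
    centers = np.asarray(init_centers, dtype=float).copy()
    for _ in range(iters):
        # Squared Euclidean distance in Lab stands in for perceptual difference.
        d2 = ((pixels_lab[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for k in range(len(centers)):
            members = pixels_lab[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return centers, labels
```

Celebi's WSM additionally weights each unique color by its pixel count and uses sorting to prune distance computations; the loop above omits those optimizations.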

mattdesl•7mo ago
Surely this would be even faster, and potentially better, with OKLab? Especially in the context of CIELAB-based distance metrics like CIEDE2000, which are a bit heavy.

My own gripe with box cutting is that perceptual color spaces tend not to have cube-shaped volumes. But they are very fast algorithms.

refulgentis•7mo ago
I am very strongly opinionated on this, but I'm aware this isn't a very serious matter most of the time. Imagine my tongue in cheek, and a smile; i.e., I'm open to discussion:

Oklab is a nightmare in practice - it's not linked to any perceptual color space, but it has the sheen of such in colloquial discussion. It's a singular matmul that is supposed to emulate CAM16 as best as it can.

It reminds me of the initial state of color extraction I walked into at Google, where they were using HSL. That is more obviously wrong, but I submit both suffer from the exact same issue: the verbiage is close enough to the real thing that it obfuscates discussion and keeps people from working with the actual perceptual spaces, where all of a sudden a ton of problems just...go away.

</end rant>

In practice, quantizers are all slow enough at multimegapixel that I downscale - significantly, IIRC I used 96x96 or 112x112. IIRC you could convert all 16M of RGB to CAM16 and L* in 6 seconds, in debug mode, in Dart, transpiled to Javascript in 2021, so I try to advocate for doing things with a proper color space as much as possible, the perf just doesn't matter.

EDIT: Also, I should point out that my goal was to get a completely dynamic color system built, which required mathematically guaranteeing a given contrast ratio for two given lightness values, no matter the hue and chroma, so using a pseudo-perceptual lightness would have completely prevented that.

I do still think it's bad in general, i.e. even for people doing effects on images in realtime. A couple weeks ago I finally got past what I had internally at Google and was able to use appearance modeling (i.e. the AM in CAM16) to do an exquisite UI whose colors change naturally based on the lighting. https://x.com/jpohhhh/status/1937698857879515450

mattdesl•7mo ago
It does a pretty good job at emulating CAM16 with a fraction of the parameters, computational complexity, and processing; it’s no wonder it was adopted by CSS.

I don’t know what you mean by “not being linked to any perceptual color space” - it is derived from CAM16 & CIEDE2000, pretty similar in ethos to other spaces like ITP and the more recently published sUCS.

There’s also tons of discussion about OKLab on the w3c GitHub, and it’s evolved in many ways since the original blog post: improved matrices, a new lightness estimate, OKHSV/OKHSL, and very useful cusp and gamut approximations.

I have a hard time seeing how it’s a nightmare in practice!

refulgentis•7mo ago
Because it is a matmul best-effort approximation of a perceptual color space, not a perceptual one, and in my experience that's a significant difference when deployed and for design. YMMV. :)

I cringe at myself, because it sounds like a nitpick, but it's an extremely significant upgrade in every case.

Most concretely, if I use actual L*, design can use palettes linked to L* and vary hue / colorfulness while meeting any contrast standard.
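The contrast guarantee mentioned above has a compact basis: WCAG-style contrast ratio depends only on relative luminance Y, and CIE L* maps to Y by a fixed formula, so pinning two L* values pins the ratio no matter what hue and chroma do. A sketch using the standard CIE constants (not code from the thread):

```python
def y_from_lstar(lstar):
    """CIE relative luminance Y in [0, 1] from lightness L* in [0, 100]."""
    ft = (lstar + 16.0) / 116.0
    # kappa = 24389/27 from the CIE L* definition; linear segment near black.
    return ft ** 3 if lstar > 8.0 else lstar * 27.0 / 24389.0

def contrast_ratio(lstar1, lstar2):
    """WCAG contrast ratio between two L* values. Hue and chroma never
    enter, which is why fixing lightness fixes the contrast."""
    y1, y2 = y_from_lstar(lstar1), y_from_lstar(lstar2)
    lighter, darker = max(y1, y2), min(y1, y2)
    return (lighter + 0.05) / (darker + 0.05)
```

For example, L* of 100 against 0 yields the maximum ratio of 21:1, and the number is the same for any two colors sharing those lightness values.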

jcelerier•7mo ago
> IIRC you could convert all 16M of RGB to CAM16 and L* in 6 seconds, in debug mode, in Dart, transpiled to Javascript in 2021, so I try to advocate for doing things with a proper color space as much as possible, the perf just doesn't matter.

Coming from the "real-time graphics" world, if I read that something which is going to be a minor part of the whole pipeline takes 6 seconds (or even 600 or 60 ms), it would be instantly disqualified, so I don't really understand why you'd say "the perf just doesn't matter"?

refulgentis•7mo ago
> I don't understand how "the perf just doesn't matter"

Ah, apologies, I don't mean to imply color perf never matters :)

The paragraph is discussing a color quantization algorithm that extracts colors from an image, not color conversion in general. It's very hard to make fast in that situation.

> "a minor part of your whole pipeline would take 6 seconds (or even 600 or 60 ms)"

Ah, apologies for the lack of clarity: you don't need to ever convert the entirety of RGB to CAM16 and L*. :) That's just a rough instructive benchmark I can remember.

If I'm worried about realtime, say, I know I want to convert a 6K* wallpaper with realtime appearance modelling at 120 fps on 2022 Android, I use a shader. 0 perf issues so far. (knock on wood)

* now that I think about it...it's probably at display res, not the original 6K. Maybe 2 megapixel? shrugs

gardaani•7mo ago
> Oklab is a nightmare in practice - it's not linked to any perceptual color space, but it has the sheen of such in colloquial discussion. It's a singular matmul that is supposed to emulate CAM16 as best as it can.

Oklab is perceptually uniform and addresses issues such as unexpected hue and lightness changes in blue colors present in the CIELAB color space. https://en.wikipedia.org/wiki/Oklab_color_space

Oklab is used in CSS because it creates smoother gradients and better gamut mapping of out-of-gamut colors than Lab. Here's a picture how Oklch (on the left) creates smoother gamut mapping than CIE Lch (on the right) ("Explore OKLab gamut mapping in oklch"): https://github.com/w3c/csswg-drafts/issues/9449#issuecomment...
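For concreteness, the transform being argued about is small. Ottosson's published sRGB-to-Oklab conversion is two 3×3 matrices with a componentwise cube root in between (coefficients from the original blog post; input is assumed to be linear-light sRGB, not gamma-encoded):

```python
import numpy as np

# Coefficients from Ottosson's Oklab announcement post.
M1 = np.array([  # linear sRGB -> approximate cone (LMS) response
    [0.4122214708, 0.5363325363, 0.0514459929],
    [0.2119034982, 0.6806995451, 0.1073969566],
    [0.0883024619, 0.2817188376, 0.6299787005],
])
M2 = np.array([  # nonlinear LMS -> Lab-like opponent axes
    [0.2104542553, 0.7936177850, -0.0040720468],
    [1.9779984951, -2.4285922050, 0.4505937099],
    [0.0259040371, 0.7827717662, -0.8086757660],
])

def linear_srgb_to_oklab(rgb):
    """Map linear-light sRGB (components in [0, 1]) to Oklab (L, a, b)."""
    lms = M1 @ np.asarray(rgb, dtype=float)
    return M2 @ np.cbrt(lms)
```

White maps to roughly (1, 0, 0) and black to (0, 0, 0), which is the usual sanity check on the matrices.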

yorwba•7mo ago
Oklab is not perceptually uniform. It's better than other color spaces with equally simple conversion functions, but in the end it was created as a simple approximation to more complex color spaces, so compared to the best you could do, it's merely OK (hence the name).

mattdesl•7mo ago
> Oklab is not perceptually uniform

By what metric? If the target is parity with CAM16-UCS, OKLab comes closer than many color spaces also designed to be perceptually uniform.

yorwba•7mo ago
If the target is parity with CAM16-UCS, CAM16-UCS is best, tautologically. Sure, if you need a fast approximation, by all means fall back to Oklab, but that optimization isn't going to be necessary in all cases.

mattdesl•7mo ago
Obviously; but this doesn’t suggest that OKLab is not a perceptually uniform color space.

There is no “one true” UCS model - all of these are just approximations of various perception and color matching studies, and at some point CAM16-UCS will probably be made obsolete as well.

refulgentis•7mo ago
No offense, but I do find the interlocution here somewhat hard-headed.

In a sentence, color science is a science.

The words you are using have technical meanings.

When we say "Oklab isn't a perceptually accurate color system," we are not saying "it is bad." We are saying "it is a singular matmul that is meant to imitate a perceptually accurate color system," and that really matters, really: Google wouldn't have launched Material 3 dynamic color if we had just gone in on that.

The goal was singular matmul. Not perceptual accuracy.

Let me give you another tell that something is really off, one you'll understand intuitively.

People love quoting back the Oklab blog post; you'll also see in a sibling comment something about gradients and CAM16-UCS.

The author took two colors across the color wheel, blue and yellow, then claimed that because the CAM16-UCS gradient has gray in it, Oklab is better.

That's an absolutely insane claim.

Blue and yellow are across the color wheel from each other.

Therefore, a linear gradient between the two has to pass through the center of the color wheel.

Therefore a gradient, i.e. a lerp, will have gray in it; if it didn't, that would be really weird and would indicate some sort of fundamental issue with the color modeling.

So of course, Oklab doesn't have gray in the blue-yellow gradient, and this is written up as a good quality.

If they had known what they were talking about at the time, they wouldn't have done a lerp for the CAM16-UCS gradient; they would have used the standard CSS gradient technique of "rotating" to the new point.

Because that's how you avoid gray.

Not making up a new color space, writing it up with a ton of misinfo, then leaving it up without clarification, so that otherwise-smart people end up completely confused for years, repeating either the blog post or "nothing's perfect" ad nauseam as an excuse to never engage with anything past it. They walk away with the mistaken understanding that a singular matmul somehow magically blew up 50 years of color science.

I just hope this era passes within my lifetime. HSL was a tragedy. This will be worse if it leaves actual color science as some sort of fringe, slow thing in people's heads.
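To make the gradient argument concrete, here is a generic sketch (not tied to any particular color space) of the two interpolation strategies: a straight lerp in rectangular a/b coordinates, which passes near gray for opposing hues, versus interpolating hue as an angle in an LCh-style representation, which preserves chroma:

```python
import math

def lerp_ab(c1, c2, t):
    """Straight line in (L, a, b): opposing hues pass near gray (a = b = 0)."""
    return tuple((1 - t) * x + t * y for x, y in zip(c1, c2))

def lerp_lch(c1, c2, t):
    """Interpolate lightness and chroma linearly and hue as an angle,
    taking the shorter arc, so chroma never collapses toward gray."""
    (L1, C1, h1), (L2, C2, h2) = c1, c2
    dh = ((h2 - h1 + 180.0) % 360.0) - 180.0  # signed shortest hue rotation
    return ((1 - t) * L1 + t * L2,
            (1 - t) * C1 + t * C2,
            (h1 + t * dh) % 360.0)
```

With hue-angle interpolation the midpoint keeps its chroma; which arc to take for exactly opposing hues is a separate policy choice.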

mattdesl•7mo ago
Yes, it's a matmul; many color models just boil down to simple math. For example, look at Li and Luo's 2024 "simple color appearance model"[1], which is very similar to OKLab (just matmul!), and created for many of the same reasons (just an approximation!). Like OKLab, it also improves upon CAM16-UCS hue linearity issues in blue. Ironically, Luo was one of the authors who proposed CAM16-UCS in 2017. And, although it certainly improves upon CAM16-UCS for many applications, I'm not yet convinced it is superior to OKLab (you can see my implementation here: [2]).

And I think you might be mis-remembering Ottosson's original blog post; he demonstrates a gradient between white and blue, not blue and yellow.

[1] https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100

[2] https://github.com/texel-org/color/blob/main/test/spaces/sim...

refulgentis•7mo ago
> Yes, it's a matmul; many color models just boil down to simple math.

All do. :)

> For example, look at Li and Luo's 2024 "simple color appearance model"[1], which is very similar to OKLab (just matmul!), and created for many of the same reasons (just an approximation!)

I don't understand what this shows me. I don't see anyone in the thread arguing there can only be one color model with one matmul. I feel self-conscious that I'm missing something, so I thought maybe the implication is that this is a real scientist working on a real space, and therefore our haranguing about "actual" perceptual spaces is hair-splitting, since we see a color scientist making an approximation. But you also note that it is an approximation, as does the paper, so I don't think that's the case...idk :(

> Like OKLab, it also improves upon CAM16-UCS hue linearity issues in blue. Ironically, Luo was one of the authors who proposed CAM16-UCS in 2017.

What's the ironic part? (my understanding: you read this as a competition, so you find it ironic, in the colloquial sense, that the color space you perceive us advocating for, or that Oklab can replace, was created by someone who made a singular matmul type-space like Oklab in a paper?)

> I'm not yet convinced it is superior to OKLab (you can see my implementation here: [2]).

I appreciate your work and desire here, and you have a fiery curiosity. In practice, color science uses UCS spaces to measure color difference, not to render colors. (He uses CAM16-UCS and CAM16 interchangeably as well, which is confusing.)

> And I think you might be mis-remembering Ottosson's original blog post; he demonstrates a gradient between white and blue, not blue and yellow.

You're right! That makes it a whole lot less obvious that there's something wrong. :( Here, the sin is throwing away the whole science bit and saying that's fine, look at this example.

See gradients here. https://m3.material.io/blog/science-of-color-design

Note particularly the black-and-white one. It gives a great sense of how much of an outlier Oklab is, and you can't fuck around with lightness that much; that's how you measure contrast.

refulgentis•7mo ago
> Oklab is perceptually uniform and addresses issues such as unexpected hue and lightness changes in blue colors present in the CIELAB color space. https://en.wikipedia.org/wiki/Oklab_color_space

This isn't true. Oklab is a singular matmul meant to approximate a perceptually accurate color space.

> Oklab is used in CSS because it creates smoother gradients and better gamut mapping of out-of-gamut colors than Lab.

That's not true, at all. Not even wrong. Gamut mapping is separate from color space.

> Here's a picture how Oklch (on the left) creates smoother gamut mapping than CIE Lch (on the right)

I love the guy who wrote this but we have an odd relationship, I'd have people tell me all the time he wondered why I wasn't reaching out to him, and we've never met, he's never contacted me, etc.

If you're him, we should talk sometime.

I doubt you're him, because you're gravely misunderstanding the diagram and work there. They're comparing gamut mapping algorithms, not comparing color spaces, and what is being discussed is gamut mapping, not color spaces.

mattdesl•7mo ago
I’ve done some color quantization tests with HyAB and OKLab on this same image. A couple notes:

- what works well for this image might not work well for other images! I learned the hard way after lots of testing on this image, only to find things that did not generalize well.

- parametrizing the AB-plane weight is pretty useful for color quantization; I’ve found some images are best with more weight given to colour, and other images need more weight given to tone. The OKLab creator suggests a factor of 2 in deltaEOK [1], but again, this is something that should be adjustable IMHO.

- there are other interesting and efficient color spaces, the (poorly named) sUCS and sCAM [2], which boast impressive results in their paper for tasks like this, although in my brief tests [3] I’ve found them not much better than OKLab for my needs (and note, both color spaces are derived using CIEDE2000)

[1] https://github.com/color-js/color.js/blob/9d812464aa318a9b47...

[2] https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100&id=5...

[3] https://x.com/mattdesl/status/1902699888057446670
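The adjustable lightness-versus-chroma weighting described in the first bullet can be sketched as a parametrized HyAB distance. HyAB, as in the article, is city-block in lightness plus Euclidean in the a/b plane; `ab_weight` here is a hypothetical tuning factor (with 2.0 loosely corresponding to the deltaEOK suggestion mentioned above):

```python
import math

def hyab(c1, c2, ab_weight=1.0):
    """Weighted hybrid distance between two (L, a, b) colors:
    absolute difference in lightness plus Euclidean distance
    in the a-b plane. ab_weight > 1 favors hue/chroma fidelity;
    ab_weight < 1 favors tonal fidelity."""
    dL = abs(c1[0] - c2[0])
    dab = math.hypot(c1[1] - c2[1], c1[2] - c2[2])
    return dL + ab_weight * dab
```

Sweeping `ab_weight` per image, as the comment suggests, just rescales the chromatic term before the k-means assignment step.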