frontpage.

Personalizing esketamine treatment in TRD and TRBD

https://www.frontiersin.org/articles/10.3389/fpsyt.2025.1736114
1•PaulHoule•31s ago•0 comments

SpaceKit.xyz – a browser‑native VM for decentralized compute

https://spacekit.xyz
1•astorrivera•1m ago•1 comments

NotebookLM: The AI that only learns from you

https://byandrev.dev/en/blog/what-is-notebooklm
1•byandrev•1m ago•1 comments

Show HN: An open-source starter kit for developing with Postgres and ClickHouse

https://github.com/ClickHouse/postgres-clickhouse-stack
1•saisrirampur•2m ago•0 comments

Game Boy Advance d-pad capacitor measurements

https://gekkio.fi/blog/2026/game-boy-advance-d-pad-capacitor-measurements/
1•todsacerdoti•2m ago•0 comments

South Korean crypto firm accidentally sends $44B in bitcoins to users

https://www.reuters.com/world/asia-pacific/crypto-firm-accidentally-sends-44-billion-bitcoins-use...
1•layer8•3m ago•0 comments

Apache Poison Fountain

https://gist.github.com/jwakely/a511a5cab5eb36d088ecd1659fcee1d5
1•atomic128•5m ago•1 comments

Web.whatsapp.com appears to be having issues syncing and sending messages

http://web.whatsapp.com
1•sabujp•5m ago•2 comments

Google in Your Terminal

https://gogcli.sh/
1•johlo•6m ago•0 comments

Shannon: Claude Code for Pen Testing: #1 on Github today

https://github.com/KeygraphHQ/shannon
1•hendler•7m ago•0 comments

Anthropic: Latest Claude model finds more than 500 vulnerabilities

https://www.scworld.com/news/anthropic-latest-claude-model-finds-more-than-500-vulnerabilities
1•Bender•11m ago•0 comments

Brooklyn cemetery plans human composting option, stirring interest and debate

https://www.cbsnews.com/newyork/news/brooklyn-green-wood-cemetery-human-composting/
1•geox•11m ago•0 comments

Why the 'Strivers' Are Right

https://greyenlightenment.com/2026/02/03/the-strivers-were-right-all-along/
1•paulpauper•13m ago•0 comments

Brain Dumps as a Literary Form

https://davegriffith.substack.com/p/brain-dumps-as-a-literary-form
1•gmays•13m ago•0 comments

Agentic Coding and the Problem of Oracles

https://epkconsulting.substack.com/p/agentic-coding-and-the-problem-of
1•qingsworkshop•14m ago•0 comments

Malicious packages for dYdX cryptocurrency exchange empty user wallets

https://arstechnica.com/security/2026/02/malicious-packages-for-dydx-cryptocurrency-exchange-empt...
1•Bender•14m ago•0 comments

Show HN: I built a <400ms latency voice agent that runs on a 4GB VRAM GTX 1650

https://github.com/pheonix-delta/axiom-voice-agent
1•shubham-coder•14m ago•0 comments

Penisgate erupts at Olympics; scandal exposes risks of bulking your bulge

https://arstechnica.com/health/2026/02/penisgate-erupts-at-olympics-scandal-exposes-risks-of-bulk...
4•Bender•15m ago•0 comments

Arcan Explained: A browser for different webs

https://arcan-fe.com/2026/01/26/arcan-explained-a-browser-for-different-webs/
1•fanf2•17m ago•0 comments

What did we learn from the AI Village in 2025?

https://theaidigest.org/village/blog/what-we-learned-2025
1•mrkO99•17m ago•0 comments

An open replacement for the IBM 3174 Establishment Controller

https://github.com/lowobservable/oec
1•bri3d•19m ago•0 comments

The P in PGP isn't for pain: encrypting emails in the browser

https://ckardaris.github.io/blog/2026/02/07/encrypted-email.html
2•ckardaris•21m ago•0 comments

Show HN: Mirror Parliament where users vote on top of politicians and draft laws

https://github.com/fokdelafons/lustra
1•fokdelafons•22m ago•1 comments

Ask HN: Opus 4.6 ignoring instructions, how to use 4.5 in Claude Code instead?

1•Chance-Device•23m ago•0 comments

We Mourn Our Craft

https://nolanlawson.com/2026/02/07/we-mourn-our-craft/
1•ColinWright•26m ago•0 comments

Jim Fan calls pixels the ultimate motor controller

https://robotsandstartups.substack.com/p/humanoids-platform-urdf-kitchen-nvidias
1•robotlaunch•30m ago•0 comments

Exploring a Modern SMPTE 2110 Broadcast Truck with My Dad

https://www.jeffgeerling.com/blog/2026/exploring-a-modern-smpte-2110-broadcast-truck-with-my-dad/
1•HotGarbage•30m ago•0 comments

AI UX Playground: Real-world examples of AI interaction design

https://www.aiuxplayground.com/
1•javiercr•31m ago•0 comments

The Field Guide to Design Futures

https://designfutures.guide/
1•andyjohnson0•31m ago•0 comments

The Other Leverage in Software and AI

https://tomtunguz.com/the-other-leverage-in-software-and-ai/
1•gmays•33m ago•0 comments

Better SRGB to Greyscale Conversion

https://30fps.net/pages/better-srgb-to-greyscale/
27•ibobev•3mo ago

Comments

ansgri•3mo ago
The issue with color-to-grayscale conversion for human consumption is that in most cases there is no well-defined ground truth. People don’t see in grayscale, so the appearance-preservation approach doesn’t work, and the source image was most likely heavily color-corrected to match a certain aesthetic. So the problem becomes “preserve as much information, both content and aesthetic, as possible within the constraints of the target grayscale medium”.

The bottom line: use a standardized conversion (like the one described here, just to avoid surprising users) if the images don’t actually matter, a contrast-preserving method if content matters, and edit creatively otherwise.
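
As a concrete reference point, here is a minimal sketch (Python/NumPy, my own illustration, not code from the article) of the standardized route: decode sRGB to linear light, take the Rec. 709 luminance weights, then re-encode to sRGB.

    import numpy as np

    def srgb_to_linear(c):
        """Decode 0..1 sRGB values to linear light (IEC 61966-2-1 transfer curve)."""
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        """Encode linear-light values back to 0..1 sRGB."""
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    def srgb_to_grey(img):
        """img: float array (..., 3) of sRGB values in 0..1. Returns greyscale sRGB."""
        lin = srgb_to_linear(img)
        # Rec. 709 / sRGB luminance weights, applied in linear light
        y = lin @ np.array([0.2126, 0.7152, 0.0722])
        return linear_to_srgb(y)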

ChrisMarshallNY•3mo ago
I used to do a lot of image processing programming.

The basic way to do it is with weighted LUTs. The "poor man's conversion" was to just keep the green channel and toss out the red and blue.

uninformedprior•3mo ago
I ran into this subjectivity in graphics recently. I thought I was doing the "correct" thing by blending in linear space, but it turns out blending in sRGB looks a lot better for certain applications, and that's what most popular applications do.
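
To make the difference concrete, here is a small sketch (Python/NumPy, my own illustration) of a 50/50 blend of black and white done on gamma-encoded sRGB values versus in linear light; the two results differ a lot in perceived brightness, which is why the "correct" choice ends up being application-dependent.

    import numpy as np

    def srgb_to_linear(c):   # IEC 61966-2-1 decode
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):   # IEC 61966-2-1 encode
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    black = np.array([0.0, 0.0, 0.0])   # sRGB, 0..1
    white = np.array([1.0, 1.0, 1.0])
    t = 0.5

    # Blending the gamma-encoded values directly (what many apps do by default)
    gamma_blend = (1 - t) * black + t * white                   # -> 0.5 per channel

    # Blending in linear light, then re-encoding
    linear_blend = linear_to_srgb(
        (1 - t) * srgb_to_linear(black) + t * srgb_to_linear(white)
    )                                                           # -> ~0.735 per channel

    print(gamma_blend, linear_blend)
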
zokier•3mo ago
For blending, Oklab almost always works better than sRGB (whether linear or gamma-encoded).
uninformedprior•3mo ago
It's possible I wasn't specific enough when I said "graphics". Typically I blend in CIELAB when interpolating between colors for visualizations (e.g., data science).

But I'm unaware of rendering engines that do alpha blending in something other than linear or sRGB. Photoshop, for instance, blends in sRGB by default, while renderers that simulate light physically blend in linear RGB (to the best of my knowledge).

It depends on the GPU and the implementation, but I personally would not want to spend the compute on per-pixel CIELAB conversions for blending.
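
For the data-visualization case mentioned above, here is a rough sketch (Python, my own illustration, assuming scikit-image is available) of interpolating between two sRGB colors in CIELAB rather than in raw sRGB:

    import numpy as np
    from skimage.color import rgb2lab, lab2rgb  # assumes scikit-image is installed

    def lab_ramp(c0, c1, n=9):
        """Interpolate between two sRGB colors (0..1 floats) in CIELAB.

        Linear interpolation on L*, a*, b* tends to give a perceptually
        smoother ramp for palettes than interpolating raw sRGB values.
        """
        ends = np.array([[c0, c1]], dtype=float)    # shape (1, 2, 3) "image"
        lab0, lab1 = rgb2lab(ends)[0]
        t = np.linspace(0.0, 1.0, n)[:, None]
        lab = (1 - t) * lab0 + t * lab1             # straight line in Lab
        return lab2rgb(lab[None, :, :])[0]          # back to sRGB (clipped to gamut)

    ramp = lab_ramp([0.9, 0.2, 0.1], [0.1, 0.3, 0.9])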

zokier•3mo ago
> But I'm unaware of rendering engines that do alpha blending in something other than linear or SRGB.

Well, spectral rendering is a thing; it kinda bypasses the problem of color blending for rendering in some cases.

ChrisMarshallNY•3mo ago
The thing that is difficult to "math" is that we perceive color in a certain way (if you ever look at the CIELAB[0] space, it's based on human eye perception). So there's a lot of "it just don't look right" involved.

I have found that taking weighted LUTs extracted from some process (math, context measurements, user testing, etc.) and simply applying them is how you execute the conversion; generating the LUTs is the tricky part. It's not always best handled by a formula. I guess you could really go crazy and generate the LUT on the fly, as a per-pixel conversion (we actually did something like this for RAW conversion).

[0] https://en.wikipedia.org/wiki/CIELAB_color_space
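
A rough sketch of what applying such precomputed per-channel LUTs can look like (Python/NumPy, my own illustration; the table values here are just Rec. 709 weights folded into 8-bit lookups, standing in for whatever a calibration or tuning process actually produced):

    import numpy as np

    # Three 256-entry tables with a channel weight folded into each lookup.
    # In practice these would come from measurement or tuning rather than a
    # formula; here they are seeded with Rec. 709-style weights. Note that
    # this operates on gamma-encoded values (the quick approximation); the
    # tables could just as well bake in the sRGB decode.
    weights = (0.2126, 0.7152, 0.0722)
    luts = [np.round(np.arange(256) * w).astype(np.uint16) for w in weights]

    def grey_via_luts(img_u8):
        """img_u8: uint8 array of shape (H, W, 3). Returns uint8 greyscale (H, W)."""
        r, g, b = img_u8[..., 0], img_u8[..., 1], img_u8[..., 2]
        # Per-pixel work is just three table lookups and two adds.
        acc = luts[0][r] + luts[1][g] + luts[2][b]
        return np.clip(acc, 0, 255).astype(np.uint8)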

Remnant44•3mo ago
I've run into this as well. The problem is that linear RGB is most definitely not a perceptually uniform space, so blending in it frequently does something different from what you want. Use linear for physically based light mixing, but if you are modeling an operation based on human perception, it is going to be completely wrong.

The dark irony, then, is that for blending, sRGB with its gamma curve applied models luminance better (closer to human perception) than linear does. If you can afford to do the blend in a perceptually uniform space like Oklab, even better, of course.
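
For anyone who wants to try the Oklab route, here is a sketch (Python/NumPy, my own illustration) of blending two sRGB colors through Oklab, using the forward matrices from Björn Ottosson's published reference conversion and inverting them numerically; treat the exact coefficients as something to double-check against his write-up.

    import numpy as np

    def srgb_to_linear(c):   # IEC 61966-2-1 decode
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):   # IEC 61966-2-1 encode
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    # Forward matrices from the Oklab reference implementation:
    # linear sRGB -> LMS, and cube-rooted LMS -> Lab.
    M1 = np.array([[0.4122214708, 0.5363325363, 0.0514459929],
                   [0.2119034982, 0.6806995451, 0.1073969566],
                   [0.0883024619, 0.2817188376, 0.6299787005]])
    M2 = np.array([[0.2104542553,  0.7936177850, -0.0040720468],
                   [1.9779984951, -2.4285922050,  0.4505937099],
                   [0.0259040371,  0.7827717662, -0.8086757660]])

    def linear_to_oklab(rgb):
        return np.cbrt(rgb @ M1.T) @ M2.T

    def oklab_to_linear(lab):
        return ((lab @ np.linalg.inv(M2).T) ** 3) @ np.linalg.inv(M1).T

    def blend_oklab(c0, c1, t):
        """Blend two sRGB colors (0..1 floats) with weight t in Oklab space."""
        a = linear_to_oklab(srgb_to_linear(np.asarray(c0, dtype=float)))
        b = linear_to_oklab(srgb_to_linear(np.asarray(c1, dtype=float)))
        mixed = (1 - t) * a + t * b
        lin = np.clip(oklab_to_linear(mixed), 0.0, 1.0)  # clip out-of-gamut values
        return linear_to_srgb(lin)

    print(blend_oklab([1.0, 0.0, 0.0], [0.0, 0.0, 1.0], 0.5))  # red/blue midpoint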

miladyincontrol•3mo ago
Agreed. For similar reasons, a lot of B&W photographers use different color filters to achieve the look they want, rather than just taking the film's native rendition as it is.
kccqzy•3mo ago
I don't do film photography, but even then I often experiment with different grayscale filters to achieve different looks. I remember reading a book a long time ago that recommended different conversions for landscape photography and for portrait photography of faces.
icedshrimp•3mo ago
Since this is all perceptual anyway, go ahead and compare some color formats for yourself. I'll say that, for me, Oklab is the most perceptually accurate at representing lightness.