frontpage.

Rescuing two PDP-11s from a former British Telecom underground shelter (2023)

https://forum.vcfed.org/index.php?threads/rescuing-two-pdp-11-systems-in-uk-from-a-former-big-british-telecom-underground-shelter-in-central-london.1244723/page-2
34•mhh__•2h ago•5 comments

Extending Emacs with Fennel (2024)

https://andreyor.st/posts/2024-12-20-extending-emacs-with-fennel/
29•Bogdanp•2h ago•1 comment

Qwen3-Coder: Agentic coding in the world

https://qwenlm.github.io/blog/qwen3-coder/
528•danielhanchen•10h ago•177 comments

Mathematics for Computer Science (2024)

https://ocw.mit.edu/courses/6-1200j-mathematics-for-computer-science-spring-2024/
83•vismit2000•4h ago•10 comments

When Is WebAssembly Going to Get DOM Support?

https://queue.acm.org/detail.cfm?id=3746174
27•jazzypants•2h ago•8 comments

Show HN: WTFfmpeg – Natural Language to FFmpeg Translator

https://github.com/scottvr/wtffmpeg
47•ycombiredd•4h ago•27 comments

Depot (YC W23) Is Hiring a Technical Content Writer (Remote)

https://www.ycombinator.com/companies/depot/jobs/BzrfAzP-technical-content-writer
1•jacobwg•29m ago

Org tutorials

https://orgmode.org/worg/org-tutorials/index.html
58•dargscisyhp•4h ago•11 comments

More than you wanted to know about how Game Boy cartridges work

https://abc.decontextualize.com/more-than-you-wanted-to-know/
288•todsacerdoti•12h ago•30 comments

Countries across the world see food price shocks from climate extremes

https://www.bsc.es/news/bsc-news/countries-across-the-world-see-food-price-shocks-climate-extremes-research-involving-bsc-shows
52•littlexsparkee•4h ago•26 comments

Why you can't color calibrate deep space photos

https://maurycyz.com/misc/cc/
119•LorenDB•7h ago•54 comments

Android Earthquake Alerts: A global system for early warning

https://research.google/blog/android-earthquake-alerts-a-global-system-for-early-warning/
253•michaefe•13h ago•79 comments

Algorithms for Modern Processor Architectures

https://lemire.github.io/talks/2025/sea/sea2025.html
158•matt_d•9h ago•17 comments

Managing EFI boot loaders for Linux: Controlling secure boot (2015)

https://www.rodsbooks.com/efi-bootloaders/controlling-sb.html
27•CaliforniaKarl•3d ago•0 comments

Swift-erlang-actor-system

https://forums.swift.org/t/introducing-swift-erlang-actor-system/81248
273•todsacerdoti•13h ago•56 comments

AI coding agents are removing programming language barriers

https://railsatscale.com/2025-07-19-ai-coding-agents-are-removing-programming-language-barriers/
56•Bogdanp•4h ago•42 comments

We built an air-gapped Jira alternative for regulated industries

https://plane.so/blog/everything-you-need-to-know-about-plane-air-gapped
209•viharkurama•12h ago•128 comments

I watched Gemini CLI hallucinate and delete my files

https://anuraag2601.github.io/gemini_cli_disaster.html
178•anuraag2601•13h ago•192 comments

AI groups spend to replace low-cost 'data labellers' with high-paid experts

https://www.ft.com/content/e17647f0-4c3b-49b4-a031-b56158bbb3b8
8•eisa01•3d ago•1 comment

Don't animate height

https://www.granola.ai/blog/dont-animate-height
368•birdculture•3d ago•209 comments

Fourier lightfield multiview stereoscope for large field-of-view 3D imaging

https://www.spiedigitallibrary.org/journals/advanced-photonics-nexus/volume-4/issue-04/046008/Fourier-lightfield-multiview-stereoscope-for-large-field-of-view-3D/10.1117/1.APN.4.4.046008.full
7•PaulHoule•2d ago•0 comments

Subliminal learning: Models transmit behaviors via hidden signals in data

https://alignment.anthropic.com/2025/subliminal-learning/
156•treebrained•14h ago•35 comments

TODOs aren't for doing

https://sophiebits.com/2025/07/21/todos-arent-for-doing
337•todsacerdoti•18h ago•197 comments

TapTrap: Animation‑Driven Tapjacking on Android

https://taptrap.click/
54•Bogdanp•8h ago•8 comments

Show HN: A word of the day that doesn't suck

47•jsomers•20h ago•20 comments

Gemini North telescope discovers long-predicted stellar companion of Betelgeuse

https://www.science.org/content/article/betelgeuse-s-long-predicted-stellar-companion-may-have-been-found-last
124•layer8•15h ago•30 comments

Font Comparison: Atkinson Hyperlegible Mono vs. JetBrains Mono and Fira Code

https://www.anthes.is/font-comparison-review-atkinson-hyperlegible-mono.html
205•maybebyte•17h ago•134 comments

Project Lyra – Exploring Interstellar Objects

https://i4is.org/what-we-do/technical/project-lyra/
8•andsoitis•3h ago•0 comments

Many lung cancers are now in nonsmokers

https://www.nytimes.com/2025/07/22/well/lung-cancer-nonsmokers.html
151•alexcos•16h ago•191 comments

Show HN: Phind.design – Image editor & design tool powered by 4o / custom models

https://phind.design
56•rushingcreek•14h ago•16 comments

Why you can't color calibrate deep space photos

https://maurycyz.com/misc/cc/
119•LorenDB•7h ago

Comments

Retr0id•7h ago
The next space mission should be to leave a colour calibration chart on the moon.
jofer•6h ago
The moon itself already is one. Moonshots are widely used in calibration, at least for earth observation satellites. The brightness of the full moon at each wavelength at each day of the year is predictable and well-known, so it makes a good target to check your payload against.
embedded_hiker•6h ago
They brought a gnomon, with a color chart, on the Apollo missions. They would set it up for many of the pictures of samples.

https://airandspace.si.edu/collection-objects/gnomon-lunar-a...

shagie•5h ago
They also put color calibration charts on Mars rovers. For example https://www.lucideon.com/news/colour-standards-on-mars
JNRowe•1h ago
There is even a Damien Hirst¹ haphazardly spread about the surface for that purpose.

One of the great gifts Pillinger² had was being able to shake up public interest via pop culture; there was also a call sign by Blur for Beagle 2.

¹ https://www.researchgate.net/figure/Spot-Painting-Beagle-2-C...

² https://en.wikipedia.org/wiki/Colin_Pillinger

pgreenwood•5h ago
Here's a shot of a color chart on the moon from Apollo 17 (AS17-137-20900):

https://tothemoon.im-ldi.com/data_a70/AS17/extra/AS17-137-20...

kurthr•7h ago
What's the white point? Is it D65? Not when the sun isn't out.
klysm•6h ago
I've always been confused by what the white point actually _means_. Since we are dealing with strictly emissive sources here, and not reflected sunlight, does the whitepoint even mean anything?
esafak•5h ago
In a scene lit overwhelmingly by one approximately Planckian light source, the white point is the color of the closest Planckian light source.

If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
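For intuition, a Planckian white point can be sketched numerically: evaluate Planck's law at a given temperature and weight the resulting spectrum by RGB sensitivity curves. The Gaussian sensitivities below are illustrative stand-ins, not the real CIE color-matching functions:

```python
import numpy as np

def planck(wavelength_nm, temp_k):
    """Planck's law: blackbody spectral radiance (arbitrary units are fine,
    since a white point is only defined up to overall brightness)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    lam = wavelength_nm * 1e-9
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * temp_k))

def rgb_white_point(temp_k):
    """Integrate a blackbody spectrum against crude Gaussian stand-ins for
    RGB sensitivities (an assumption, not the real CIE functions)."""
    lam = np.arange(380.0, 781.0, 1.0)
    spectrum = planck(lam, temp_k)
    r = np.exp(-0.5 * ((lam - 600) / 40) ** 2)
    g = np.exp(-0.5 * ((lam - 550) / 40) ** 2)
    b = np.exp(-0.5 * ((lam - 450) / 40) ** 2)
    rgb = np.array([np.sum(spectrum * f) for f in (r, g, b)])
    return rgb / rgb.max()  # normalize so the largest channel is 1.0

# A ~6500 K blackbody lands near daylight; 3000 K skews strongly red,
# 10000 K skews blue.
```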

klysm•4h ago
So in this case there is no sensible white point since there is no illuminant right?
esafak•3h ago
I'm not sure which case we're talking about, but if it emits visible light it is an illuminant.
bhouston•6h ago
> Many other cameras, particularly those with aggressive UV-IR cut filters, under-respond to H-a, resulting in dim and blueish nebula. Often people rip out those filters (astro-modification), but this usually results in the camera over-responding instead.

Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:

https://www.zwoastro.com/product-category/cameras/dso_cooled...

They also generally do not use sensors that have Bayer filters. This also screws things up.

Instead they use monochromatic sensors with narrowband filters (either one band or multiple) over them keyed to specific celestial emissions. The reason for this is that it gets rid of light pollution that is extensive and bumps up the signal to noise for the celestial items, especially the small faint details. Stuff like this:

https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...

https://telescopescanada.ca/products/zwo-duo-band-filter

Often these are combined with a true color capture (or individual RGBL narrowband) just to get the stars coloured properly.

Almost everything you see in high end astrophotography is false color, because they map these individual narrowband captures on the monochrome sensors to interesting colours, often spending a lot of time manipulating the individual channels.

This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/

The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

Hubble pictures were famously coloured in a particular way that it has a formal name:

https://www.astronomymark.com/hubble_palette.htm

(My shots: https://app.astrobin.com/u/bhouston#gallery)

verandaguy•6h ago

> astrophotographers do not use cameras with UV-IR cut filters at all

I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do.

Then there are things like the Nikon D810A, which remove the UV-IR filter from the factory (but IIRC retain the Bayer filter).

bhouston•6h ago
My recommendation, as someone who started with a DSLR and then modded it to remove the UV-IR filter: I would have been better off skipping straight to a beginner cooled mono astrophotography camera, like the ASI533MM Pro. It is a night-and-day difference in quality at roughly the same cost, and it automates much better.

A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.

verandaguy•6h ago
How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?
gibybo•5h ago
Yes, and you would almost certainly want to automate it with a filter wheel that changes the filters for you on a schedule. However, a key advantage of a mono camera is that you don't have to limit yourself to RGB filters. You can use some other set of filters better suited for the object you are capturing and map them back to RGB in software. This is most commonly done with narrowband filters for Hydrogen, Sulfur and Oxygen which allow you to see more detail in many deep space objects and cut out most of the light pollution that would otherwise get in your way.
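The "comp" step can be sketched in a few lines of numpy: average each filter's exposures into one stacked mono frame, then assign the stacks to RGB channels. This is a toy sketch on synthetic data; a real pipeline would also apply calibration frames, alignment, and stretching:

```python
import numpy as np

def stack(exposures):
    """Average a list of mono frames taken through one filter.
    A real pipeline would first subtract darks, divide flats, and align."""
    return np.mean(exposures, axis=0)

def compose_rgb(r_frames, g_frames, b_frames):
    """Stack each filter's exposures and place them in an RGB cube,
    normalizing so values land in [0, 1]."""
    channels = [stack(f) for f in (r_frames, g_frames, b_frames)]
    rgb = np.stack(channels, axis=-1)
    return rgb / rgb.max()

# Synthetic 4x4 "exposures", three per filter:
rng = np.random.default_rng(0)
r = [rng.uniform(0, 1, (4, 4)) for _ in range(3)]
g = [rng.uniform(0, 1, (4, 4)) for _ in range(3)]
b = [rng.uniform(0, 1, (4, 4)) for _ in range(3)]
image = compose_rgb(r, g, b)  # shape (4, 4, 3)
```

With narrowband filters the same compositing applies; only the channel assignment changes.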
schoen•5h ago
> It is night and day difference

Particularly high praise in astronomy!

recipe19•6h ago
What you're describing is the domain of a very, very small number of hobbyists with very deep pockets (plus various govt-funded entities).

The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.

bhouston•6h ago
> What you're describing is the domain of a very, very small number of hobbyists with very deep pockets

Sort of. The telescope used for the Dumbbell nebula captures featured in the article is worth around $1000, and his mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, another $500.

There are quite a few people in the world doing this, upwards of 100K:

https://app.astrobin.com/search

Various PixInsight videos have +100K views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...

Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...

looofooo0•2h ago
Some even scratch off the Bayer pattern of old cameras.
tomrod•5h ago
And the entire earth observation industry, which doesn't look the same way but uses the same base tech stack.
tecleandor•4h ago
You don't need very big pockets for that.

Today you can find very affordable monochromatic astrophotography cameras, and you can also modify cheap DSLR cameras or even compact cameras to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter after that (like an IR or UV band-pass).

I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.

Some cameras are very easy, some very difficult, so better to check some tutorials before buying a camera. And there are companies that will do the conversion for you for a few hundred dollars (probably $300 or $400).

looofooo0•2h ago
You can even do the conversion diy.
system2•6h ago
Isn't this why they always use the term "artist's impression" when they are colored?
recipe19•6h ago
I think that term is reserved mostly for actual artwork (renderings, paintings, etc).

Some deep-space astronomy pictures are in completely made-up color, often because they're taken at wavelengths different than visible light and then color-mapped to look pretty.

But the point here is even if you're taking images with a regular camera pointed at the sky, it's pretty much impossible to match "reality".

okanat•6h ago
There are different reasons for that. Things like black holes are really hard to observe even in other parts of the spectrum, and the same goes for other objects like planets. So the drawings are made from hypothetical expectations based on simulations rather than direct observations.

Many observations come from scientific cameras rather than the visible-spectrum cameras discussed in TFA. Those are not artist's impressions like the first case, but they have a completely different view of the object, so any visible-light rendering involves some guessing, and the final picture will not be 100% what you would see.

nwallin•5h ago
No.

When you see "artist's impression" in a news article about space, what you're looking at is a painting or drawing created from whole cloth by an artist.

This article is about how sensors turn signals into images. When you take pictures with a 'normal' camera, we've designed them so that if you take certain steps, the image on your screen looks the same as what it would look like in real life with no camera or monitor. This article is stating that with the cameras and filters they use for telescopes, that same process doesn't really work. We use special filters to measure specific spectral properties of an astronomical object. This gives good scientific information; however, it means that in many cases it's impossible to reconstruct what an astronomical object would really look like if our eyes were more sensitive and we looked at it.
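Those "certain steps" are, roughly, white balance, a camera-to-sRGB color matrix, and gamma encoding. A minimal sketch; the gains and matrix below are illustrative placeholders, since real values come from calibrating a specific sensor:

```python
import numpy as np

def process_raw(raw_rgb, wb_gains, cam_to_srgb):
    """Apply the usual steps that make a 'normal' camera image look right:
    white balance, a camera-to-sRGB color matrix, then gamma encoding."""
    x = raw_rgb * wb_gains                # 1. white balance per channel
    x = np.clip(x @ cam_to_srgb.T, 0, 1)  # 2. color correction matrix
    return x ** (1 / 2.2)                 # 3. simple gamma encode

wb = np.array([2.0, 1.0, 1.5])            # placeholder daylight gains
m = np.array([[ 1.6, -0.4, -0.2],         # placeholder matrix; each row
              [-0.3,  1.5, -0.2],         # sums to 1 so gray stays gray
              [ 0.0, -0.5,  1.5]])
raw = np.array([[[0.2, 0.3, 0.1]]])       # one raw pixel, shape (1, 1, 3)
srgb = process_raw(raw, wb, m)
```

For astro sensors with narrowband filters there is no calibrated matrix that maps the measurements back to cone responses, which is the article's point.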

bhickey•6h ago
The tiniest of corrections: Ha is 656.28nm not 565.
execat•6h ago
At risk of going off-topic, when I see comments like these, I wonder how the comment author comes up with these corrections (cross-checked, the comment is in fact true)

Did you have the number memorized or did you do a fact check on each of the numbers?

kragen•5h ago
I didn't know the number was wrong, but something about the statement seemed very wrong, because the 565nm number is only 10nm away from 555nm, conventionally considered the absolute maximum wavelength of human visual sensitivity (683lm/W). And you can see that in the photopic sensitivity curves in the rest of the article: both red and green cones respond strongly to light all around that wavelength. So it seemed implausible that 565nm would be nearly invisible.

But I didn't know whether Ha was actually highly visible or just had a different wavelength. I didn't know 683lm/W either, and I wasn't exactly sure that 555nm was the peak, but I knew it was somewhere in the mid-500s. If I'd been less of a lazy bitch I would have fact-checked that statement to see where the error was.
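That sanity check can be made concrete with a standard Gaussian fit to the photopic luminosity function (an analytic approximation; the real CIE V(λ) table differs by a few percent):

```python
import math

def photopic_v(wavelength_nm):
    """Gaussian approximation of the CIE photopic luminosity function V(λ),
    peaking near 555 nm. This is a common analytic fit, not the CIE table."""
    lam_um = wavelength_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

v_565 = photopic_v(565.0)    # near the peak of visual sensitivity
v_ha = photopic_v(656.28)    # H-alpha: only a few percent of peak
```

So 565 nm light is close to maximally visible, while true H-alpha at 656.28 nm sits far down the red tail of the curve.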

kragen•1h ago
I see that there's a [dead] reply by the kind of person who thinks "tryhard" is an insult and has applied it to me.

When I compare people I know about who tried hard to the people I know about who didn't try hard, literally every single person I would want to be like is one of the people who tried hard. I'm unable to imagine what it would be like to want to be like the other group.

I mean, I don't want to be like Michael Jordan, but I can imagine wanting to be like him, and in part this is because specifically what he's famous for is succeeding at something very difficult that he had to try unbelievably hard at.

So I'm delighted to declare myself a tryhard, or at least an aspiring tryhard.

Completely by coincidence, when I saw the tryhard comment, I happened to be reading https://www.scattered-thoughts.net/writing/things-unlearned/:

> People don't really say this [that intelligence trumps expertise] explicitly, but it's conveyed by all the folk tales of the young college dropout prodigies revolutionizing everything they touch. They have some magic juice that makes them good at everything.

> If I think that's how the world works, then it's easy to completely fail to learn. Whatever the mainstream is doing is ancient history, whatever they're working on I could do it in a weekend, and there's no point listening to anyone with more than 3 years experience because they're out of touch and lost in the past.

> Similarly for programmers who go into other fields expecting to revolutionize everything with the application of software, without needing to spend any time learning about the actual problem or listening to the needs of the people who have been pushing the boulder up the hill for the last half century.

> This error dovetails neatly with many of the previous errors above eg [sic] no point learning how existing query planners work if I'm smart enough to arrive at a better answer from a standing start, no point learning to use a debugger if I'm smart enough to find the bug in my head.

> But a decade of mistakes later I find that I arrived at more or the less the point that I could have started at if I was willing to believe that the accumulated wisdom of tens of thousands of programmers over half a century was worth paying attention to.

> And the older I get, the more I notice that the people who actually make progress are the ones who are keenly aware of the bounds of their own knowledge, are intensely curious about the gaps and are willing to learn from others and from the past. One exemplar of this is Julia Evans, whose blog archives are a clear demonstration of how curiosity and lack of ego is a fast path to expertise.

bhickey•4h ago
In this case I coincidentally spent a few hundred hours of hobby time over the last year designing hydrogen alpha telescopes.
nothacking_•5h ago
Fixed.
vFunct•6h ago
You can if you use hyperspectral imaging...
choonway•6h ago
Probably will come out within the next 5 iPhone generations.

POC already out...

https://pmc.ncbi.nlm.nih.gov/articles/PMC8404918/

kragen•5h ago
People have been making production hyperspectral sensors for decades, including hobbyists in garages; we're well beyond the proof-of-concept stage.
nothacking_•5h ago
The problem with hyperspectral imaging is that it ends up throwing away 99.9% of all the light that hits your camera. It's been done for the sun and some very bright nebulae, but really isn't practical for most of the stuff in space.
klysm•6h ago
Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB Bayer filters are vastly over-utilized (mostly due to market share), and they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?
jofer•6h ago
In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).
nothacking_•5h ago
That's common in high-end astrophotography, and almost exclusively used at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and sharp falloff, very unlike human color vision.
rachofsunshine•5h ago
Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?

Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that channel)

and similarly for M and L cone channels, which goes to the integral representing true color in the limit.

Are the bands too wide for this to work?
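The sum described above is straightforward to sketch. The Gaussian cone curves below are illustrative assumptions, not real LMS data, but they show the narrowband limit: a single band's perceived color is just the cone responses at that wavelength:

```python
import numpy as np

def cone_response(band_centers_nm, band_intensities, cone_sensitivity):
    """Approximate one cone channel as the sum over narrow bands of
    (band intensity) * (cone sensitivity at the band center)."""
    return sum(i * cone_sensitivity(c)
               for c, i in zip(band_centers_nm, band_intensities))

# Illustrative Gaussian cone sensitivities (assumptions, not real LMS data):
def s_cone(nm): return np.exp(-0.5 * ((nm - 445) / 25) ** 2)
def m_cone(nm): return np.exp(-0.5 * ((nm - 540) / 35) ** 2)
def l_cone(nm): return np.exp(-0.5 * ((nm - 565) / 40) ** 2)

# A lone OIII band near 500.7 nm drives M more than L and S a little:
# a blue-green (turquoise) hue, matching the sibling comment.
bands, intensity = [500.7], [1.0]
s = cone_response(bands, intensity, s_cone)
m = cone_response(bands, intensity, m_cone)
l = cone_response(bands, intensity, l_cone)
```

With wide bands the integral no longer factors this cleanly, which is why wideband scientific filters can't be mapped back to true color.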

nothacking_•5h ago
> Are the bands too wide for this to work?

For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.

For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.

chaboud•4h ago
I think you want a push broom setup:

https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...

Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.

But grab a cold mirror (a visible-light-cutting, IR-passing filter) and a night-vision camera for a real party on the cheap.

adornKey•1h ago
And don't forget about polarization! There's more information out there than just frequency.
jofer•6h ago
These same things apply to satellite images of the Earth as well. Even when you have optical bands that roughly correspond to human eye sensitivity, they're a quite different response pattern. You're also often not working with those wavelength bands in the visualizations you make.

Scientific sensors want as "square" a spectral response as possible. That's quite different than human eye response. Getting a realistic RGB visualization from a sensor is very much an artform.

dheera•6h ago
It's worth noting that many NASA images use the "HSO" palette which is false color imagery. In particular the sulfur (S) and hydrogen (H) lines are both red to the human eye, so NASA assigns them to different colors (hydrogen->red, sulfur->green, oxygen->blue) for interpretability.
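The palette is just a channel assignment. A minimal numpy sketch of the mapping described above (synthetic data; real processing would also stretch and weight each channel):

```python
import numpy as np

def hso_to_rgb(h_alpha, sii, oiii):
    """Map narrowband frames to display channels in the HSO assignment:
    hydrogen->red, sulfur->green, oxygen->blue. Both H-a and SII are deep
    red to the eye, so this is a false-color mapping for interpretability."""
    rgb = np.stack([h_alpha, sii, oiii], axis=-1).astype(float)
    return rgb / rgb.max()

# Synthetic 2x2 narrowband frames:
h = np.array([[1.0, 0.2], [0.4, 0.8]])
s = np.array([[0.3, 0.9], [0.1, 0.5]])
o = np.array([[0.2, 0.1], [0.9, 0.6]])
false_color = hso_to_rgb(h, s, o)  # shape (2, 2, 3)
```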
mystraline•5h ago
The proper color of an image would be a multispectral radiograph, similar to a waterfall plot, for each point. Each FFT bin would be 100 GHz in size, and the range would be over 1000 THz. And in a way, that's what a color sensor is doing at the CCD level too: collapsing and averaging the radiant energy it's susceptible to into a specific color.
7373737373•45m ago
https://en.wikipedia.org/wiki/Hyperspectral_imaging
monkeyelite•5h ago
Disappointing that most space photos are made by mapping an analog input onto a gradient and that this isn’t stated more directly.
hliyan•4h ago
I still haven't forgiven whoever made Voyager's first images of Jupiter's moon Io bright red and yellow, and the Saturnian moon Enceladus green.
ianburrell•4h ago
Neptune was shown as deep blue for a long time, but it is really a similar color as Uranus, a pale greenish-blue.
cyb_•3h ago
Having dabbled a bit in astrophotography, I would suggest that color is best used to bring out the structure (and beauty) of the object. Trying to faithfully match the human eye would, unfortunately, cause a lot of that data to be harder to see/understand. This is especially true in narrowband.
strogonoff•20m ago
It is not just in space that nothing is lit by a uniform light source or with uniform brightness. The same is true of many casual photos you would take on this planet.

Outside of a set of scenarios like "daylight" or "cloudy", and especially if you shoot with a mix of disparate artificial light sources at night, you have a very similar problem. Shooting raw somewhat moves the problem to the development stage, but it remains a challenge: balance for one source, and the others look weird. Yet (and this is a paradox not present in deep-space photography) the same scene can astoundingly look beautiful to the human eye!

In the end, it is always a subjective creative job that concerns your interpretation of light and what you want people to see.

HPsquared•15m ago
I suppose the human visual system is already adapted to deal with the same problem.