
Why you can't color calibrate deep space photos

https://maurycyz.com/misc/cc/
106•LorenDB•6h ago

Comments

Retr0id•5h ago
The next space mission should be to leave a colour calibration chart on the moon.
jofer•5h ago
The moon itself already is one. Moonshots are widely used in calibration, at least for earth observation satellites. The brightness of the full moon at each wavelength at each day of the year is predictable and well-known, so it makes a good target to check your payload against.
embedded_hiker•4h ago
They brought a gnomon, with a color chart, on the Apollo missions. They would set it up for many of the pictures of samples.

https://airandspace.si.edu/collection-objects/gnomon-lunar-a...

shagie•4h ago
They also put color calibration charts on Mars rovers. For example https://www.lucideon.com/news/colour-standards-on-mars
JNRowe•10m ago
There is even a Damien Hirst¹ haphazardly spread about the surface for that purpose.

One of the great gifts Pillinger² had was being able to shake up public interest via pop culture; there was also a call sign composed by Blur for Beagle 2.

¹ https://www.researchgate.net/figure/Spot-Painting-Beagle-2-C...

² https://en.wikipedia.org/wiki/Colin_Pillinger

pgreenwood•4h ago
Here's a shot of a color chart on the moon from Apollo 17 (AS17-137-20900):

https://tothemoon.im-ldi.com/data_a70/AS17/extra/AS17-137-20...

kurthr•5h ago
What's the white point? Is it D65? Not when the sun isn't out.
klysm•5h ago
I've always been confused by what the white point actually _means_. Since we are dealing with strictly emissive sources here, and not reflected sunlight, does the whitepoint even mean anything?
esafak•4h ago
In a scene lit overwhelmingly by one approximately Planckian light source, the white point is the color of the closest Planckian light source.

If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
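
For concreteness, "Planckian" just means the source's spectrum follows Planck's blackbody law. A minimal sketch (the 6500 K temperature and the sample wavelengths are illustrative choices, not from the comment above):

```python
import math

# Planck's law: spectral radiance of a blackbody at temperature T (kelvin)
# for wavelength lam (metres). Constants are the exact SI values.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(lam, T):
    """Spectral radiance B(lam, T) in W / (m^2 * sr * m)."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * T))

# A "Planckian" illuminant is one whose spectrum follows this curve.
# A roughly daylight-like 6500 K source emits comparably across the
# whole visible band, which is why it reads as white:
for nm in (450, 550, 650):
    print(nm, planck_radiance(nm * 1e-9, 6500.0))
```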

klysm•2h ago
So in this case there is no sensible white point since there is no illuminant right?
esafak•2h ago
I'm not sure what case we're talking about, but if it emits visible light it is an illuminant.
bhouston•5h ago
> Many other cameras, particularly those with aggressive UV-IR cut filters, underespond to H-a, resulting in dim and blueish nebula. Often people rip out those filters (astro-modification), but this usually results in the camera overresponding instead.

Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:

https://www.zwoastro.com/product-category/cameras/dso_cooled...

They also generally do not use sensors that have Bayer filters. This also screws things up.

Instead they use monochromatic sensors with narrowband filters (either one band or multiple) over them, keyed to specific celestial emissions. The reason for this is that it gets rid of extensive light pollution and bumps up the signal-to-noise ratio for the celestial objects, especially the small faint details. Stuff like this:

https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...

https://telescopescanada.ca/products/zwo-duo-band-filter

Often these are combined with a true color capture (or individual RGBL narrowband) just to get the stars coloured properly.

Almost everything you see in high end astrophotography is false color because they map these individual narrowband captures on the monochrome sensors to interesting colours, often spending a lot of time manipulating the individual channels.

This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/

The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

Hubble pictures were famously coloured in a particular way that has a formal name:

https://www.astronomymark.com/hubble_palette.htm

(My shots: https://app.astrobin.com/u/bhouston#gallery)

verandaguy•5h ago

    > astrophotographers do not use cameras with UV-IR cut filters at all
I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do.

Then there are things like the Nikon D810A, which remove the UV-IR filter from the factory (but IIRC retain the Bayer filter).

bhouston•4h ago
My recommendation, as someone who started with a DSLR and then modded it to remove the UV-IR filter: I would have been better off skipping straight to a beginner cooled mono astrophotography camera, like the ASI533MM Pro. It is night and day difference in terms of quality at roughly the same cost, and it automates much better.

A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.

verandaguy•4h ago
How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?
gibybo•4h ago
Yes, and you would almost certainly want to automate it with a filter wheel that changes the filters for you on a schedule. However, a key advantage of a mono camera is that you don't have to limit yourself to RGB filters. You can use some other set of filters better suited for the object you are capturing and map them back to RGB in software. This is most commonly done with narrowband filters for Hydrogen, Sulfur and Oxygen which allow you to see more detail in many deep space objects and cut out most of the light pollution that would otherwise get in your way.
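
The compositing step described here can be sketched in a few lines; the frame names and the simple per-channel stretch are illustrative, not any particular package's pipeline:

```python
import numpy as np

# Sketch: three monochrome exposures, one per filter-wheel position,
# stacked into an H x W x 3 RGB image.
def composite_rgb(r_frame, g_frame, b_frame):
    """Stack three mono frames into an RGB image, stretching each
    channel to the 0..1 range independently."""
    channels = []
    for frame in (r_frame, g_frame, b_frame):
        f = np.asarray(frame, dtype=np.float64)
        lo, hi = f.min(), f.max()
        channels.append((f - lo) / (hi - lo) if hi > lo else np.zeros_like(f))
    return np.stack(channels, axis=-1)

# With narrowband filters the compositing step is identical; you just
# decide which filter's frame feeds which output channel.
rng = np.random.default_rng(0)
red, green, blue = rng.random((3, 16, 16))   # placeholder exposures
image = composite_rgb(red, green, blue)
print(image.shape)  # (16, 16, 3)
```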
schoen•3h ago
> It is night and day difference

Particularly high praise in astronomy!

recipe19•5h ago
What you're describing is the domain of a very, very small number of hobbyists with very deep pockets (plus various govt-funded entities).

The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.

bhouston•5h ago
> What you're describing is the domain of a very, very small number of hobbyists with very deep pockets

Sort of. The telescope used for the Dumbbell nebula captures featured in the article is worth around $1000 and his mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, another $500.

There are quite a few people in the world doing this, upwards of 100K:

https://app.astrobin.com/search

Various PixInsight videos have +100K views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...

Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...

looofooo0•1h ago
Some even scratch off the Bayer pattern of old cameras.
tomrod•3h ago
And the entire earth observation industry, which doesn't look the same way but uses the same base tech stack.
tecleandor•3h ago
You don't need very big pockets for that.

Today you can find very affordable monochromatic astrophotography cameras, and you can also modify cheap DSLR cameras or even compact cameras to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter after that (like an IR or UV band-pass)

I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.

Some cameras are very easy, some very difficult, so better to check some tutorials first before buying a camera. And there are companies that will do the conversion for you for a few hundred dollars (probably $300 or $400).

looofooo0•1h ago
You can even do the conversion DIY.
system2•5h ago
Isn't this why they always use the term "artist's impression" when they are colored?
recipe19•5h ago
I think that term is reserved mostly for actual artwork (renderings, paintings, etc).

Some deep-space astronomy pictures are in completely made-up color, often because they're taken at wavelengths different than visible light and then color-mapped to look pretty.

But the point here is even if you're taking images with a regular camera pointed at the sky, it's pretty much impossible to match "reality".

okanat•5h ago
There are different reasons for that. Things like black holes are really hard to observe even in other parts of the light spectrum. Same for other objects like planets. So the drawings are made from hypothetical expectations based on simulations rather than direct observations.

Many observations come from scientific cameras rather than the actual visible-spectrum cameras discussed in TFA. They are not artist's impressions like the first case. They will have a completely different view of the object, so any visible-light prediction will have some guessing in it, and the final picture will not be 100% what you would see.

nwallin•4h ago
No.

When you see "artist's impression" in a news article about space, what you're looking at is a painting or drawing created from whole cloth by an artist.

This article is about how sensors turn signals into images. When you take pictures with a 'normal' camera, we've designed them so that if you take certain steps, the image on your screen looks the same as what it would look like in real life with no camera or monitor. This article is stating that with the cameras and filters they use for telescopes, that same process doesn't really work. We use special filters to measure specific spectral properties about an astronomical object. This gives good scientific information; however, it means that in many cases it's impossible to reconstruct what an astronomical object would really look like if our eyes were more sensitive and we looked at it.

bhickey•5h ago
The tiniest of corrections: Ha is 656.28nm not 565.
execat•4h ago
At risk of going off-topic, when I see comments like these, I wonder how the comment author comes up with these corrections (cross-checked, the comment is in fact true)

Did you have the number memorized or did you do a fact check on each of the numbers?

kragen•4h ago
I didn't know the number was wrong, but something about the statement seemed very wrong, because the 565nm number is only 10nm away from 555nm, conventionally considered the absolute maximum wavelength of human visual sensitivity (683lm/W). And you can see that in the photopic sensitivity curves in the rest of the article: both red and green cones respond strongly to light all around that wavelength. So it seemed implausible that 565nm would be nearly invisible.

But I didn't know whether Ha was actually highly visible or just had a different wavelength. I didn't know 683lm/W either, and I wasn't exactly sure that 555nm was the peak, but I knew it was somewhere in the mid-500s. If I'd been less of a lazy bitch I would have fact-checked that statement to see where the error was.
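
The plausibility check above can be made concrete with a toy model of the photopic curve; treat the Gaussian shape and the 45 nm width as assumptions, since the real CIE V(λ) curve is tabulated, not Gaussian:

```python
import math

# Rough sketch of photopic luminous sensitivity V(lambda) as a
# Gaussian centred on 555 nm. Width is an invented illustrative value.
def v_approx(nm, peak=555.0, width=45.0):
    """Approximate relative luminous sensitivity at wavelength nm."""
    return math.exp(-0.5 * ((nm - peak) / width) ** 2)

# 565 nm sits essentially at the peak of human sensitivity, while the
# actual H-alpha line at 656.28 nm is far down the red tail:
print(v_approx(565.0))    # close to 1
print(v_approx(656.28))   # a small fraction of peak
```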

bhickey•3h ago
In this case I coincidentally spent a few hundred hours of hobby time over the last year designing hydrogen alpha telescopes.
nothacking_•4h ago
Fixed.
vFunct•5h ago
You can if you use hyper spectral imaging...
choonway•4h ago
Probably will come out within the next 5 iPhone generations.

POC already out...

https://pmc.ncbi.nlm.nih.gov/articles/PMC8404918/

kragen•4h ago
People have been making production hyperspectral sensors for decades, including hobbyists in garages; we're well beyond the proof-of-concept stage.
nothacking_•4h ago
The problem with hyperspectral imaging is that it ends up throwing away 99.9% of all the light that hits your camera. It's been done for the sun and some very bright nebulae, but really isn't practical for most of the stuff in space.
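
The "99.9%" figure is easy to sanity-check with back-of-the-envelope numbers (the ~300 nm band width and 0.3 nm bin size below are illustrative assumptions, not from the comment):

```python
# Light budget for a fine-grained hyperspectral sensor: slicing a
# ~300 nm visible band into 0.3 nm bins gives ~1000 bins, so a
# flat-spectrum source puts only ~0.1% of its light into any one bin.
band_nm = 700.0 - 400.0          # rough visible band width
bin_nm = 0.3                     # one fine spectral bin
bins = round(band_nm / bin_nm)   # ~1000 bins
fraction_per_bin = 1.0 / bins
print(bins, fraction_per_bin)    # 1000 0.001
```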
klysm•5h ago
Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB Bayer filters are vastly over-utilized (mostly due to market share), and they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?
jofer•5h ago
In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).
nothacking_•4h ago
That's common in high end astrophotography, and almost exclusively used at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and sharp falloff, very unlike human color vision.
rachofsunshine•3h ago
Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?

Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that band)

and similarly for M and L cone channels, which goes to the integral representing true color in the limit.

Are the bands too wide for this to work?
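
That sum can be written out explicitly as a discrete dot product; all the numbers below are invented for illustration, and real work would use tabulated cone fundamentals:

```python
import numpy as np

# Cone response ~= sum over bands of
#   (band intensity) * (cone sensitivity at the band centre).
band_centres_nm = np.array([450.0, 550.0, 650.0])
band_intensity = np.array([0.2, 1.0, 0.5])    # per-band measurements
s_cone_sens = np.array([0.9, 0.05, 0.0])      # hypothetical S-cone values

s_response = float(np.dot(band_intensity, s_cone_sens))
print(s_response)  # approximately 0.23
```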

nothacking_•3h ago
> Are the bands too wide for this to work?

For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.

For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.

chaboud•3h ago
I think you want a push broom setup:

https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...

Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.

But grab a cold mirror (visible light cutting IR filter) and a night-vision camera for a real party on the cheap.

adornKey•3m ago
And don't forget about polarization! There's more information out there than just frequency.
jofer•5h ago
These same things apply to satellite images of the Earth as well. Even when you have optical bands that roughly correspond to human eye sensitivity, they have a quite different response pattern. You're also often not working with those wavelength bands in the visualizations you make.

Scientific sensors want as "square" a spectral response as possible. That's quite different than human eye response. Getting a realistic RGB visualization from a sensor is very much an artform.

dheera•4h ago
It's worth noting that many NASA images use the "HSO" palette which is false color imagery. In particular the sulfur (S) and hydrogen (H) lines are both red to the human eye, so NASA assigns them to different colors (hydrogen->red, sulfur->green, oxygen->blue) for interpretability.
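
The palette is literally a channel assignment; a sketch with placeholder data standing in for calibrated narrowband exposures:

```python
import numpy as np

# HSO false-colour assignment as described above:
# hydrogen -> red, sulfur -> green, oxygen -> blue.
rng = np.random.default_rng(1)
h_alpha, s_ii, o_iii = rng.random((3, 8, 8))   # placeholder frames

hso = np.stack([h_alpha, s_ii, o_iii], axis=-1)  # H, S, O -> R, G, B
print(hso.shape)  # (8, 8, 3)
```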
mystraline•3h ago
The proper color of an image would be a multispectral radiograph similar to a waterfall plot for each point. Each FFT bin would be 100GHz in size, and the range would be over 1000THz. And in a way, that's what a color sensor is doing at the CCD level too - collapsing and averaging the radiant energy it's susceptible to into a specific color.
monkeyelite•3h ago
Disappointing that most space photos are made by mapping an analog input onto a gradient and that this isn’t stated more directly.
hliyan•3h ago
I still haven't forgiven whoever made Voyager's first images of Jupiter's moon Io bright red and yellow, and The Saturnian moon Enceladus green.
ianburrell•3h ago
Neptune was shown as deep blue for a long time, but it is really a similar color to Uranus, a pale greenish-blue.
cyb_•2h ago
Having dabbled a bit in astrophotography, I would suggest that color is best used to bring out the structure (and beauty) of the object. Trying to faithfully match the human eye would, unfortunately, cause a lot of that data to be harder to see/understand. This is especially true in narrowband.