If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
We physicists never use the term Planckian for thermal black bodies. That adjective would be used in quantum mechanics, though, for very small things.
To understand what I mean by "closest Planckian light source" see https://en.wikipedia.org/wiki/Planckian_locus
Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:
https://www.zwoastro.com/product-category/cameras/dso_cooled...
They also generally do not use sensors that have Bayer filters. This also screws things up.
Instead they use monochrome sensors with narrowband filters (either a single band or multiple bands) over them, keyed to specific celestial emission lines. The reason for this is that it gets rid of the extensive light pollution and bumps up the signal-to-noise ratio for the celestial targets, especially the small, faint details. Stuff like this:
https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...
https://telescopescanada.ca/products/zwo-duo-band-filter
Often these are combined with a true color capture (or individual RGBL narrowband) just to get the stars coloured properly.
Almost everything you see in high-end astrophotography is false colour, because these individual narrowband captures from the monochrome sensors are mapped to interesting colours, often with a lot of time spent manipulating the individual channels.
This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/
The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...
Hubble pictures were famously coloured in a particular way that has its own formal name:
https://www.astronomymark.com/hubble_palette.htm
(My shots: https://app.astrobin.com/u/bhouston#gallery)
> astrophotographers do not use cameras with UV-IR cut filters at all
I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do. Then there are things like the Nikon D810A, which has the UV-IR filter removed from the factory (but IIRC retains the Bayer filter).
A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.
Essentially yes. To get faint details in astrophotography, you actually will capture a series of images of each filter with long exposure times like 3 minutes per capture with a total capture time per filter measured in hours. You then star align everything, then you integrate the captures for each filter into a single frame to remove noise and boost signal, then you comp them together.
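As a rough illustration, here's a minimal numpy sketch of the per-filter integration step (file names, frame counts, and the naive sigma clipping are placeholders; real tools like PixInsight do the alignment and rejection far more carefully):

```python
import numpy as np
from astropy.io import fits  # common choice for reading FITS captures

def integrate_filter(paths):
    """Average many already star-aligned exposures for one filter."""
    stack = np.stack([fits.getdata(p).astype(np.float64) for p in paths])
    # Crude outlier rejection (satellite trails, cosmic rays) before averaging.
    med, std = np.median(stack, axis=0), np.std(stack, axis=0)
    clipped = np.where(np.abs(stack - med) > 3 * std, np.nan, stack)
    return np.nanmean(clipped, axis=0)

# Hypothetical capture lists, one per narrowband filter (e.g. 60 x 3 min each).
ha   = integrate_filter([f"ha_{i:03d}.fits" for i in range(60)])
oiii = integrate_filter([f"oiii_{i:03d}.fits" for i in range(60)])
sii  = integrate_filter([f"sii_{i:03d}.fits" for i in range(60)])
# The integrated frames are then comped together into colour channels.
```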
Particularly high praise in astronomy!
The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.
Sort of. The telescope used for the Dumbbell Nebula captures featured in the article is worth around $1000, and his mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, another $500.
There are quite a few people in the world doing this, upwards of 100K:
https://app.astrobin.com/search
Various PixInsight videos have +100K views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...
Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...
Today you can find very affordable monochrome astrophotography cameras, and you can also modify cheap DSLR cameras, or even compact cameras, to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter after that (like an IR or UV band-pass).
I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.
Some cameras are very easy, some very difficult, so better to check some tutorials first before buying a camera. And there are companies that do the conversion for you for a few hundred dollars (probably $300 or $400).
Conversions done in places like Kolari or Spencer run about $300-500 depending on the camera model.
If I were to buy a brand new A7 IV or something like that, I would of course ask one of those shops to do it for me.
Some deep-space astronomy pictures are in completely made-up color, often because they're taken at wavelengths different than visible light and then color-mapped to look pretty.
But the point here is even if you're taking images with a regular camera pointed at the sky, it's pretty much impossible to match "reality".
Many observations come from scientific cameras rather than the visible-spectrum cameras discussed in TFA. They are not artist's impressions like the first case. They have a completely different view of the object, so any visible-light prediction involves some guessing, and the final picture will not be 100% what you would see.
When you see "artist's impression" in a news article about space, what you're looking at is a painting or drawing created from whole cloth by an artist.
This article is about how sensors turn signals into images. When you take pictures with a 'normal' camera, we've designed them so that if you take certain steps, the image on your screen looks the same as what it would look like in real life with no camera or monitor. This article is stating that with the cameras and filters they use for telescopes, that same process doesn't really work. We use special filters to measure specific spectral properties of an astronomical object. This gives good scientific information; however, it means that in many cases it's impossible to reconstruct what an astronomical object would really look like if our eyes were more sensitive and we looked at it.
Did you have the number memorized or did you do a fact check on each of the numbers?
But I didn't know whether Ha was actually highly visible or just had a different wavelength. I didn't know 683lm/W either, and I wasn't exactly sure that 555nm was the peak, but I knew it was somewhere in the mid-500s. If I'd been less of a lazy bitch I would have fact-checked that statement to see where the error was.
When I compare people I know about who tried hard to the people I know about who didn't try hard, literally every single person I would want to be like is one of the people who tried hard. I'm unable to imagine what it would be like to want to be like the other group.
I mean, I don't want to be like Michael Jordan, but I can imagine wanting to be like him, and in part this is because specifically what he's famous for is succeeding at something very difficult that he had to try unbelievably hard at.
So I'm delighted to declare myself a tryhard, or at least an aspiring tryhard.
Completely by coincidence, when I saw the tryhard comment, I happened to be reading https://www.scattered-thoughts.net/writing/things-unlearned/:
> People don't really say this [that intelligence trumps expertise] explicitly, but it's conveyed by all the folk tales of the young college dropout prodigies revolutionizing everything they touch. They have some magic juice that makes them good at everything.
> If I think that's how the world works, then it's easy to completely fail to learn. Whatever the mainstream is doing is ancient history, whatever they're working on I could do it in a weekend, and there's no point listening to anyone with more than 3 years experience because they're out of touch and lost in the past.
> Similarly for programmers who go into other fields expecting to revolutionize everything with the application of software, without needing to spend any time learning about the actual problem or listening to the needs of the people who have been pushing the boulder up the hill for the last half century.
> This error dovetails neatly with many of the previous errors above eg [sic] no point learning how existing query planners work if I'm smart enough to arrive at a better answer from a standing start, no point learning to use a debugger if I'm smart enough to find the bug in my head.
> But a decade of mistakes later I find that I arrived at more or less the point that I could have started at if I was willing to believe that the accumulated wisdom of tens of thousands of programmers over half a century was worth paying attention to.
> And the older I get, the more I notice that the people who actually make progress are the ones who are keenly aware of the bounds of their own knowledge, are intensely curious about the gaps and are willing to learn from others and from the past. One exemplar of this is Julia Evans, whose blog archives are a clear demonstration of how curiosity and lack of ego is a fast path to expertise.
POC already out...
Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that band),
and similarly for the M and L cone channels, which goes to the integral representing true color in the limit of narrow bands.
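In discrete form that's just a weighted sum per band; a tiny sketch (the band centres, intensities, and cone sensitivities below are made-up placeholders, not real cone fundamentals):

```python
import numpy as np

band_nm        = np.array([480.0, 500.7, 656.3])   # example band centres (nm)
band_intensity = np.array([0.2,   1.0,   0.6])     # measured intensity per band

# Placeholder S/M/L cone sensitivities sampled at those band centres.
s_sens = np.array([0.60, 0.25, 0.00])
m_sens = np.array([0.30, 0.55, 0.05])
l_sens = np.array([0.15, 0.45, 0.10])

S = np.sum(band_intensity * s_sens)   # discrete version of the integral
M = np.sum(band_intensity * m_sens)
L = np.sum(band_intensity * l_sens)
```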
Are the bands too wide for this to work?
For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.
For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.
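A minimal sketch of that kind of single-line composite (the turquoise triplet is just an eyeballed stand-in for the ~501 nm O-III colour):

```python
import numpy as np

def tint(mono, rgb):
    """Map a monochrome narrowband frame onto one fixed colour."""
    mono = mono / mono.max()
    return mono[..., None] * np.asarray(rgb)   # H x W x 3

oiii_frame = np.random.rand(512, 512)           # stand-in for real O-III data
oiii_rgb = tint(oiii_frame, (0.0, 0.9, 0.8))    # roughly turquoise
```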
https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...
Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.
But grab a cold mirror (a visible-light-cutting IR filter) and a night-vision camera for a real party on the cheap.
Then I felt surprised that I was surprised by that.
But that said, I’m actually surprised that astrophotographers are so interested in calibrating stars to the human eye. The article shows through a number of examples (IR, hydrogen emission line) that the human eye is a very poor instrument for viewing the “true” color of stars. Most astronomical photographs use false colors (check the captions on the NASA archives) to show more than what the eye can see, to great effect.
I think when astrophotographers are trying to render an image it makes sense that they would want the colors to match what your eyes would see looking through a good scope.
I've only experienced dramatic color from deep sky objects a few times (the blue of the Orion Nebula vastly outshines all the other colors, for instance), and it's always sort of frustrating that the pictures show something so wildly different from what my own eyes see.
Just RGB filters aren't really going to get you anything better than a Bayer matrix for the same exposure time, and most subjects on Earth are moving too much to do separate exposures for three filters.
The benefit of a mono camera and RGB filters is that you can take advantage of another quirk of our perception: we are more sensitive to intensity than to color. Because of this, it's possible to get a limited amount of exposure time with the RGB filters and use a fourth "luminance" filter for the majority of the time. During processing you combine your RGB images, convert that to HSI, and replace the I channel with your luminance image. Because the L filter doesn't block much light it's faster at getting signal, but it's only really a benefit for really dark stuff where getting enough signal is an issue.
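A minimal sketch of that luminance replacement, assuming the integrated R, G, B and L frames are already aligned and normalized 0-1 arrays; scaling every channel by the same factor keeps hue and saturation (in the HSI model) fixed while swapping in the new intensity:

```python
import numpy as np

def lrgb_combine(r, g, b, lum, eps=1e-6):
    """Replace the intensity of an RGB composite with the luminance frame."""
    rgb = np.stack([r, g, b], axis=-1)
    intensity = rgb.mean(axis=-1, keepdims=True)   # I = (R + G + B) / 3
    scale = lum[..., None] / (intensity + eps)     # same factor per channel, so H and S stay put
    return np.clip(rgb * scale, 0.0, 1.0)
```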
Canon has made a few astrophotography cameras:
https://en.wikipedia.org/wiki/Canon_EOS_R#Variants
There are also modified cameras available with the filters removed:
Scientific sensors want as "square" a spectral response as possible. That's quite different from human-eye response. Getting a realistic RGB visualization from such a sensor is very much an art form.
The nice thing about deep space objects is that they don't change at a rate that there is any visible difference over timespans relevant to humans. This means that you can do very, very long exposures. It also means that you don't need to capture all colours at once.
The linked article talks about using a color camera with a Bayer matrix that uses dyed glass to get color data, and how the filters don't filter out light in the 800-1000nm range.
Lots of amateur astrophotographers don't use color cameras but monochrome cameras in combination with separate filters. These filters are of much higher quality than the dye-based filters used on camera sensors. Instead of dyes they use interference filters[1]. These filters do not have the same issue as described in the article. For example, the LRGB filter set I use only lets through a 100nm band for each color. Next to that, you can use filters that only let through emissions from specific elements, like the HSO filters mentioned (Hydrogen-alpha, Sulphur-II, Oxygen-III). In my case I have filters with a bandwidth of 3nm around the emission line of each of these elements.
The bigger issue with making these photos look like what they 'really look like' is that they really look like nothing at all. The light from these objects is so faint that even with a highly sensitive cooled camera you need many hours of exposure to grab enough data to make a nice picture. What they 'really look like' to the human eye is black.
The issue is worse when you add in the narrowband data, because that overlaps with the visible spectrum, but in that case you usually emphasise certain bands to stand out more in the final picture. It's not uncommon to combine the LRGB data with e.g. H-alpha data.
In the end the colours you choose are up to the photographer and their goals. Are you just going for a pretty picture or are you trying to visualize scientific data?
> It's not uncommon to combine the LRGB data with e.g. H-alpha data.
Yeah I do this too. Typically I do HOO to maintain as close to spectral accuracy as possible. I don't have an S filter.
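For concreteness, an HOO composite is just a channel mapping like this, with Ha driving red and OIII driving both green and blue (assuming the integrated Ha and OIII frames are already normalized arrays):

```python
import numpy as np

def hoo_composite(ha, oiii):
    """HOO palette: H-alpha -> red, OIII -> green and blue."""
    return np.stack([ha, oiii, oiii], axis=-1)
```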
It is, by the way, possible to get DSOs shot as a single image, but it's tricky. See the 3rd image in this series:
https://www.instagram.com/p/CKXdL1YndQi/?img_index=3
I want to try to fine tune a diffusion model to make it possible to actually shoot these single shot with good fidelity.
Unfortunately I'm in a Bortle 5 to 6 area so I'd need quite a bit of integration time to get a decent SNR. There aren't really any proper dark sites in western Europe.
Yeah that would do it. I'm in the western US and there are lots of Bortle 2 sites I can access in a weekend, and Bortle 1 sites I can access on a long weekend.
There's even a Bortle 3 site I can get to after work just 2 hours from San Jose (Pinnacles National Park) which is exactly where I took that seagull nebula picture.
The NASA images are colored artistically, with no correspondence to how it would look if you were to visit.
Otherwise it's all just slightly different shades of red and IR.
Outside of a set of scenarios like "daylight" or "cloudy", and especially if you shoot with a mix of disparate artificial light sources at night, you have a very similar problem. Shooting raw somewhat moves this problem to the development stage, but it remains a challenge: balance for one light source and you make the others look weird. Yet (and this is a paradox not present in deep-space photography) astoundingly the same scene can look beautiful to the human eye!
In the end, it is always a subjective creative job that concerns your interpretation of light and what you want people to see.
No, cones do not produce a negative response. The graph shows the intensity of the primaries required to recreate the spectral colour at each wavelength. A negative value means that the primary had to be added to the spectral colour being matched, rather than to the mix of the other primaries.
https://en.wikipedia.org/wiki/CIE_1931_color_space#Color_mat...
not what was claimed at all...
That's what I gathered from "spectral response". Usually spectral response in this context refers to the responsivity of the cones. Even accounting for the 'brain subtracting green from red' (which I assume comes from opponent process theory), the following graph has nothing to do with it. The caption too reads 'Yes, this results in red having negative sensitivity @500 nm', implying the red (L) cones have a negative sensitivity to cyans — which, again, is not really the case.
Yes, they do, after the photoreceptors. Those CIE colorspace curves aren't biology, and shouldn't be interpreted as such.
LMS colorspace is the (currently understood) biological colorspace [1], and contains inhibitions, from the opponent process [2] found in the meatware [3]:
red-green: L - M
blue-yellow: S - (L + M)
This contains a nice introduction to biological colorspace [4].
[1] https://en.wikipedia.org/wiki/LMS_color_space
[2] https://en.wikipedia.org/wiki/Opponent_process
[3] https://en.wikipedia.org/wiki/Lateral_geniculate_nucleus#Col...
[4] https://color2.psych.upenn.edu/brainard/papers/Stockman_Brai...
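A tiny sketch of the two opponent signals listed above (sign conventions vary between texts):

```python
def opponent_channels(L, M, S):
    """Opponent signals formed downstream of the photoreceptors."""
    red_green   = L - M           # positive -> reddish, negative -> greenish
    blue_yellow = S - (L + M)     # positive -> bluish,  negative -> yellowish
    return red_green, blue_yellow
```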
> This contains a nice introduction to biological colorspace
Looks like an interesting read, thanks for sharing!
Does anyone know the answer to this? Would it just be black? Or just a bright white star?
The exception to this is stuff in our own solar system.
I think a better way to describe the issue is that much of the structure of the cosmos is only visible in non-visible wavelengths, so while calibrated, accurate visuals "like you were in a safe glass bubble" is a good category of astrophotography to continue to refine, it's a tiny slice of what's emitting electromagnetic radiation that's worth visualizing. And if cameras can convert invisible colors into visible ones, that's a blessing of a capability.
Certainly some phenomena just don't have much visible light of interest - so shifting the spectra is inherently necessary in those cases and is thus inherently subjective.
The Milky Way and the entire local group of galaxies are flowing towards the superclusters located directly on the opposite side of the galaxy, and there are huge (thousands of light-years) clouds of gas and "dust" (supernova molecular feces) that are like actual, thick, opaque storm clouds forming a donut around the galaxy, so it's impossible to "see" where we're headed except in the X-rays and mid-infrared that don't interact with the huge clouds. And we won't be able to until we can leave the plane of the Milky Way's disk, or we build a telescope that is two dozen orders of magnitude more sensitive than anything we have so far.
For most things like nebulae and galaxies the shapes can be made out easily, but often the colors are absent or muted because our eyes' color cones don't do as well in low light as our monochromatic rods. This is similar to how, if you have ever seen the aurora, the colors are hard to perceive, but if you use a camera they pop out a lot more.
There's a big range though and it depends on the intensity of light from the object. There are certainly nebulas and other objects out there with enough light intensity that our eyes can naturally perceive the color.
The other thing to keep in mind is that people's night vision and ability to perceive color in low light is highly variable, much like how our visual acuity (near/far sightedness) vary greatly by individual and worsen somewhat with age.
If you want to see all the cool shit 4 billion light years away, you are gonna have to get those retinal IR implants you keep asking for each Christmas installed.
also, without an atmosphere or magnetic field or day/night your eyes might not have as much longevity due to no shielding from harmful radiation/uv/etc.
https://airandspace.si.edu/collection-objects/gnomon-lunar-a...
One of the great gifts Pillinger² had was being able to shake up public interest via pop culture; there was also a call sign composed by Blur for Beagle 2.
¹ https://www.researchgate.net/figure/Spot-Painting-Beagle-2-C...
² https://en.wikipedia.org/wiki/Colin_Pillinger
https://tothemoon.im-ldi.com/data_a70/AS17/extra/AS17-137-20...
This version shows different shades of colors: https://eol.jsc.nasa.gov/SearchPhotos/photo.pl?mission=AS17&...