As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of weapons at high RPMs on a screen that goes to 240hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.
The only game I've enjoyed in HDR was a port from a console, Returnal. Its use of HDR brights was minimalistic and tasteful, often reserved for certain particle effects.
I stopped playing that game for several years, and when I went back to it, the color and brightness had been wrecked to all hell. I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
Some games that I think currently have good HDR:
* Lies of P
* Hunt: Showdown 1896
* Monster Hunter: World (if you increase the game's color saturation a bit from its default settings)
Some games that had decent-to-good HDR the last time I played them, a few years ago:
* Battlefield 1
* Battlefield V
* Battlefield 2042 (If you're looking for a fun game, I do NOT recommend this one. Also, the previous two are probably chock-full of cheaters these days.)
I found Helldivers 2's HDR mode to have blacks that were WAY too bright. In SDR mode, nighttime in forest areas was dark. In HDR mode? It was as if you were standing in the middle of a field during a full moon.
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, really deep blacks are turned into flat greys, etc. The end result is the flat and washed-out look of movies like Wicked. It's often associated with CGI-heavy movies, but in reality it's starting to affect every movie.
The utility of HDR (as described in the article) is without question. It's amazing looking at an outdoor scene (or an indoor one with windows) with your Mk-1 eyeballs, then taking a photo and looking at it on a phone or PC screen. The photo fails to capture the lighting range your eyes see.
Also the maximum brightness isn't even that bright at 800 nits, so no HDR content really looks that different. I think newer OLEDs are brighter though. I'm still happy with the screen in general, even in SDR the OLED really shines. But it made me aware not all HDR screens are equal.
Also, in my very short experiment using HDR for daily work I ran into several problems, the most serious of which was the discovery that you can no longer just screenshot something and expect it to look the same on someone else's computer.
To be pedantic, this has always been the case... Who the hell knows what bonkers "color enhancement" your recipient has going on on their end?
But (more seriously) it's very, very stupid that most systems out there will ignore color profile data embedded in pictures (and many video players ignore the same in videos [0]). It's quite possible to tone-map HDR stuff so it looks reasonable on SDR displays (a minimal sketch of one such operator is below), but color management is like accessibility in that nearly no one who's in charge of paying for software development appears to give any shits about it.
[0] A notable exception to this is MPV. I can't recommend this video player highly enough.
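To make that tone-mapping point concrete, here is a minimal sketch (Python with numpy; not any particular player's pipeline) of the classic extended Reinhard operator applied per channel to linear HDR values before encoding as 8-bit sRGB. The `white_point` knob is an arbitrary choice for the sketch, not part of any standard:

    import numpy as np

    def reinhard_tonemap(hdr_linear: np.ndarray, white_point: float = 4.0) -> np.ndarray:
        """Map linear HDR values (any range >= 0) into [0, 1] for an SDR display.

        Extended Reinhard: values at `white_point` map to 1.0, everything
        above is compressed rather than hard-clipped.
        """
        x = np.maximum(hdr_linear, 0.0)
        mapped = x * (1.0 + x / (white_point ** 2)) / (1.0 + x)
        return np.clip(mapped, 0.0, 1.0)

    def to_srgb8(linear: np.ndarray) -> np.ndarray:
        """Apply the sRGB transfer curve and quantize to 8 bits."""
        srgb = np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * np.power(linear, 1 / 2.4) - 0.055)
        return (srgb * 255.0 + 0.5).astype(np.uint8)

    # usage: sdr8 = to_srgb8(reinhard_tonemap(hdr_frame, white_point=4.0))

A real pipeline would tone-map luminance and try to preserve hue, but the shape of the problem is the same: squeeze an unbounded range into [0, 1] without clipping everything above diffuse white.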
HDR full screen content: Yes.
HDR general desktop usage: No. In fact you'll probably actively dislike it to the point of just turning it off entirely. The ecosystem just isn't ready for this yet, although with things like the "constrained-high" concepts ( https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim... ) this might, and hopefully does, change & improve to a more pleasing result
Also this is assuming an HDR monitor that's also a good match for your ambient environment. The big thing nobody really talks about with HDR is that it's really dominated by how dark you're able to get your surrounding environment such that you can push your display "brightness" (read: SDR whitepoint) lower and lower. OLED HDR monitors, for example, look fantastic in SDR and fantastic in HDR in a dark room, but if you have typical office lighting and so you want an SDR whitepoint of around 200-300 nits? Yeah, they basically don't do HDR at all anymore at that point.
I use a mini-LED monitor and it's quite decent, except for starfields. It stays very usable even in bright conditions, and HDR video still looks better in bright conditions than the equivalent SDR video.
Like totally washed out
Top tip: If you have HDR turned on for your display in Windows (at least, MacOS not tested) and then share your screen in Teams, your display will look weirdly dimmed for everyone not using HDR on their display—which is everyone.
The difference is absolutely stunning in some games.
In MS Flight Simulator 2024, going from SDR to HDR goes from looking like the computer game it is to looking life-like. Deeper shadows with brighter highlights makes the scene pop in ways that SDR just can't do.
EDIT: You'll almost certainly need an OLED monitor to really appreciate it, though. Local dimming isn't good enough.
If you have say a 400 nits display the HDR may actually look worse than SDR. So it really depends on your screen.
Given that monitors report information about their HDR minimum and maximum panel brightness capabilities to the machine they are connected to, any competently-built HDR renderer (whether that be for games or movies or whatever) will be able to take that information and adjust the picture appropriately.
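As a rough illustration of what "adjust the picture appropriately" can mean (a sketch, not any specific engine's code; the `knee` factor and default nit values are assumptions): query the display's reported peak and minimum luminance, pass mid-tones through, and roll highlights off toward the reported peak instead of hard-clipping them.

    import numpy as np

    def fit_to_display(scene_nits: np.ndarray,
                       display_min_nits: float = 0.005,
                       display_max_nits: float = 1000.0,
                       knee: float = 0.75) -> np.ndarray:
        """Compress scene-referred luminance (in nits) into the display's range.

        Values below `knee * display_max_nits` pass through unchanged; the
        remaining highlights are rolled off smoothly toward the peak.
        """
        lo = display_min_nits
        hi = display_max_nits
        k = knee * hi
        x = np.maximum(scene_nits, lo)
        # Soft shoulder: asymptotically approaches `hi` for very bright input.
        shoulder = k + (hi - k) * (1.0 - np.exp(-(x - k) / (hi - k)))
        return np.where(x <= k, x, shoulder)

Swap in real metadata (e.g. from the display's reported HDR static metadata) for the defaults and the same curve adapts itself to a 400-nit panel or a 1500-nit one.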
You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.
Early HDR screens were very limited (few dimming zones, buggy implementations), but if you get one post-2024 (especially the OLED ones) they are quite decent. However, HDR needs to be supported at many layers: not just the monitor, but also the operating system and the content. There are not many games with proper HDR implementations, and even when there is one, it may be bad and look worse. The OS can also hijack the rendering pipeline and provide an HDR mapping for you (Nvidia RTX HDR), which is a gamble: it may look bleh, but sometimes it looks better than the game's native HDR implementation.
But when everything works properly, wow it looks amazing.
Note that HDR only actually changes how bright things can get. There's zero difference in the dark regions. This is made confusing because HDR video marketing often claims it does, but it doesn't actually. HDR monitors do not, in general, have any advantage over SDR monitors in terms of the darks. Local dimming zones improve dark contrast. OLED improves dark contrast. Dynamic contrast improves dark contrast. But HDR doesn't.
This matches my experience; 0 to 5 look identically black if I turn off HDR
But if you can't see a difference between 0 and 5 in a test pattern like this https://images.app.goo.gl/WY3FhCB1okaRANc28 in SDR, but you can in HDR, then that just means your SDR factory calibration is bad, or you've fiddled with settings that broke it.
I'd also be interested in hearing whether it makes sense to look into OLED HDR 400 screens (Samsung, LG) or is it really necessary to get an Asus ProArt which can push the same 1000 nits average as the Apple XDR display (which, mind you, is IPS).
On my MacBook Pro, HDR only activates when it needs to, but honestly I've only seen one video [1] that impressed me with it; the rest was completely meh. Not sure if that's because it's mostly iPhone photography you see in HDR, which is overall pretty meh looking anyway.
[1] https://www.youtube.com/watch?v=UwCFY6pmaYY I understand this isn't a true HDR process but someone messing with it in post, but it's the only video I've seen that noticeably shows you colors you can't see on a screen otherwise.
As others here have said, OLED monitors are generally excellent at reproducing an HDR signal, especially in a darker space. But they're terrible for productivity work because images that don't change much will burn in. They're fantastic for movies and gaming, though.
There are a few good non-OLED HDR monitors, but not many. I have an AOC Q27G3XMN; it's a 27" 1440p 180Hz monitor that is good for entry-level HDR, especially in brighter rooms. It has over 1000 nits of brightness and no major flaws. It only has 336 backlight zones, though, so you might notice some blooming around subtitles or other fine details where dark and light content sit close together. (VA panels are better than IPS at suppressing that, though.) It's also around half the price of a comparable OLED.
Most of the other non-OLED monitors with good HDR support have some other deal-breaking flaws or at least major annoyances, like latency, screwing up SDR content, buggy controls, etc. The Monitors Unboxed channel on YouTube and rtings.com are both good places to check.
My current monitor is an OLED and HDR in games looks absolutely amazing. My previous was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR. Local dimming only goes so far.
Yeah, that's kind of what I meant when I said that most monitors that advertise HDR shouldn't.
The AOC monitor is the third or fourth one I've owned that advertised HDR, but the first one that doesn't look like garbage when it's enabled.
I haven't gone oled yet because of both the cost and the risk of burn-in in my use case (lots of coding and other productivity work, occasional gaming).
I have a 2018 LG OLED that has some burnt-in Minecraft hearts because of that: not from Minecraft itself, but from just a few hours of Minecraft YouTube videos with those settings in the built-in YouTube client. Otherwise it has virtually no detectable issues after years of heavy use with static content.
You only see them with fairly uniform colors as a background where color banding would usually be my bigger complaint.
So burn-in definitely happens, but it's far from a deal breaker given the obvious benefits over other types of displays.
And driving everything possible in dark mode (white text on a dark background) on those displays is the logical thing to do anyway. Then you don't need much max brightness and you even save some energy.
And that's now, while all the LEDs are still fresh. I can't imagine how bad it will be in a few years.
Also, a lot of software doesn't expect the subpixel arrangement, so text will often look terrible.
For desktop work, don't bother unless your work involves HDR content.
It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!
There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off the HDR on photos on your iphone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won’t look half as good.
You might be on to something there. Technically, HDR is mostly about profile signaling and therefore about interop. To support it in mpeg dash or hls media you need to make sure certain codec attributes are mentioned in the xml or m3u8 but the actual media payload stays the same.
Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working at all.
Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.
On both Android & iOS/MacOS it's not that HDR is ignoring your screen brightness, but rather the brightness slider is controlling the SDR range and then yes HDR can exceed that, that's the singular purpose of HDR to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. When used selectively this really enhances a scene. But restraint is hard, and most forms of HDR content production are shit. The HDR images that newer iPhones and Pixel phones are capturing are generally quite good because they are actually restrained, but then ironically both of them have horrible HDR video that's just obnoxiously bright.
Doesn't this mean HDR is ignoring my brightness setting? Looking at the Mac color profiles, the default HDR has some fixed max brightness regardless of the brightness slider. And it's very bright, 1600 nits vs the SDR max of 600 nits. At least I was able to pick another option capping HDR to 600, but that still allows HDR video to force my screen to its normal full brightness even if I dimmed it.
It's not just the HDR content that gets brighter, but SDR content too. When I test it in Chrome on Android, if an HDR image shows up on screen the phone start overriding the brightness slider completely and making everything brighter, including the phone's system UI.
>The only thing HDR actually does is allow for brighter colors vs. SDR.
Not just brighter, but also darker, so it can preserve detail in dark areas rather than crushing them.
For context: YouTube automatically edits the volume of videos whose average loudness exceeds a certain threshold. I think the solution for HDR is a similar penalty based on log luminance or some other reasonable metric (rough sketch below).
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
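For what it's worth, the metric suggested above is cheap to compute. A sketch: the 0.2126/0.7152/0.0722 weights are the standard Rec. 709 luma coefficients, while the threshold and policy function are invented purely for illustration.

    import numpy as np

    def mean_log_luminance(rgb_linear: np.ndarray, eps: float = 1e-4) -> float:
        """Geometric-mean ('key') luminance of a linear-light RGB image,
        the same quantity classic tone-mapping papers use to judge how
        bright a scene is overall."""
        lum = (0.2126 * rgb_linear[..., 0]
               + 0.7152 * rgb_linear[..., 1]
               + 0.0722 * rgb_linear[..., 2])
        return float(np.exp(np.mean(np.log(lum + eps))))

    def looks_abusively_bright(rgb_linear: np.ndarray, threshold: float = 0.5) -> bool:
        # Hypothetical platform policy: flag images whose average log
        # luminance sits far above a typical SDR diffuse-white level.
        return mean_log_luminance(rgb_linear) > threshold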
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
https://support.google.com/youtube/answer/14106294
I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been a while since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
There are still gimmicks, but at least they do not include music so badly clipped as to be unlistenable... hint: go get the DVD or Blu-ray release of whatever it is and you are likely to enjoy an unclipped album.
It is all about maximizing the overall sonic impact the music is capable of. When levels are sane and song elements are well differentiated and equalized, such that no (or only a minor) range of frequencies is crushed by many sounds all competing for it, the result sounds full, great, and not tiring!
Thanks audio industry. Many ears appreciate what was done.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder if maybe HDR on mobile devices simply isn't a solvable problem in practice. While I think it's solvable technically and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do from a base technology and standards perspective unlikely to work in practice for most users. Maybe I'm too pessimistic, but as a high-end home theater enthusiast I'm continually dismayed how hard it is to correctly display diverse HDR content from different distribution sources in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).
I think it's because no one wants it.
(It doesn’t help that Windows only allows HDR to be defined in EDID and monitor INF files, and that PC monitors start shutting off calibration features when HDR is enabled because their chipsets can’t keep up — just as most modern Sony televisions can’t do both Dolby Vision and VRR because that requires too much processing power for their design budget.)
Some games also have a separate slider https://i.imgur.com/wenBfZY.png for adjusting "paper white", which is the HDR white one might normally associate with matching SDR reference white (100 nits in a dark room according to the SDR TV color standards, higher in other situations or standards). (Extra note: the peak brightness slider in this game, Red Dead Redemption 2, is the same knob as the brightness slider in the Battlefield V screenshot above.)
If I enable HDR the Firefox ones become a gray mess vs the lights feeling like actual lights in Safari.
edit: Ah, nevermind. It seems Firefox is doing some sort of post-processing (maybe bad tonemapping?) on-the-fly as the pictures start out similar but degrade to washed out after some time. In particular, the "OVERTHROW BOXING CLUB" photo makes this quite apparent.
That's a damn shame Firefox. C'mon, HDR support feels like table stakes at this point.
edit2: Apparently it's not table stakes.
> Browser support is halfway there. Google beat Apple to the punch with their own version of Adaptive HDR they call Ultra HDR, which Chrome 14 now supports. Safari has added HDR support into its developer preview, then it disabled it, due to bugs within iOS.
at which point I would just say to `lux.camera` authors - why not put a big fat warning at the top for users with a Firefox or Safari (stable) browser? With all the emphasis on supposedly simplifying a difficult standard, the article has fallen for one of its most famous pitfalls.
"It's not you. HDR confuses tons of people."
Yep, and you've made it even worse for a huge chunk of people. :shrug: Great article n' all just saying.
Second, the HDR effect seems to be implemented in a very crude way, which causes the whole Android UI (including the Android status bar at the top) to become brighter when HDR content is on screen. That's clearly not right. Though, of course, this might also be some issue of Android rather than Chrome, or perhaps of the Qualcomm graphics driver for my Adreno GPU, etc.
A lot of these design flaws are fixed by Firefox's picture-in-picture option, but for some reason, with the way you coded it, the prompt to pop it out as PiP doesn't show up.
"we finally explain what HDR actually means"
Then spends 2/3rds of the article on a tone mapping expedition, only to not address the elephant in the room, that is the almost complete absence of predictable color management in consumer-grade digital environments.
UIs are hardly ever tested in HDR: I don't want my subtitles to burn out my eyes on an actual HDR display.
This is where you, the consumer, are as vulnerable to light in a properly dark movie-watching environment as when someone raises the window curtains on a bright summer morning. (That kind of brightness abuse by content is actually discussed here.)
Dolby Vision and Apple have the lead here as closed platforms; on the web it's simply not predictably possible yet.
Best hope is the efforts of the Color on the Web Community Group from my impression.
For example, in video games, "HDR" has been around since the mid '00s, and refers to games that render a wider dynamic range than displays were capable of, then use post-process effects to simulate artifacts like bloom and pupil dilation (a toy sketch of that pipeline follows below).
In photography, HDR has almost the opposite meaning of what it does everywhere else. Long and multiple exposures are combined to create an image that has very little contrast, bringing out detail in a shot that would normally be lost in shadows or to overexposure.
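Here is a rough sketch of what that 2000s-style game "HDR" meant in practice (an illustration, not any particular engine's implementation): render in floating point, pull out everything brighter than display white, blur it, add it back as bloom, then tone-map down to the display range.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fake_game_hdr(scene_linear: np.ndarray,
                      bloom_threshold: float = 1.0,
                      bloom_sigma: float = 8.0,
                      bloom_strength: float = 0.4) -> np.ndarray:
        """Very simplified 2000s-style 'HDR rendering' post-process.

        `scene_linear` is an HxWx3 float image in linear light, where 1.0
        is display white but brighter values are allowed.
        """
        # Everything brighter than display white spills over as bloom.
        overbright = np.maximum(scene_linear - bloom_threshold, 0.0)
        bloom = gaussian_filter(overbright, sigma=(bloom_sigma, bloom_sigma, 0))
        combined = scene_linear + bloom_strength * bloom
        # Simple Reinhard tone map back into the displayable [0, 1] range.
        return combined / (1.0 + combined)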
Bad HDR boils down to poor taste and the failure of platforms to rein it in. You can't fix bad HDR by switching encodings any more than you can fix global warming by switching from Fahrenheit to Celsius.
Color management and handling HDR in UIs is probably a bit out of scope.
No. Because it's written for the many casual photographers we've spoken with who are confused and asked for an explainer.
> Then spends 2/3rds of the article on a tone mapping expedition, only to not address the elephant in the room, that is the almost complete absence of predictable color management in consumer-grade digital environments.
That's because this post is about HDR and not color management, which is different topic.
To be fair, it would be pretty weird if you found your own post off-putting :P
On the HN frontpage, people are likely thinking of one of at least three things:
* HDR as display tech (hardware)
* HDR as a wide-gamut data format (content)
* HDR as tone mapping (processing)
...
So when the first paragraph says "we finally explain what HDR actually means", it set me off on the wrong foot; it comes across pretty strongly for a term that's notoriously context-dependent, especially in a blog post that reads like a general explainer rather than a direct Q&A response when it's not coming through your app's channels.
Then the follow-up line, "The first HDR is the 'HDR mode' introduced to the iPhone camera in 2010", is what caused me to write the comment.
For people over 35 with even the faintest interest in photography, the first exposure to the HDR acronym probably didn't arrive with the iPhone in 2010; HDR was equivalent to Photomatix-style tone mapping starting around 2005, as the post even mentions later. The ambiguity of the term is a given now. I think it's futile to insist on or police one meaning over the other in non-scientific, informal communication; just use more specific terminology.
So the correlation between what HDR means, or what sentiment it evokes, and people's age group and self-assessed photography skill might be something worthwhile to explore.
The post gets a lot better after that. That said, I really did enjoy the depth: the dive into classic dodge and burn, and the linked YouTube piece. One explainer at a time makes sense, and tone mapping is a good place to start. Even tone mapping is fine in moderation :)
Often we don't get that, and this topic, plus my relative ignorance of it, made me welcome the post as written.
Now I even remember the 2005 HDR HL2 Lost Coast Demo was a thing 20 years ago: https://bit-tech.net/previews/gaming/pc/hl2_hdr_overview/1/
Yeah, I had a full halt and process exception on that line too. I guess all the research, technical papers and standards development work done by SMPTE, Kodak, et al in the 1990s and early 2000s just didn't happen? Turns out Apple invented it all in 2010 (pack up those Oscars and Emmys awarded for technical achievement and send'em back boys!)
It's about HDR from the perspective of still photography, in your app, on iOS, in the context of hand-held mobile devices. The post's title ("What Is HDR, Anyway?"), content level and focus would be appropriate in the context of your company's social media feeds for users of your app - which is probably the audience and context it was written for. However in the much broader context of HN, a highly technical community whose interests in imaging are diverse, the article's content level and narrow focus aren't consistent with the headline title. It seems written at a level appropriate for novice users.
If this post was titled "How does Halide handle HDR, anyway?" or even "How should iOS photo apps handle HDR, anyway?" I'd have no objection about the title's promise not matching the content for the HN audience. When I saw the post's headline I thought "Cool! We really need a good technical deep dive into the mess that is HDR - including tech, specs, standards, formats, content acquisition, distribution and display across content types including stills, video clips and cinematic story-telling and diverse viewing contexts from phones to TVs to cinemas to VR." When I started reading and the article only used photos to illustrate concepts best conveyed with color gradient graphs PLUS photos, I started to feel duped by the title.
(Note: I don't use iOS or your app but the photo comparison of the elderly man near the end of the article confused me. From my perspective (video, cinematography and color grading), the "before" photo looks like a raw capture with flat LUT (or no LUT) applied. Yet the text seemed to imply Halide's feature was 'fixing' some problem with the image. Perhaps I'm misunderstanding since I don't know the tool(s) or workflow but I don't see anything wrong with the original image. It's what you want in a flat capture for later grading.)
I predict HDR content on the web will eventually be disabled or mitigated on popular browsers similarly to how auto-playing audio content is no longer allowed [1]
Spammers and advertisers haven't caught on yet to how abusively attention grabbing eye-searingly bright HDR content can be, but any day now they will and it'll be everywhere.
1. https://hacks.mozilla.org/2019/02/firefox-66-to-block-automa...
High dynamic range has always been about tone mapping. Post-sRGB color profile support is called "wide color" these days, has been available for twenty years or more on all DSLR cameras (such as Nikon ProPhoto RGB supported in-camera on my old D70), and has nothing to do with the dynamic range and tone mapping of the photo. It's convenient that we don't have to use EXR files anymore, though!
An HDR photo in sRGB will have the same defects beyond peak saturation at any given hue point, as an SDR photo in sRGB would, relative to either in DCI-P3 or ProPhoto. Even a two-bit black-or-white “what’s color? on or off pixels only” HyperCard dithered image file can still be HDR or SDR. In OKLCH, the selected luminosity will also impact the available chroma range; at some point you start spending your new post-sRGB peak chroma on luminosity instead; but the exact characteristic of that tradeoff at any given hue point is defined by the color profile algorithm, not by whether the photo is SDR or HDR, and the highest peak saturation possible for each hue is fixed, whatever luminosity it happens to be at.
The photo capture HDR is good. That's a totally different thing and shouldn't have had its name stolen.
About HDR on phones: I think it's a blight on photography. No more shadows and highlights. I find it good for capturing family moments, but not as a creative tool.
I still use it myself but I need to redo the build system and release it with an updated LibRaw... not looking forward to that.
Slide film has probably a third the dynamic range of negative film and is meant as the final output fit for projection to display.
All this aside, HDR and high brightness are different things - HDR is just a representational thing. You can go full send on your SDR monitor as well; you'll just see more banding. The majority of the article is just content marketing about how they perform automatic tonemapping anyway.
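To illustrate the banding trade-off, a toy demo (illustrative numbers only): quantize the same smooth ramp at 8 and 10 bits and compare how many distinct levels survive and how big the steps between adjacent bands are. The bigger step is what shows up on screen as visible banding when a wide range is pushed through a narrow signal.

    import numpy as np

    ramp = np.linspace(0.0, 1.0, 3840)   # smooth gradient, one value per column

    def quantize(x: np.ndarray, bits: int) -> np.ndarray:
        levels = 2 ** bits - 1
        return np.round(x * levels) / levels

    for bits in (8, 10):
        q = quantize(ramp, bits)
        print(bits, "bits ->", np.unique(q).size, "distinct levels,",
              "max step:", float(np.max(np.abs(np.diff(q)))))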
That’s a consequence of https://en.wikipedia.org/wiki/Adaptation_(eye). If you look at 1000 nits on a display in bright sunlight, with your eyes adapted to the bright surroundings, the display would look rather dim.
Literal snort.
This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
I also have a screen which has a huge gamut and blows out colors in a really nice way (a bit like the aftereffects of hallucinogens, it has colors other screens just don't) and you don't have to touch any settings.
My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".
People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.
A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, and if they did, it was one for the whole family.
Normal people aren't magpies who trash their kit every time something shiny comes along.
Who?
There was about a decade there where everyone who had the slightest interest in music had an mp3 player of some kind, at least in the 15-30 age bracket.
1: Well my car would play MP3s burned to CDs in its CD player; not sure if that counts.
I finished high school in 2001 and didn't immediately go to college, so I just didn't have a need for a personal music player anymore. I was nearly always at home or at work, and I drove a car that actually had an MP3 CD player. I felt no need to get an iPod.
In 2009, I started going to college, but then also got my first smartphone, the Motorola Droid, which acted as my portable MP3 player for when I was studying in the library or taking mass transit.
If you were going to school or taking mass transit in the middle of the '00s, then you were probably more likely to have a dedicated MP3 player.
Point of clarification: While the technology behind the VCR was invented in the '50s and matured in the '60s, consumer-grade video tape systems weren't really a thing until Betamax and VHS arrived in 1975 and 1976 respectively.
Early VCRs were also incredibly expensive, with prices ranging from $3,500 to almost $10,000 after adjusting for inflation. Just buying into the VHS ecosystem at the entry level was a similar investment to buying an Apple Vision Pro today.
Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hifi-audio. I don't miss it when I watch the same show on one of my older TVs.
If you ever do get it, I suggest doing for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework; but very useful when you are actively watching TV with your full attention.
Also, in my country (Italy), TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find them in 4K, and if you have the bandwidth for it... sure.
But even there, a well-done 1080p movie is sometimes better than a hyper-compressed 4K one, since you can see the compression artifacts.
To me, 1080p, and maybe even 720p, is enough for TV viewing. Sometimes I miss CRT TVs; they were low resolution but, for example, had much better picture quality than most modern 4K LCD TVs, where black scenes are grey (I know there is OLED, but it's too expensive and has other issues).
Kind of crazy no one thought of this aspect and we just march on to higher resolution and the required hardware for that.
My own movie collection is mostly 2-4GB SDR 1080p files and looks wonderful.
Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:
- At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.
- 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.
- High bitrate. This is way more important than 4k vs 1080p or even HDR. But it’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.
- HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water looks actually wet, parts of the image basically glow… it looks so good.
I 100% miss this HDR when watching equivalent content on other displays. The problem is that a lot of content isn't produced to take advantage of this very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes... so how is the image going to pop? I'm glad we're seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.
On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright. Which means their HDR is basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, it needs to be selective about what gets bright, and cheap TVs don’t have a lot of backlighting zones to make that happen very well.
So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.
I think the industry is strangling itself putting "DisplayHDR 400" certification on edgelit/backlit LCD displays. In order for HDR to look "good" you either need high resolution full array local dimming backlighting (which still isn't perfect), or a panel type that doesn't use any kind of backlighting like OLED.
Viewing HDR content on these cheap LCDs often looks worse than SDR content. You still get the wider color gamut, but the contrast just isn't there. Local dimming often loses all detail in shadows whenever there is something bright on the screen.
I absolutely loathe consuming content on a mobile screen, but the reality is that the vast majority are using phones and tablets most of the time.
The problem starts with sending HDR content to SDR-only devices, or even just other HDR-standards. Not even talking about printing here.
This step can inherently only be automated so much, because it's also a stylistic decision about what information to keep or emphasize. This is an editorial process, not something you want to burden casual users with. What works for some images can't work for others. Even with AI, the preferences would still need to be aligned.
https://docs.krita.org/en/general_concepts/colors/bit_depth....
https://docs.krita.org/en/general_concepts/colors/color_spac...
https://docs.krita.org/en/general_concepts/colors/scene_line...
It didn't take very long to learn, and it turned out to be extremely important in the work I did during the early days at Waymo and later at Motional.
I wanted to pass along this fun video from several years ago that discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's short and fun, I recommend it to all HN readers.
Separately, if you want a more serious introduction to digital photography, I recommend the lectures by Marc Levoy from his Stanford course: https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY... . I believe he runs his own group at Adobe now after leading a successful effort at Google making their pixel cameras the best in the industry for a couple of years. (And then everyone more-or-less caught up, just like with most tech improvements in the history of smartphones).
https://blog.adobe.com/en/publish/2023/10/10/hdr-explained
Greg Benz Photography maintains a list of software here:
https://gregbenzphotography.com/hdr-display-photo-software/
I'm not sure what FOSS options there are; it's difficult to search for given that "HDR" can mean three or four different things in common usage.
This. I can always tell when someone "gets" software development when they either understand (or don't) that computers can't read minds or infer intent like a person can.
My understanding is most SDR TVs and computer screens have displays about 200-300 nits (aka cd/m²). Is that the correct measure of the range of the display? The brightest white is 300 nits brighter than the darkest black?
We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark color. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.
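A tiny sketch of what "re-exposing after the fact" means on a linear, unclipped capture (illustrative only, not any particular raw pipeline): exposure becomes a multiply by 2^EV applied before the output transform, and it only recovers detail that the file actually stored.

    import numpy as np

    def re_expose(linear: np.ndarray, ev: float) -> np.ndarray:
        """Shift the exposure of linear scene data by `ev` stops."""
        return linear * (2.0 ** ev)

    def to_sdr(linear: np.ndarray) -> np.ndarray:
        """Clip to the SDR range and apply a simple 2.2 display gamma."""
        return np.clip(linear, 0.0, 1.0) ** (1.0 / 2.2)

    # With a float/raw 'negative', pulling the image down two stops brings a
    # clipped-looking sky back; with an 8-bit JPEG the values were already
    # truncated at 1.0, so the same operation just produces grey, not detail.
    # sdr = to_sdr(re_expose(raw_linear, ev=-2.0))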
I came here to point this out. You have a pretty high dynamic range in the captured medium, and then you can use the tools you have to darken or lighten portions of the photograph when transferring it to paper.
Notably, the dodging and burning used by photographers aren't obsolete. There's a reason these tools are included in virtually every image-editing program out there. Manipulating dynamic range, particularly in printed images, remains part of the craft of image-making.
All mediums have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color-calibrated and brightness calibrated screen - that wouldn't allow you to have a brightness slider.
HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or just map the range when the content is mastered for 4000 nits but your monitor manages 1000-1500 and only in a small window.
That said, there is one important part that is often lost. One of the ideas behind HDR, sometimes, is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute brightness features, but historically most media display relative color values.
That isn't what the article claims. It says:
"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."
"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.
It is directly addressing capture.
Edit: and btw I am objecting to calling film capture “HDR”, I don’t think that helps define HDR nor reflects accurately on the history of the term.
Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.
Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at coding value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
I’m certain physicists had high range digital cameras before Cineon, and they were working in absolute physical metrics. That would be a stronger example.
You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not moreso. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking about the range of a scene captured is leaving out most of the story, and all of the important parts. We’ve had outdoor photography, high quality films, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could do a bit of superwhite glows and tone mapping on 8-bpc and people called that look HDR.
This 10 bit scanner gave you headroom of like 30% above white. So yeah it qualifies as a type of high dynamic range when compared to 8 bit/channel RGB, but on the other hand, a range of [0 .. 1.3] isn’t exactly in the spirit of what “HDR” stands for. The term implicitly means a lot more than 1.0, not just a little. And again people developing HDR like Greg Ward and Paul Debevec were arguing for absolute units such as luminance, which the Cineon scanner does not do.
> OpenEXR (www.openexr.net), its previously proprietary extended dynamic range image file format, to the open source community
https://web.archive.org/web/20170721234341/http://www.openex...
And "larger dynamic range" by Rea & Jeffrey (1990):
> With γ = 1 there is equal brightness resolution over the entire unsaturated image at the expense of a larger dynamic range within a given image. Finally, the automatic gain control, AGC, was disabled so that the input/output relation would be constant over the full range of scene luminances.
https://doi.org/10.1080/00994480.1990.10747942
I'm not sure when everyone settled on "high" rather than "large" or "extended", but certainly 'adjective dynamic range' is near-universal.
I don't see the confusion here.
https://news.ycombinator.com/item?id=43987923
That said, the entire reason that tonemapping is a thing, and the primary focus of the tonemapping literature, is to solve the problem of squeezing images with very wide ranges into narrow display ranges like print and non-HDR displays, and to achieve a natural look that mirrors human perception of wide ranges. Tonemapping might be technically independent of HDR, but they did co-evolve, and that’s part of the history.
https://www.kimhildebrand.com/how-to-use-the-zone-system/
where my interpretation is colored by the experience of making high-quality prints and viewing them under different conditions, particularly poor illumination, though you could also count "small handheld game console" or "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones, so even if an image has finer detail it ought to connect well with people if its colors were quantized. (I think about concept art from Pokémon Sun and Moon, which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)
In my mind, the ideal image would look good quantized to 11 zones but also has interesting detail in texture in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning and similar techniques to make it so.
Reminded me of the classic "HDR in games vs HDR in photography" comparison[0]
[0] https://www.realtimerendering.com/blog/thought-for-the-day/
Creative power is still in your hands versus some tone mapper's guesses at your intent.
Can people go overboard? Sure, but thats something they will do regardless of any hdr or lack thereof.
As an aside, it's still rough that just about every site that touches gain-map HDR images (adaptive HDR, as this blog calls them) will lose that metadata if it needs to scale, recompress, or otherwise transform the images. It's led me to just make my own site, and also to be a bit smarter about which files a client gets. For instance, if a browser doesn't support .jxl or .avif images, I'm sure it won't want an HDR JPEG either; that's easy to handle on a webserver.
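For anyone who hasn't looked at how gain-map files work, here's a deliberately simplified sketch of the reconstruction step (real Ultra HDR / Adaptive HDR gain maps carry extra metadata such as min/max log gain and offsets, which this ignores): the HDR rendition is the SDR base image multiplied by the gain map, scaled by however much headroom the viewing display actually has. Strip the gain-map metadata during a recompress and all you can ever get back is the base image.

    import numpy as np

    def apply_gain_map(sdr_linear: np.ndarray,
                       log2_gain: np.ndarray,
                       display_headroom_stops: float,
                       content_max_stops: float = 4.0) -> np.ndarray:
        """Reconstruct an HDR rendition from an SDR base plus a gain map.

        `log2_gain` is the per-pixel gain map in stops (log2 of HDR/SDR).
        The weight scales the effect to the display: 0 on a pure SDR screen
        (you just see the base image), 1 when the display has at least as
        much headroom as the content was mastered for.
        """
        weight = np.clip(display_headroom_stops / content_max_stops, 0.0, 1.0)
        return sdr_linear * np.exp2(log2_gain * weight)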
I love when product announcements, and ads in general, are high-value works. This one was good education for me. Thank you for it!
I had also written about my plasma and CRT displays and how misunderstandings about HDR made things generally worse and how I probably have not seen the best these 10 bit capable displays can do.
And finally, I had written about 3D TV and how fast, at least 60Hz per eye, 3D in my home made for amazing modeling and assembly experiences! I was very sad to see that tech dead end.
3D for technical content creation has a lot of legs... if only more people could see it running great...
Thanks again. I appreciate the education.
Hopefully HN allows me to share an App Store link... this app works best on Pro iPhones, which support ProRAW, although I do some clever stuff on non-Pro iPhones to get a more natural look.
Not having before-and-after comparisons is mostly down to my being concerned about whether that would pass App Review; the guidelines indicate that the App Store images are supposed to be screenshots of the app, and I'm already pushing that rule with the example images for filters. I'm not sure a hubristic "here's how much better my photos are than Apple's" image would go over well. Maybe in my next update? I should at least have some comparisons on my website, but I've been bad at keeping that updated.
There's no Live Photo support, though I've been thinking about it. The reason is that my current iPhone 14 Pro Max does not support Live Photos while shooting in 48-megapixel mode; the capture process takes too long. I'd have to come up with a compromise such as only having video up to the moment of capture. That doesn't prevent me from implementing it for other iPhones/cameras/resolutions, but I don't like having features unevenly available.
I really appreciate the article. I could feel that they also have a product to present, because of the many references, but it was also very informative besides that.
I wonder if there’s an issue in Windows tonemapping or HDR->SDR pipeline, because perceptually the HDR image is really off.
It’s more off than if I took an SDR picture of my iPhone showing the HDR image and showed that SDR picture on the said Windows machine with an IPS panel. Which tells me that the manual HDR->SDR “pipeline” I just described is better.
I think Windows showing HDR content on a non-HDR display should just pick an SDR-sized section of that long dynamic range and show it normally. Without trying to remap the entire large range to a smaller one. Or it should do some other perceptual improvements.
Then again, I know professionally that Windows HDR is complicated and hard to tame. So I’m not really sure the context of remapping as they do, maybe it’s the only way in some contingency/rare scenario.
Sidebar: I kinda miss when Halide's driving purpose was rapid launch and simplicity. I would almost prefer a zoom function to all of this HDR gymnastics (though, to be clear, Halide is my most-used and most-liked camera app).
EDIT: Ah, I see, it's a Mark III feature. That is not REMOTELY clear in the (very long) post.
My hypotheses are the following:
- Increase display lighting to increase peak white point + use a black ink able to absorb more light (can Vantablack-style pigments be made into ink?) => increase dynamic range of a printable picture
- Alternatively, have the display lighting include visible light + invisible UV light, and have the printed picture include an invisible layer of UV ink that shines white : the pattern printed in invisible UV-ink would be the "gain map" to increase the peak brightness past incident visible light into HDR range.
What do you folks think?
The hardest part of it, by far, was taking hundreds upon hundreds of pictures of a blank piece of paper in different lighting conditions with different settings.
4ad•6h ago
From a technical point of view, HDR is just a set of standards and formats for encoding absolute-luminance scene-referred images and video, along with a set of standards for reproduction.
cornstalks•6h ago
And no, it's not necessarily absolute luminance. PQ is absolute, HLG is not.
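For anyone who wants to see what "PQ is absolute" means, here is the SMPTE ST 2084 encode written out as a plain function (a sketch; real implementations also handle 10-bit quantization and per-channel signal details): the input is absolute luminance in nits, the output is the 0.0-1.0 signal.

    def pq_encode(nits: float) -> float:
        """SMPTE ST 2084 inverse EOTF: absolute luminance (cd/m^2) -> signal in [0, 1]."""
        m1 = 2610 / 16384          # 0.1593017578125
        m2 = 2523 / 4096 * 128     # 78.84375
        c1 = 3424 / 4096           # 0.8359375
        c2 = 2413 / 4096 * 32      # 18.8515625
        c3 = 2392 / 4096 * 32      # 18.6875
        y = max(nits, 0.0) / 10000.0
        return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

    # Diffuse white at 100 nits lands around signal 0.51; the 10,000 nit
    # ceiling maps to 1.0. Quantizing the result to 10 bits is a separate,
    # later step -- the curve itself is defined on real numbers.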
skhameneh•6h ago
Also DCI-P3 should fit in here somewhere, as it seems to be the most standardized color space for HDR. I would share more insight, if I had it. I thought I understood color profiles well, but I have encountered some challenges when trying to display in one, edit in another, and print “correctly”. And every device seems to treat color profiles a little bit differently.
kllrnohj•5h ago
All transfer functions can generally work on either integer range or floating point. They basically just describe a curve shape, and you can have that curve be over the range of 0.0-1.0 just as easily as you can over 0-255 or 0-1023.
Extended sRGB is about the only thing that basically requires floating point, as it specifically describes 0.0-1.0 as being equivalent to sRGB and then has a valid range larger than that (you end up with something like -.8 to 2.4 or greater). And representing that in integer domain is conceptually possible but practically not really.
> Also DCI-P3 should fit in here somewhere, as it seems to be the most standardized color space for HDR.
BT2020 is the most standardized color space for HDR. DCI-P3 is the most common color gamut of HDR displays that you can actually afford, however, but that's a smaller gamut than what most HDR profiles expect (HDR10, HDR10+, and "professional" DolbyVision are all BT2020 - a wider gamut than P3). Which also means most HDR content specifies a color gamut it doesn't actually benefit from having as all that HDR content is still authored to only use somewhere between the sRGB and DCI-P3 gamut since that's all anyone who views it will actually have.
cornstalks•4h ago
The math uses real numbers but table 2-4 ("Digital representation") discusses how the signal is quantized to/from analog and digital. The signal is quantized to integers.
This same quantization process is done for sRGB, BT.709, BT.2020, etc. so it's not unique to HLG. It's just how digital images/video are stored.
dahart•3h ago
https://www.graphics.cornell.edu/~bjw/rgbe.html
It uses a type of floating point, in a way, but it's a shared 8-bit exponent across all 3 channels, and the channels are still 8 bits each, so the whole thing fits in 32 bits. Even the .txt file description says it's not "floating point" per se, since that implies IEEE single-precision floats. (The packing is small enough to sketch below.)
Cameras and displays don't typically use floats, and even CG people working in HDR and using, e.g., OpenEXR, might use half floats more often than full floats.
Some standards do exist, and it’s improving over time, but the ideas and execution of HDR in various ways preceded any standards, so I think it’s not helpful to define HDR as a set of standards. From my perspective working in CG, HDR began as a way to break away from 8 bits per channel RGB, and it included improving both color range and color resolution, and started the discussion of using physical metrics as opposed to relative [0..1] ranges.
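Since the shared-exponent trick came up above, here is roughly what Ward's float-to-RGBE packing looks like transcribed into Python (a sketch for illustration; the real Radiance code also handles run-length-encoded scanlines):

    import math

    def float_to_rgbe(r: float, g: float, b: float) -> tuple[int, int, int, int]:
        """Pack linear RGB into Radiance's 32-bit RGBE format:
        three 8-bit mantissas sharing one 8-bit exponent."""
        v = max(r, g, b)
        if v < 1e-32:
            return (0, 0, 0, 0)
        mantissa, exponent = math.frexp(v)   # v == mantissa * 2**exponent, mantissa in [0.5, 1)
        scale = mantissa * 256.0 / v          # == 256 / 2**exponent
        return (int(r * scale), int(g * scale), int(b * scale), exponent + 128)

    def rgbe_to_float(r8: int, g8: int, b8: int, e8: int) -> tuple[float, float, float]:
        """Reverse the packing: mantissas times 2**(exponent - 136)."""
        if e8 == 0:
            return (0.0, 0.0, 0.0)
        f = math.ldexp(1.0, e8 - (128 + 8))
        return (r8 * f, g8 * f, b8 * f)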