This is one of the stupid things with many monitors: showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.
Games are just truly awful about this, making scenes completely unviewable even when the HDR areas, the blacks and whites, have interactive elements in them that you need to see and know about.
Not that many games on the console that take advantage of it, mind you. More testing needed.
Interestingly my laptop's display reaches 500 nits and that is already painfully high outside of midday hours. My phone goes to 875 and I find that only to be useful outside in the summer sun.
I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation.
Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
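For what it's worth, here's a minimal sketch of how software can read that advertised value on Windows via DXGI 1.6. The enumeration structure is illustrative and error handling is stripped down, but the DXGI_OUTPUT_DESC1 luminance fields are the real ones the display reports:

```cpp
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a DXGI factory and walk every adapter/output pair.
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            // IDXGIOutput6::GetDesc1 exposes the luminance values the display advertises.
            ComPtr<IDXGIOutput6> output6;
            if (SUCCEEDED(output.As(&output6))) {
                DXGI_OUTPUT_DESC1 desc{};
                if (SUCCEEDED(output6->GetDesc1(&desc))) {
                    std::printf("adapter %u output %u: min %.3f nits, peak %.1f nits, full-frame %.1f nits\n",
                                a, o, desc.MinLuminance, desc.MaxLuminance, desc.MaxFullFrameLuminance);
                }
            }
        }
    }
    return 0;
}
```

A renderer that respects these numbers can scale its output instead of blindly targeting 1000 nits on a 250-nit panel.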
My monitor does do that, but alas the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier by being one less thing I'll be missing.
Calibrate the display with HDR enabled for a better SDR response.
A while back, I tried an OLED gaming monitor that was widely reviewed as being very good. While it was somewhat better than the VA monitor that I've been using for years, it was nowhere near 1,500 USD good. I could see someone coming from an IPS or TN screen being very impressed with it, though.
It depends on the monitor and the colors involved in the transition. My VA monitor (a BenQ EW3270U) has limited-but-noticeable smearing between certain dark colors. Blacks and dark colors against mid-brightness and brighter are just fine. [0] It's my understanding that this monitor has quite-a-bit-less-bad color smearing than most VA panels, and has roughly the same -er- amount of slow transitions (just with a different set of colors) as my Asus PA246 IPS monitor.
I play a variety of video games, so I see both muddily-dark and high-contrast areas. I'm fairly pleased with the performance of the panel they dropped into this monitor. Honestly, the off-axis color and contrast shifting is way more noticeable than the color smearing... and folks who sit down in front of this monitor don't tend to notice those shifts.
(Plus, if you play 3D video games released within the last five years, crap like temporal anti-aliasing and its bastard children add so much smearing and so many rendering artifacts that it becomes quite challenging to determine which visual artifacts are actual pixels commanded to be on the screen by the renderer, and which might come from too-slow flipping of the pixels in the screen. Is this a good state of affairs? Definitely not. But it's the one we find ourselves in.)
[0] It's entirely unlike the OLED screen in the Nexus 5a which has incredible smearing between black and a huge array of dark-to-medium-brightness colors. This smearing reduces as you increase the brightness of the screen, but doesn't go away entirely until you get to like the top quarter of the screen's brightness. (If you have one of these phones, drop your screen brightness to the bottom quarter and browse through back issues of the Gunnerkrigg Court webcomic. There are PLENTY of black-on-X color combinations to get a really obvious demonstration of the problem.)
Not sure if I'm missing a setting, but I end up having to manually turn HDR on before playing a game and off after.
After a while people turn it back down to like a 4 and it improves things.
All I see is opinions though. And the internet is full of them. You just have to Google "why does this game look so ...". At least if the author had compared the search stats of "good/bad/beautiful/washed out" it would've carried some weight.
The GTA 5 screenshot is a terrible example. It looks like a cheap, dead video game environment, reminding me how far we've come.
And we need some examples of good, cinematic, artful tone mapping, like any scene of a Hollywood movie set in Mexico...
And I agree that it would be nice to have some positive examples. I think there were a bunch of SNES games which did it well, but that may just be nostalgia.
I remember it looked beautiful. Especially compared to the early 3D games of that era.
That's not tone mapping, but color grading
- The first steps in Limveld
- Liurnia of the Lakes (from Stormveil)
- Leyndell
- The first look at the Scadutree
- Cerulean Coast
- Stone Coffin Fissure
- Enir Ilim
I can't remember another property with a similar diversity of incredibly beautiful and imposing areas.
Author is fumbling the difference between aesthetics and realism. Videogames feeling videogamey? What a travesty.
There do seem to be plenty of issues around HDR for sure, in some games I had to intentionally disable HDR on my PS5 because it just looked bad on my setup.
One can claim HZD's look is an "artistic choice" and that's inarguable, but the author believes it's simply not enough attention to the tone mapping process, which is a very complicated topic that's not usually taken seriously in game dev compared to film production.
I have yet to get any benefit out of it.
I disable it everywhere I can. On Instagram, for example. When it is turned on (the default), every now and then I get some crazy glaring image in my feed that hurts.
Maybe it is because I don't play games? Is HDR useful anywhere outside of games?
Are you using an Apple machine to do your browsing? I have heard that Apple has (for some damn reason) decided to do this sort of crap with HDR-pictures-in-an-otherwise-SDR-document. It's nuts. This doesn't happen to me on Windows, and -because I use xorg- I've no idea what happens on Linux.
And it indirectly taught me how to use the exposure feature in my iPhone camera (when you tap a point in the picture). It's so that you choose the "middle gray" point of the picture for the tone mapping process, using your eyes, which have a much greater dynamic range than a CCD sensor. TIL.
No, it uses that to set the physical exposure via the shutter speed and ISO (iPhones have a fixed aperture, so that cannot be changed). It literally says this in the video you linked. This is not tone mapping. Tone mapping in a way may also happen afterwards to convert from the wider dynamic range of the sensor if the output format has a more limited dynamic range.
There's a stark contrast here with MS Flight Simulator which looks great but maybe a bit too pretty. It's certainly very pleasing to look at but not necessarily realistic.
One thing with flying is that visibility isn't necessarily that good and a big part of using flight simulators professionally is actually learning to fly when the visibility is absolutely terrible. What's the relevance of scenery if visibility is at the legal minimums? You see the ground shortly before you land, a few feet in front of you.
And even under better conditions, things are hazy and flat (both in color and depth). A crisp, high contrast, saturated view is pretty but not what a pilot deals with. A real problem for pilots is actually spotting where the airport is. Which is surprisingly hard even when the weather is nice and sunny.
An interesting HDR challenge with cockpits is that the light levels inside and outside are miles apart. When flying in the real world, your eyes compensate for this as you focus on the instruments or look outside. So technically, any screenshot that features a bright outside and clearly legible instruments at the same time is not very realistic, but it's also kind of necessary; you need to do some HDR trickery to make that work. Poor readability of instruments is something X-Plane addressed in one of their recent updates. It was technically correct but not that readable.
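The usual "HDR trickery" here is some form of eye adaptation: the renderer's exposure slowly chases the average scene luminance, much like your pupils do. A rough sketch of the idea (my own toy names and constants, not how X-Plane actually implements it):

```cpp
#include <algorithm>
#include <cmath>

struct Exposure {
    float adaptedLum = 0.18f;  // current adaptation level, arbitrary linear scale

    // avgSceneLum: average luminance of the current frame; dt: seconds since last frame.
    // Returns a scale factor that maps the adapted level to "middle gray" (0.18).
    float update(float avgSceneLum, float dt, float speed = 1.5f) {
        // Exponentially chase the new average, mimicking the eye's adaptation lag:
        // look down at the panel and exposure ramps up, look outside and it ramps down.
        adaptedLum += (avgSceneLum - adaptedLum) * (1.0f - std::exp(-dt * speed));
        adaptedLum = std::clamp(adaptedLum, 0.01f, 10000.0f);
        return 0.18f / adaptedLum;
    }
};
```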
X-plane rendering has made some big improvements with all this during the v12 release over the last three years.
> In the real world, the total contrast ratio between the brightest highlights and darkest shadows during a sunny day is on the order of 1,000,000:1.
And this is of course silly. In the real world you can have complete darkness, at which point dynamic range shoots up to infinity.
> A typical screen can show 8 (curved to 600:1 or so).
I'm not entirely sure about this either; monitors have been advertising 1000:1 and 2000:1 contrast ratios since forever, even back in 2017 when this article was written, but maybe I just never looked too deeply into it.
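For reference, "stops" and contrast ratio are just a log2 apart, which puts the article's numbers in perspective (a quick check, in C++ for concreteness):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // stops = log2(contrast ratio)
    const double ratios[] = {1'000'000.0, 2000.0, 1000.0, 600.0};
    for (double r : ratios)
        std::printf("%10.0f:1  ->  %.1f stops\n", r, std::log2(r));
    return 0;
}
```

So a 1000:1 or 2000:1 panel is roughly 10-11 stops, still well short of the ~20 stops of a sunny outdoor scene the article describes.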
The points are: game graphics are indeed suffering, but the problem is not that they are unlike films and photos; it's the opposite. Games should stop using tone mapping curves produced by the film industry and instead create their own, making a clean break.
Personally, I agree with the video.
The RTS switch to 3D was a mistake and I think RTSes will continue to fail until their developers realize what actually makes them fun is actively hindered by this technology.
I suspect part of the challenge with making a hit game with last-gen graphics (like Breath of the Wild) is that you need actual artists to make it look good.
Their hardware is underpowered, games look like cheap cartoons, but the effort spent into gameplay more than compensates.
Nintendo games don't look like cheap cartoons at all. They are absolutely not photorealistic, but they put a lot of work into the aesthetics/art, and most of the time it's really impressive once you take the hardware limitations into account.
Mario 64 ran on the same console that was known for its 3D blur.
Mario Galaxy 1&2 (which are still totally modern in terms of aesthetics) ran on what was basically an overclocked gamecube.
Mario Kart 8 which is still more beautiful than a lot of modern games ran on the Switch, which is itself based on a 2015 mid-range smartphone hardware.
I like my indie games, but not many are putting out what Nintendo is.
I mean it’s all subjective though.
I considered, and passed on, the other consoles.
Nintendo is playing a different game than other console/game makers (excuse the pun), IMHO.
IMO it leads to really stilted experiences, like where you now have some photorealistic person with their foot hovering slightly in space, or all that but you still see leaves clipping through each other, or the uncanny valley of a super realistic human whose eyes have a robotic lock on your face, etc.
Physical interaction with game worlds (wasd and a single pivot, or maybe a joystick and a couple buttons) hasn't increased in depth in 20 years which only emphasizes the disjointedness.
I am personally not content with that and I explore all I can, and am trying to make games that skirt the trends a little bit.
But that stark contrast between visual fidelity but a lack of interactivity has been a pet peeve of mine for a while. You can even do so much more with just mouse and keyboard interactions, but I think it's overshadowed by the much lower risk visual fidelity goals.
The most amazing gaming experience I've ever had was walking around the city at night in Cyberpunk 2077. For the first time in my life, I felt I was actually in the future. Zelda can't pull that off with me, despite being a great game from other perspectives.
This is one reason, I believe, why some people can't stand animated cartoons. I like them but I know many people who won't even consider watching animation.
If we define immersion as "your vision focuses on what's inside the screen and you ignore the world around the screen, and you mostly ignore that your control of the player character is through a keyboard and mouse", then I've experienced immersion with every first person game ever, including Minecraft. I never considered that some people might need photorealism for that at all. There was another commenter that mentioned being unable to walk over a short wall due to character controller limitations as being immersion-breaking. I agree this is annoying but the qualia of it is more like a physical confusion rather than being something that actually breaks my experience of the game.
I'm also thinking this might be related to why I find VR to be, while very cool, not some revolutionary new technology that will fundamentally change the world.
VR despite its limitations is the one thing I’ve ever achieved “presence” in, as in feeling if for a brief moment, I was actually there.
Elite dangerous, OLED Unit, HOTAS. For a brief moment in time my brain believed it was in the cockpit of a spaceship.
Most relevant to this comment thread, however, was the fact that the graphics were very crude, and not in a good way. I absolutely dispute the claim that realism equals immersion/presence (I'm not getting involved in the debate about the distinction between the two).
I’d agree that a certain degree of graphics helps with immersion, but photorealistic graphics only offers cheap immersion which turns off the immersion centre in the brain — OK, this is just my babble, so 100% guess.
(edited to clarify that I'm not laboring under the misapprehension that Cyberpunk 2077 isn't a AAA game)
Was it this one? https://www.gdcvault.com/play/1015464/Attention-Not-Immersio...
It's clearly going for photo realism, but it somehow looks worse to me than older, lower-fidelity games.
It really feels like they put so much work into how everything looks in the primary and secondary stories.
I can agree, though, that just "jobbing" it makes it look more like a run-of-the-mill shooter.
I then played it again, on the same monitor, last year, and I was pleased with the gameplay, but again, I didn't find anything that remarkable about the overall graphics. The fidelity was great, especially at distance, due to 4K.
I'm 50 hours deep literally as I type this (about to launch the game), and this time, this time it is completely different. I have an LG 2K HDR screen with "Smart HDR" and I finally - finally - get it. Your eyes have to adjust, just like in real life, going from dark indoors to bright outdoors. You can see the tail-lights and headlights of NPCs driving around in the mountains. Lasers sweeping you are menacing.
Even Fallout 4, which was the first game I played in 4K 10 years ago, looks easily 10 times better in HDR. And I only have the "vanilla+" mod set, 5GB of mods, not the 105GB modset.
I coined a phrase 4 or 5 years ago, that HDR stood for: Hot Damn, Reds! And really, reds are still my least favorite part, they burn too deeply. But from watching several movies on an HDR 4K TV and being really unimpressed, to just these two games, my entire viewpoint has drastically changed.
I didn't know you could put arbitrary people into photo mode in CP2077, and also pose them and move them around, so I was just entering photo mode as best I could, adjusting the lighting and fiddling with the curves; however, these all took over 4 seconds to "render" to the final image, which I found interesting: https://imgur.com/a/DTesuhF
This is a blanket statement I would disagree with.
> Many people, like me, feel like they are inside the game world, rather than playing a game with a TV/monitor in front of them
I can't disagree with a statement about personal preference.
So which is it?
There are exceptions, but the general public will almost always prefer a photo-realistic renaissance painting to a Picasso portrait, a lavish period piece like Titanic to an experimental set design like Dogville.
Did we play the same game? Some of the best lore-building and environmental theming around, paired with some cool mechanics?
Sure, the combat got repetitive but this was hardly something to "just sell GPUs"
Crysis' system requirements at launch were so far above what most people had that I'll give you that. Control wasn't that way at all.
> Some games are sold just so the end user can enjoy exercising their new GPU and monitor.
Being used “for benchmarking” and “being sold just” for that purpose are two very different things.
It was a good looking game at the time, but remember it originally came out on PS4/Xbox One and that version did NOT have raytracing.
The lore was annoying to listen to; whenever I wanted to listen to an audio log, I had to stop playing the game and watch the exact same video of a man smoking and being mysterious.
The cool game mechanics were basically just the gravity gun from Half Life 2, which came out over 20 years ago.
It did have some cool environmental set pieces, but overall I just found the game too pretentious for something that was basically a rip off of the SCP wiki.
It's something I really noticed when playing Disaster Report 4, where the people look amazingly realistic but some restrictions are clearly just 'developers didn't make this bit walkable'.
While also expecting you to go around searching for hidden goodies and secret paths.
I swear, the invisible walls are the only thing pushing it to a 9/10 from a 10/10 for me.
Cars are also easier to make photorealistic. Less uncanny valley effect, lots of flat shiny surfaces.
What absolutely breaks immersion for me in most AAA car games is the absolute lack of crash, scratch, and dirt mechanics. Cars racing around the track for 2 hours don’t look like showroom pieces! Make ‘em dirty darn it. And when I crash into a wall …
I’m really excited to try Wreckfest 2 when I get around to it. Arcade-ish driving, not super photorealistic, they put it all on realistic soft body collision physics instead.
And granted this was an amateur race day, just weekenders having a good time, but it makes sense when you think about it: if the body panels aren't like falling off and are just a bit beat up... why replace them? Especially on some of these cars (late model Corvettes and Mustangs) they don't come cheap at all, and they'll require refinishing and you have to do your livery over again too.
Like a hockey player doesn't buy a new helmet every time they get hit, they/the team would be broke before the season was out.
And that's why I always think ladies who wear just enough clothes are way more sexy than nude ladies.
Hopefully this doesn't offend anyone.
Because WOW factor sells, especially if it's a new IP. You can see most trailers full of comments like "this looks bad".
This is also true for non-photorealistic 3D games. They benefit from high-tech effects like outline shaders, sharp shadows, anti-aliasing and LoD blending - but all of that tech is improving over time, so older efforts don't look quite right any more, and today's efforts won't look quite right in 2045.
When a game developer decides to step off this treadmill, they usually make a retro game. I'd like to see more deliberately low-tech games which aren't retro games. If modern players think your game looks good on downlevel hardware, then it will continue to look good as hardware continues to improve - I think this is one reason why Nintendo games have so much staying power.
This has been the norm in 2D game development for ages, but it's much more difficult in 3D. For example, if the player is ever allowed to step outdoors, you'll struggle to meet modern expectations for draw distance and pop-in - and even if your game manages to have cutting-edge draw distance for 2025, who can say whether future players will still find it convincing? The solution is to only put things in the camera frustum when you know you can draw them with full fidelity; everything in the game needs to look as good as it's ever going to look.
I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current gen hardware to the maximum, in every way, because they also targeted last-gen consoles. I'm thinking in particular of the PS5's incredibly fast IO engine with specialized decompression hardware. In a game like Ratchet & Clank: Rift Apart, that hardware is used to jump you through multiple worlds incredibly quickly, loading a miraculous amount of assets. In Cyberpunk, you still have to wait around in elevators, which seem like diegetic loading screens.
And also the general clunkiness of the animations, the way there's only like two or three body shapes that everyone conforms to - these things would go farther in creating a living/breathing world, in the visual realm.
In other realms, the way you can't talk to everyone or go into every building is a bit of a bummer.
For FPS, HL2/Doom3 is probably the last generation that enjoys a huge modding community. Anything above it pushes ordinary modders away. I believe it is still quite possible to make mods for say UE4, but it just took such a long time that the projects never got finished.
In a certain way, I very much wish graphics had frozen in the year 2005.
I personally like Cyberpunk 2077's style; it looks great maxed out with HDR. Yes, the models aren't the best, but the overall look/vibe is spectacular at times.
Cyberpunk has vanishingly few elevators. While it may be a loading hide in some spots, it's certainly not indicative of the game which otherwise has ~zero loading screens as you free roam the city including going in & out of highly detailed buildings and environments.
> I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current gen hardware to the maximum
I'm not sure how you can reach this conclusion to be honest. Cyberpunk 2077 continues to be the poster child of cutting edge effects - there's a reason Nvidia is constantly using it for every new rendering tech they come out with.
Unfortunately, this also meant that Firefox gave eyestrain headaches to every design professional in the world, because our pro color displays had so much more eye-stabbing color and brightness capability than everyone else’s. It sucked, we looked up the hidden preference that could have been flipped to render color correctly at any time, and it was tolerable.
Then Apple standardized DCI-P3 displays on their phones and tablets, where WebKit did the right thing — and on laptops and desktops, where Firefox did not. Safari wasn’t yet good enough back then to earn conversions, though certainly it is now, and when people tried to switch from Firefox the colors looked washed out and bland next to that native display punch. So everyone thought that Apple’s displays were too bright whenever they surfed the web and suffered through a bad LUT experience — literally, Firefox was jamming 100% phosphor brightness into monitors well in excess of sRGB’s specified luminosity range — by dimming their displays and complaining about Apple.
And one day, Chrome showed up; faster, lighter, and most critically, not migraine inducing. The first two advantages drew people in; the third made them feel better physically.
Designers, professionals, everyone who already had wide color monitors, and then also students, would have eventually discovered (perhaps without ever realizing it!) that with Chrome (and with Safari, if they’d put up with it), they didn’t have to dim their monitors, because color wasn’t forcibly oversaturated on phosphors that could, at minimum, emit 50% higher nits than the old sRGB-era displays. The web didn’t cause eye strain and headaches anymore.
Firefox must have lost an entire generation of students in a year flat — along with the everyone in web design, photography, and marketing that could possibly switch. Sure, Chrome was slightly better at the time; but once people got used to normal sRGB colors again, they couldn’t switch back to Firefox without everything being garish and bright, and so if they wished to leave Chrome they’d exit to Safari or Opera instead.
I assume that the only reason Firefox finally fixed this was that CSS forcibly engraved into the color v3 specification a few years ago that, unless otherwise hinted, #ff0000 is in the sRGB color space and must be rendered as such. Which would have left them no room to argue; and so Firefox finally, far too late to regain its lost web designer proponents, switched the default.
As the article describes, Nintendo understands this lesson fully, and chose to ship Zelda with artistic color that renders beautifully assuming any crap TV display, rather than going for the contrast- and saturation-maximizing overtones of the paired combination of brighter- and more-saturated- than sRGB that TV manufacturers call HDR. One need only look to a Best Buy TV wall to understand: every TV is blowing out the maximum saturation and brightness possible, all peacocks with their plumage flashing as brightly as possible, in the hopes of attracting another purchase. Nintendo’s behaviors suck in a lot of ways, but their artistic output understands perfectly how to be beautiful and compelling without resorting to the Firefox approach.
(Incidentally, this is also why any site using #rrggbb looks last-century when embedded in, or shown next to, one designed using CSS color(..) clauses. It isn’t anything obvious, but once you know how to see it, it’s like the difference between 18-bit 256color ANSI and 24-bit truecolor ANSI. They’re not RGB hex codes; they’re sRGB hex codes.)
"Do these playworlds really need to be that photorealistic, I wonder? I actually consider it more of a minus if the graphics are too realistic."
It might be a generational thing, too; I was born in the late 80s, and my formative years were spent playing cartoonish games like Commander Keen, Command & Conquer, etc.
Decades ago, when I shot film, I remember discovering that I really liked how photos looked when underexposed by half a stop or so. I never knew why (and I wasn’t developing my own film, so I’ve no idea what the processor may have been doing), but I wonder if this was a contributing factor.
Look at movies that go all in on realism, can't see anything, can't hear anything. That's terrible.
https://www.slashfilm.com/673162/heres-why-movie-dialogue-ha...
There are many, many things artists need to do correctly, and many of them have no idea of the whole pipeline. Let's say someone creates a scene with a tree in it. What is the correct brightness, saturation and gamma of that tree's texture? And if that isn't correct, how could the lighting artist correctly set the light? And if the texture and the light are wrong, the correct tone map will look like shit.
My experience is that you need to do everything right for a good tonemap to look realistic, and that means working like a scientist and having an idea of the underlying physical formulae and the way they have been implemented digitally. And that is sadly something not many productions appear to pull off. But if you do pull it off, everything pops into place.
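A small concrete example of what "implemented digitally" means in practice: the sRGB transfer function has to be undone before any lighting math and re-applied (or tone mapped) afterwards. If a texture or a light skips that step, no tone map can save the result. A minimal sketch using the standard formulas:

```cpp
#include <cmath>

// Standard sRGB EOTF, per channel, input in [0,1]: decode an authored texel to linear light.
float srgbToLinear(float c) {
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// Inverse transform: encode a linear value back for display after tone mapping.
float linearToSrgb(float l) {
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}
```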
The added complication with games is of course that you can't just optimize the light for one money shot; it needs to look good from all directions. And that makes it hard to look as good as a film shot, because optimizing for one view risks making it look like crap from other directions, which studios aren't willing to accept.
The dragon in The Hobbit isn't just about the tonemapping, it is at least as much (if not more so) a lighting issue. But the two can influence each other in a bad way.
If it is an attempt at realism, reality is not constantly shiny and wet.
If it is a subjective artistic choice, it is objectively wrong and ugly.
Is there an expectation that everything look shiny and wet to make it seem more "dynamic"?
Is it an artists' meme, like the Wilhelm Scream in cinematic sound design?
There is a secondary problem in big budget games where modeling work gets farmed out leading to selection for "what looks good in the preview pic." In the preview pic, the asset artist gets to choose background/scene/lighting, and it's an easy trick to choose them to make the specular highlights pop. The person doing integration buys the asset, drops it in wildly different background/scene/lighting, and now the specular highlights are overcooked because the final scene wasn't chosen for the specific purpose of leveraging specular highlights.
tl;dr artists ship the org chart too
Thinking back - films also are always doing some new exciting thing all at once. That wild colored lighting aesthetic of the past decade comes to mind. That's a result of refined color correction software and awesome low-cost LED lights. Or drone shots. So many drone shots.
It's usually a group-think phenomenon where everyone was previously unable to do something, and now they can, and everyone wants to try it. And then there are successes and management points at those and yells 'we want that, do that!', and distribution follows, and it becomes mandatory. Until everyone is rolling their eyes and excited about another new thing.
It's a silly phenomenon when you think about it - any true artist-director would likely push back on that with a coherent vision.
"Mann sprayed down the city’s nocturnal streets with tens of thousands of gallons of water, so that they took on an unreal, painterly glow." - New York Times
No other image here comes anywhere even close, definitely not Zelda nor GTA5.
Personally I think the whole problem with the first 5 images is that they don’t have enough contrast, and they have too much detail. The color handling isn’t the only reason they don’t look realistic, but making sure every single pixel’s nicely exposed and that nothing gets too dark or too bright lets all the CG fakeness show through. One of the reasons the RE7 image looks better is that you can’t clearly see every single thing in the image.
If you take photographs outside and the sun is in the shot, you will absolutely get some blown out white and some foreground blacks, and that’s realism. The CG here is trying too hard to squeeze all the color into the visible range. To my eyes, it’s too flat and too low contrast, not too high contrast.
For big budget games the solution for this is typically to have brightness calibration when the game first boots up, but the game itself still needs to be designed adaptively so that it's not Too Dark or Too Bright at critical points, otherwise the playability of the title is jeopardized. This runs counter to a goal of photorealism.
https://safebooru.org/index.php?page=post&s=view&id=1821741
and found they did really well because the art was designed to look good on bad screens and in poor viewing conditions. I think of it in terms of Ansel Adams' Zone System, in that the ideal image is (1) legible if you quantize it to 11 tones of grey (looks OK printed in the newspaper), but (2) has meaningful detail in most or all of those zones.
I'm kinda disappointed that the Nintendo 3DS version didn't use the stereo effects, but they would have had to decide whether her hair forms a sheet or a cone.
The Zelda screenshot he uses as an example of how good things look without HDR looks terrible to me. It is all washed out with brightness and bloom, and all the shadows in the landscape that in reality would almost be black are very light grey.
Plus, it's not even a horror game. Come on, you are a shooter game. How does a shooter game where you can't see anything even make sense?
I’m not necessarily arguing games should imitate cameras, I really only think over-compressing the dynamic range is bad, and I don’t understand why the author is arguing for that.
Do you have a new technique to decode eye-brain perception in terms of how we perceive visual signals? Do you have a paper indicating how you make this claim for everyone?
And you completely miss what I'm asking too.
Chemical reactions in the rods and cones are only a small portion of vision processing. The rest is in the brain, with a great deal of various processing happening, that eventually comes to cognition and understanding what you see.
And parts of the visual cognition system also synthesize and hallucinate visual input, like at the blind spot where the optic nerve meets the retina. But cognitively, the data is there, smeared across time and space (as in a SLAM algo putting the data where it should go, not what is measured).
I don’t know what you mean by ‘in the sun’ != ‘at the sun’. I’m the one who said ‘in the sun’ and I was talking about staring at the sun. I’m not sure what your point is, but if you’re trying to say that a game render of looking at the sun is different than the experience of actually looking at the sun, then I wholly agree. A game will (rightly and thankfully) never fully recreate the experience of looking at the sun. If you’re trying to defend carlosjobim’s claim that human vision doesn’t have an absolute upper luminance limit, then I think you need to back that claim up with some evidence.
I mean, most people reading our comment thread here have their smart phone by their side and can instantly verify that eyes do not blow out whites or compress blacks like a camera. The dynamic range of our eyes is vastly superior to cameras. So aiming to imitate cameras is a mistake by game developers.
Of course, staring straight into the sun or a very bright light or reflection is a different matter.
The dynamic range of human eyes is not vastly superior to cameras. Look it up, or measure. It’s easy to feel like eyes have more range because of adaptation, foveation, iris, etc.
Again, I didn’t argue that games should imitate cameras. But that would be better than what we have in games; movies look way better than the game screenshots in this article.
It looks like a cheap film camera or a home video screenshot. So it gives off a feeling of nostalgia to a sufficiently old person, but this is also the kind of photo you'd reject as a pro, because it's totally overexposed.
- The omission of discussing HDR monitors, and how you can't really capture that on a screenshot. This is a game changer, especially with new games and monitors.
- The omissions of discussing Unreal5 games that have come out in the past few years. (e.g. Talos principle 2, Hellblade 2, Stalker 2)
- Not enough examples of games that do it well, with A/B comparisons of similar settings
- The Nintendo screenshot as an example of doing things right isn't working for me.
Another interesting example of lighting done well is Kingdom Come: Deliverance 2. The details don't look nearly as nice as, e.g., a UE5 game, and it unfortunately doesn't support monitor HDR, but it has very realistic looking lighting and scenes.
- Person has a critique of certain media (books, authors, games etc). They are valid critiques.
- You ask what the person thinks is an example of media that doesn't have this problem, or the media they like.
- The examples given are not in the same league, or do the one thing better, and many other aspects poorly.
It’s clear from their critique of the first screenshots that their problem is not with HDR, but contrast levels. Contrast is a color grading decision totally separate from HDR tonemapping.
There’s then a digression about RED and Arri that is incorrect. Even their earliest cameras shot RAW and could be color matched against each other.
Then they assert that tone mapping is hampered by being a 1D curve, but this is more or less exactly how film works. AAA games often come up with their own curves rather than using stock curves like Hable or ACES, and I would assume that they’re often combined with 3D LUTs for “look” in order to reduce lookups.
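For anyone curious what such a stock 1D curve looks like, here is a hedged sketch of Hable's widely published "Uncharted 2" filmic operator, applied per channel to scene-linear values after exposure (the constants are the commonly quoted defaults, not any particular game's tuning):

```cpp
#include <algorithm>

// Hable's filmic curve: shoulder, linear section, and toe in one rational function.
static float hablePartial(float x) {
    const float A = 0.15f, B = 0.50f, C = 0.10f, D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

// exposedLinear: scene-linear value after exposure; returns a displayable [0,1] value.
float hableTonemap(float exposedLinear, float whitePoint = 11.2f) {
    float mapped = hablePartial(exposedLinear) / hablePartial(whitePoint);
    return std::clamp(mapped, 0.0f, 1.0f);
}
```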
The author is right about digital still cameras doing a very good job mapping the HDR sensor data to SDR images like JPEGs. The big camera companies have to balance “accuracy” and making the image “pleasing,” and that’s what photographers commonly call their “color science.” Really good gamut mapping is part of that secret sauce. However, part of what looks pleasing is that these are high contrast transforms, which is exactly what the author seems to not like.
They say “we don’t have the technical capability to run real film industry LUTs in the correct color spaces,” which is just factually incorrect. Color grading software and AAA games use the same GPUs and shader languages. A full ACES workflow would be overkill (not too heavy, just unnecessarily flexible) for a game, because you can do your full-on cinema color grading on your game and then bake it into a 3D LUT that very accurately captures the look.
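That baking step is conceptually tiny at runtime: the grade becomes an N x N x N RGB table and each pixel does one trilinear lookup. A minimal CPU-side sketch of the idea (names and the flat array layout are my own assumptions; in a real game this is a single 3D texture sample on the GPU):

```cpp
#include <algorithm>
#include <array>
#include <vector>

struct Lut3D {
    int n;                                     // grid size per axis, e.g. 33
    std::vector<std::array<float, 3>> data;    // n*n*n entries, r varies fastest

    std::array<float, 3> at(int r, int g, int b) const {
        return data[(b * n + g) * n + r];
    }

    // c: ungraded color in [0,1]^3; returns the graded color via trilinear interpolation.
    std::array<float, 3> sample(const std::array<float, 3>& c) const {
        int i0[3], i1[3];
        float f[3];
        for (int k = 0; k < 3; ++k) {
            float p = std::clamp(c[k], 0.0f, 1.0f) * (n - 1);
            i0[k] = static_cast<int>(p);
            i1[k] = std::min(i0[k] + 1, n - 1);
            f[k] = p - i0[k];
        }
        auto lerp = [](float a, float b, float t) { return a + (b - a) * t; };
        std::array<float, 3> out{};
        for (int k = 0; k < 3; ++k) {
            // interpolate along r, then g, then b for channel k
            float c00 = lerp(at(i0[0], i0[1], i0[2])[k], at(i1[0], i0[1], i0[2])[k], f[0]);
            float c10 = lerp(at(i0[0], i1[1], i0[2])[k], at(i1[0], i1[1], i0[2])[k], f[0]);
            float c01 = lerp(at(i0[0], i0[1], i1[2])[k], at(i1[0], i0[1], i1[2])[k], f[0]);
            float c11 = lerp(at(i0[0], i1[1], i1[2])[k], at(i1[0], i1[1], i1[2])[k], f[0]);
            out[k] = lerp(lerp(c00, c10, f[1]), lerp(c01, c11, f[1]), f[2]);
        }
        return out;
    }
};
```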
The author then shows a screenshot of Breath of the Wild, which I’m nearly positive uses a global tonemap—it just might not do a lot of dynamic exposure adjustment.
Then they evaluate a few more images before praising a Forza image for being low contrast, which again, has nothing to do with HDR and everything to do with color grading.
Ultimately, the author is right that this is about aesthetics. Unfortunately, there’s no accounting for taste. But a game’s “look” is far more involved than just the use of HDR or tone mapping.
And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.
This was definitely during a transitional phase with mostly fake HDR techniques that needed tone-mapping. Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
It’s worth pointing out that these monitors for the most part cannot sustain that brightness or achieve it at anything other than the smallest window sizes, such as 1-3% windows at best.
> Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
For the reasons outlined above (and other) tone mapping is still heavily required.
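As a concrete example of the "small amount of tone mapping above the display peak" case: the common approach is to pass everything below a knee through untouched and roll the rest off toward the panel's reported peak. A sketch (my own parameter names, Reinhard-style shoulder):

```cpp
// sceneNits: the renderer's intended luminance for a pixel.
// peakNits: the peak the display reported (e.g. via DXGI, as above).
float rolloffToPeak(float sceneNits, float peakNits, float kneeFrac = 0.75f) {
    const float knee = peakNits * kneeFrac;   // below this, display 1:1
    if (sceneNits <= knee) return sceneNits;
    // Simple shoulder that asymptotically approaches peakNits, so highlights
    // compress smoothly instead of clipping hard at the panel limit.
    const float x = sceneNits - knee;
    const float range = peakNits - knee;
    return knee + range * (x / (x + range));
}
```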
It’s worth noting that OLED TVs do a significantly better job at displaying high nits in both percentage of the display and in sustaining it. It’s my hope the monitors eventually catch up because I waited a long time for it to become monitor sized.
Sure, but the parts of the image that are anywhere near 1000 nits are usually quite small and are things like muzzle flashes or light fixtures or centers of explosions, or magic effects etc.
https://www.rtings.com/monitor/reviews/asus/rog-swift-oled-p...
This is an OLED gaming monitor that came out 2 years ago; it measures 904 nits on a 10% sustained white window.
Sure, but plenty of things are bright enough in combination at varying window sizes that combined the panels have to drop down significantly. So you might get 1000 nits for a muzzle flash but ~200nits at best for a “bright sunny day.”
The problem is way too many people (I’m not suggesting you) don’t realise this and just think they are “getting 1000nits!”
>https://www.rtings.com/monitor/reviews/asus/rog-swift-oled-p...
Yes, I own this display and it’s one of the better ones for brightness which is why I grabbed it.
However even on the latest firmware, It has a bunch of issues including with colours in HDR unfortunately. It also has incredibly aggressive ABL. Still a great display, but with more limitations compared to the TVs than I’d like still. They’ll get there though hopefully in few more generations.
But why though? I suspect that either I am not good at this kind of thing, or this is a purist thing, like „don’t put pineapples on pizza because they don’t do that in Italy“.
I don’t want games to look realistic. A rainy day outside looks gray and drab, there is nothing wrong with rainy days in games not looking like the real thing, but awesome and full of contrasts.
In photography and cinematography contrast and color curves are near ubiquitously modified artistically to evoke a certain feeling. So even without 3D renderings added colors are adjusted for aesthetic over raw realism.
I don't know what the author wants, but perhaps it's some kind of industry insider view similar to where "true artists" make movies that are so dark you can't see anything, and the dialog is quiet mumbling and the sound effects are ear-shattering. Perhaps there's an equivalent to that in games.
If only they had written an article about what they wanted...
> but perhaps it's some kind of industry insider view similar to where "true artists' make movies that are so dark you can't see anything
Nope, it's not that.
But the biggest problem with the screenshots is they literally aren't HDR. So how can we judge their HDR?
So both of these mean you have to jack up the sensation so people can feel something.
A fifth image of it done well was added in an edit.
Also I don't necessarily see a need to make everything look like physical film.
A real masterpiece of modern graphics is a game made by two brothers called "Bodycam"