I seriously doubt many Switch users would bail on the system because of “fake” HDR. They probably don’t care about HDR at all. As long as Mario remains Mario, they’re happy.
If the system were SDR-only, I would be disappointed but fine.
But they made it HDR. They made a big deal about it. And it doesn’t work well. It’s impossible to calibrate and ends up just looking washed out.
It’s broken.
And I don’t appreciate the insinuation that Nintendo fans will buy any piece of junk they put out. See: Wii U.
It was just an easy at-hand example.
I also liked the Virtual Boy. But I bought it and a bunch of games from Blockbuster for $50 total when they gave up on it. So my value calibration was very different from those who paid retail.
The visual difference between the N64 and GC was enough that it made sense to focus on upgraded graphics. When you play an N64 game, there's always the initial shock of "wow these graphics are a bit dated".
But you don't get that feeling when playing Melee, or Wind Waker, or many of the other artfully done GC games.
Essentially, somewhere around the GameCube era, graphics became good enough that the right art direction could leap a game into the "timeless graphics" category.
And so it makes sense that Nintendo said "let's stop chasing better graphics, and instead focus on art direction and gameplay".
And of course it does not matter, Nintendo still sells because it's Mawio (and I say this with all the love, I'm a huge Mario fan myself).
Outsold both the PS3 and Xbox 360 by 15M units though. Given the lower hardware costs of the Wii (I've seen estimates of ~$160, compared to $840 for the PS3 and $525 for the Xbox 360 - both higher than their launch prices, btw!), I'd suggest Nintendo made the right choice.
It wasn't until the Wii that Nintendo stepped out of the hardware race. Somehow this has been retconned into Nintendo never focusing on hardware.
If they thought it would sell more systems, they'd compete. The Switch 2 is evidence that it doesn't matter.
Fair point, but on the other hand... that was 20 years ago. So it's easy to understand why that gets rounded off to "never".
I’m having a blast with Mario Kart, but the track usually looks washed out. Some of the UI and other elements have great color, but most of the picture just looks like the saturation was turned down a bit.
Very disappointing, as a Mario game and its colorful aesthetic are the kind of thing that should be able to look great in HDR.
macOS runs the panel at slightly higher brightness than SDR requires and artificially (in software) maps absolute white (0xFFFFFF) down to a greyish value (0xEEEEEE). When HDR content appears, it removes that mask around the content. Safari handles this as intended; that's probably why tone mapping doesn't work well in Firefox.
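A conceptual sketch (in Python) of that masking as I understand it - this is not Apple's actual compositor code, and the ~7% headroom figure is just read off the 0xFF -> 0xEE mapping above:

    # SDR white is drawn at 0xEE instead of 0xFF, so the panel holds back
    # roughly 7% of its range as headroom for HDR content.
    SDR_WHITE = 0xEE / 0xFF  # ~0.933: where SDR "absolute white" lands

    def panel_drive(value: float, is_hdr: bool) -> float:
        """Map a normalized source value (1.0 = content white) to panel drive."""
        if is_hdr:
            # HDR content may use the full panel range, i.e. up to
            # 1 / SDR_WHITE (~1.07x) brighter than masked SDR white.
            return min(value * SDR_WHITE, 1.0)
        # SDR content is pinned so its white never exceeds the masked level.
        return min(value, 1.0) * SDR_WHITE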
The video looks the same in both Safari and Firefox, whereas the images are dim in Firefox on both my MBP display and external monitor.
How is that the right call?
2) More than past Mario Karts, World needs to visibly delineate the track into multiple sections: the track itself, the "rough" off-track, the border between those two, and the copious number of rails you can ride and trick on. Rails in particular are commonly bright primary colors in order to stand out better, often more color-coded and saturated than the track itself: green pipes, yellow electrical wire tie-downs, red bridge rail guards, etc.
3) Bonus gamut for particle effects isn't really required and is probably distracting when you're drifting around a curve avoiding attacks.
4) It feels pretty good to me, but maybe I need to adjust some settings on my LG C1 to get the full bland experience?
It’s a little better than I had it set. But it’s still a problem. As this article shows, it just wasn’t designed right.
Perhaps the worst offender I've ever seen was the Mafia remake by Hangar 13, which loads every time with a sequence of studio logos with white backgrounds that cut from black. The RGB(255,255,255) backgrounds get stretched to maximum HDR nits, and the jump from RGB(0,0,0) (especially on an OLED) is absolutely eye-searing.
I literally had to close my eyes whenever I'd load the game.
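For anyone curious why an unmapped 255 white is quite that painful: HDR10 encodes luminance with the SMPTE ST 2084 (PQ) curve, where full signal means a 10,000-nit intent that the display clips to whatever its panel can reach. A minimal sketch (my own illustration, nothing from the game's code):

    # PQ (SMPTE ST 2084) EOTF constants
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_eotf(signal: float) -> float:
        """Normalized PQ signal [0, 1] -> luminance in nits."""
        p = signal ** (1 / m2)
        return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    print(pq_eotf(1.0))   # 10000 nits: what a raw full-signal white asks for
    print(pq_eotf(0.58))  # ~200 nits: roughly where reference white belongs

So a logo card authored as plain full-range white asks the display for everything it has, while properly mastered content would keep diffuse white around 50x dimmer.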
Of course there are individual wonky sites which will still flash but if applicable, those two things should reduce the occurrences significantly.
Why would it be any more impactful on OLED than any given FALD display capable of putting out >1000 nits sustained?
Perceived intensity in HDR is dominated by luminance, not just contrast ratios on paper.
OLEDs do have effectively infinite contrast (since black pixels are off), and it's why I love them, but that doesn't inherently make white flashes more intense on them than on any other display type unless the peak brightness is also there to support it.
Or in other words, an 800-nit flash on OLED is going to be less intense than a 1600-nit one on a FALD LCD. Brightness is the bigger factor in how harsh or impactful that flash will feel, not just the contrast ratio.
It's not down to your panel technology in this case, but to the limitation of any given panel's peak and sustained brightness capabilities.
This kind of scenario is in fact where FALD is strong. OLED really starts to pull ahead in more complex scenes, where zone counts simply can't match per-pixel control.
I love OLED's motion clarity (I still use CRTs, that’s how much I care). But I dislike its inability to maintain brightness at larger window sizes, and VRR flicker is a dealbreaker across the board.
My FALD display, on the other hand, delivers the most jaw-dropping HDR image I've ever seen, the biggest graphical/image jump I've seen in more than a decade. But its motion resolution is garbage, and yes, in some specific content you'll get some minor blooming. It's nice that it doesn't have VRR flicker though.
My OLEDs win for motion; my FALD wins for HDR plus lack of VRR flicker. It's very unfortunate that there's no perfect display tech right now. Since most of my content is bright (games), I'm happy to trade some blooming in starfields for overall better performance across the other 80% of content. Other people's content will differ: perhaps they love horror games and will choose the OLED for the opposite reason, and I'd get that too.
I still use OLED in the living room though. It doesn't see the kind of abusive usage my monitors do, and OLED TVs are way brighter than OLED monitors. Not as bright as I'd like, but bright enough that I'm not going to go out and replace it with a FALD, not until the new Sony-based stuff drops at the very least.
https://www.theverge.com/news/628977/sony-rgb-led-backlight-...
When you drive towards the sun, what is more fun? A realistic HDR brightness that blinds you, or a "wrong" brightness level that helps the background stay in the background without interrupting your flow? Similarly, should eye candy like little sparks grab your attention by being the brightest object on screen? I'd say no.
The hardware can handle full HDR and more brightness, but one could argue that the game is more fun with incorrect brightness scaling…
The game should look like a normal Mario game at a minimum. It should use the additional color palette available in HDR to look better, and the additional brightness to make effects pop as you describe.
The problem is that's not what it's doing. Some things pop better, but it's not because they're using extra colors. There may be a little extra brightness, but mostly it's that everything else just got toned down, so it looks kind of washed out.
If they did nothing but use the expanded color palette and did not use the additional brightness at all I would be a lot happier than with what we have right now.
I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
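As an aside, "extra colors" don't happen automatically just because the signal is HDR. A minimal sketch, assuming the standard BT.2087 conversion matrix: Rec.709 content converted into a Rec.2020 container lands strictly inside the cube, so the wider gamut goes unused unless the renderer deliberately reaches for it.

    # Linear-light Rec.709 -> Rec.2020 conversion (BT.2087 matrix)
    M = [
        [0.6274, 0.3293, 0.0433],
        [0.0691, 0.9195, 0.0114],
        [0.0164, 0.0880, 0.8956],
    ]

    def to_2020(rgb709):
        """Convert a linear Rec.709 triple into Rec.2020 coordinates."""
        return [sum(m * c for m, c in zip(row, rgb709)) for row in M]

    print(to_2020([1.0, 0.0, 0.0]))
    # ~[0.627, 0.069, 0.016]: well inside the [0,1]^3 cube, far short of
    # Rec.2020's own pure red [1, 0, 0], so there's plenty of headroom a
    # game could (but doesn't have to) use.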
And the article is about how they missed out on the option of using the additional gamut, but that additional gamut wouldn't intrinsically look better.
It's easy enough to edit a screenshot to show us what could have been, but even in that single screenshot there are things that look worse: like the flames gained saturation but lost the depth the smoke was adding, and some reasonable atmospheric haze vanished.
(similarly the game in the side-by-side has some downright awful looking elements, like the over-saturated red crystals that punch a hole through my HDR display...)
Given Nintendo's track record for stylization over raw image quality, I'm not sure why this isn't just as likely them intentionally prioritizing SDR quality and taking a modest-but-safe approach to HDR... especially when the built-in screen maxes out at 450 nits.
Compare any of the retro tracks to their World counterpart, then say that again. The game’s general palette and design is so washed out and bland compared to the DS, Wii, 3DS, and Tour versions of those tracks.
If there's a track that's actually less saturated than it was then, it's definitely not the result of an SDR-first workflow.
It could, but "could" is different from "should". There could be good reasons for it, such as not wanting the primary gameplay to lie outside a color palette available to people playing on their TV in sRGB, or in the common complete-shit HDR modes that lie about their capabilities.
It would be neat if more was used, but nothing about being HDR means that you should, or that it's even a good idea, to rely on the maximum capabilities.
> I haven’t turned it back to SDR mode but I’m legitimately considering it. Because I suspect the game looks better that way.
To be honest, without a TV with proper HDR, SDR mode will often look much better. The problem is that TVs are often quite awful when it comes to color volume, specular highlights, and calibration. The SDR mode is often very untrue to the content, but stretches things within the TV's capabilities to make it bright and vivid so it looks nice. The HDR mode, on the other hand, has to give up the charade; in particular, SDR tone-mapped content, which would have looked identical to SDR mode if the TV didn't lie, ends up looking really awful.
A favorite of mine is to switch Apple TVs between SDR and (tone-mapped) HDR mode and see how different the main menu and YouTube app look. I have yet to find a TV where the UI doesn't look muted and bland in HDR.
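The numbers make the mutedness unsurprising. A minimal sketch, assuming the BT.2408 convention of graphics/reference white at 203 nits: tone-mapped SDR white only reaches about 58% of the PQ signal range, while the same TV's SDR picture mode may happily stretch white to full panel brightness.

    # PQ (SMPTE ST 2084) inverse EOTF constants
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        """Luminance in nits -> normalized PQ signal [0, 1]."""
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

    print(pq_encode(203.0))  # ~0.58: SDR/UI white inside an HDR signal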
It's like a keyword bingo for usually poor implementations. I grant that maybe the implementation is good for any specific game you care to mention - but history has shaped my habits.
The presence of in-game music is a poor implementation indicator?
I am a big proponent of "there's no wrong way to enjoy a game" but wow. In nearly 50 years of gaming that's a new one for me. Congratulations. But do whatever works for you... and I mean that sincerely, not sarcastically or dismissively.
Truly, in 50 years, none of this has occurred to you or you’ve never witnessed it in friends or family? That seems hyperbolic
I understand that it helps some people get "into the flow" or something. But I don't have an issue with that. I occupy my mind trying to grasp the mechanics and be good at using them to play. If the gameplay doesn't have enough then papering over it with music doesn't do it for me.
And I'm not always looking for "intense" stuff. I like to chill out too. But I've played quite a few games over the years and so the gameplay has to have something to keep me entertained.
I enjoy music with Rocket League because I don't play competitive and so some music playing while I'm hanging out with others on my couch shooting-the-shit as it were is fine. It's more of a "social setting music" than "game music".
After all these years, I don't miss it at all.
> That seems hyperbolic
I think maybe you skimmed instead of reading. I'm referring to the parent poster's practice of turning music off ASAP on first boot as standard operating procedure, because they find it to be part of what they consider "keyword bingo for usually poor implementations". Here's what they said:

> Every game I first start requires a trip to turn off music, in-game VoIP, HDR, bloom, lensflare, screenshake if possible. It's like a keyword bingo for usually poor implementations
But as you suspected: yes! I can imagine turning off game music in general. Thank you for believing in me! I have turned the music off in many games.
But any one of these aspects can individually be crap and often are.
If I have a need for ingame VoIP I'll turn it back on. But I don't want to default to hearing randoms on the internet espousing bullshit using mic-always-on.
If it turns out the game is good with music, I'll turn it on. Rocket League works.
If one of my friends is raving about the graphics, I'll turn HDR or bloom on - though I haven't ever agreed with them enough to leave it that way.
So by default, yes I turn those things down or off before I even start playing.
Further detail on music. It's often poorly implemented: repetitive, sounds similar to game sound effects, changes and gets louder when there's a horde rush. All of it distracting and immersion-ruining.
I quit playing Day of Defeat because of the end-of-round music that I couldn't do anything about. I either couldn't hear footsteps next to me (part of the gameplay) or I was deafened by the end-of-round blaring. I don't have time to put up with poor UX when there are plenty of other games to play.
As I get older and find it harder to discern different sounds it is just easier to focus on what I want - the game and gameplay. It's the same thing as drivers turning down the stereo when they want to pay attention to finding a park or driveway - it's a distraction to what I'm here for.
I like music, and like to listen to it, either as a dedicated thing to do or while doing house chores or cruising long distances in the car. But generally not while gaming.
Thankfully so far, these are still options I can set to taste. I guess when the "the way it's meant to be played" crowd takes over I'll do something else with my time.
Other games, like Senua, did actually manage to pull off an amazing sun/scene though. Because it's slower, they can use it to accentuate: for example, you walk around a corner and go from darkness into full-on sunlight, which is blinding but then falls off to become bearable.
The monitor can display what it can display. The transfer format doesn't change the hardware's capabilities, just how you express what you want to it.
This was posted here on HN before, and I don't think it's wrong. Though of course, technically, there are differences, which might in some applications (but usually are not) be exploited to the viewer's benefit (maybe? someone with really good eyes :)): https://yedlin.net/DebunkingHDR/index.html
Maybe my interpretation is wrong, but I don't think it's far off if it is. Specification differences and differences in human perception are not the same thing.
Imagine if every individual song, speaker/earbud, and MP3 player had a different implementation of 8/16-bit music, and it was up to you to compare and contrast the 8 variations to decide whether 8 or 16 bits was more satisfying to the ear.
You get this with like levels and mixers, but not with such a fundamental quality as definition. Sure, you can chase hi-fi, but I feel like music has less artificial upscaling and usually hi-fi is a different product, not a toggle buried in various menus.
I don't know, it's kind of crazy.
Anything beyond 16-bit (as streamed to your speakers) isn't audible, though, in the sense of playing it to the speaker. It matters in processing (DSP). (At 44 or 48 kHz, I guess?)
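For a sense of scale, a quick back-of-the-envelope using the standard quantization-noise rule of thumb of roughly 6.02 dB per bit:

    def dynamic_range_db(bits: int) -> float:
        """Approximate SNR of an ideal linear PCM quantizer."""
        return 6.02 * bits + 1.76

    for bits in (8, 16, 24):
        print(bits, round(dynamic_range_db(bits), 1))
    # 8-bit:  ~49.9 dB (audibly noisy)
    # 16-bit: ~98.1 dB (beyond most rooms' noise floor)
    # 24-bit: ~146.2 dB (useful headroom for processing, not playback)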
I can’t articulate why it bothers me. Except maybe the implied assumption that the author’s real voice & style benefit more than they are harmed from being submerged in what is ultimately mathematically derived mush.
If you consider your writing bad enough to warrant an LLM fluffing pass, I consider it no better than the 99% of worse-than-mediocre, lazy, attention-grabbing bullshit without real intellectual quality that pollutes the internet.
Call it lazy, but I think reading a rephrased LLM article is more enjoyable for the reader than trying to parse some borderline "Kauderwelsch" (German for gibberish) :)
"Mainstream" or "majority" in context of Nintendo is a $20-40k/yr white collar household with 2 kids. The REAL mainstream. Some would have real ashtrays on a dining table. ~None of them had bought any of TVs over 42" with 4K resolution and HDR support in past 10 years.
Though I do wonder how globally mainstream such a Nintendo-buying household is. Admittedly it could be somewhat of a local phenomenon.
Either six-figure Nintendo gamers are hoarding boxes full of Switch 1 in the attic and completely destroying the statistics, or everyone at that income bracket is sophisticated enough to desire one.
Frankly, this is why I'm wondering how normal it is globally, because I believe Japan is not supposed to be a village of broke Vulcans. Maybe the nerds are in fact hoarding tons of Switches.
TVs are a very cost effective home entertainment device, and 4k HDR is the default these days.
OLED has ABL problems: the panels can do HDR400, but anything with higher brightness than that is problematic.
I feel like HDR support is so divergent that you're not ever going to match "properly mastered" content in the way the article author wants. That's why no developers want to spend time on it.
Anything that isn't an OLED simply cannot do HDR. It's just physically impossible to get the real contrast.
I like some of the choices in Mario Kart World's HDR, but a lot of it just needs to be toned down, so that the things which do blow out the colors are impressive but also fit, instead of everything being turned up to 11.
Sounds like an incredibly cost-effective optical illusion!
Why are people able to craft an image/video that bypasses my screen brightness and color shift settings?
If I wanted to see the media in full fidelity, I wouldn't have my screen dimmed with nightshift turned on in my dark bedroom.
It's not OP's fault. My mind is just blown every time I see this behavior.
The fact that it just goes full 1600-nit blast when you view a video from a sunny day is terrible UX in most cases, and it's the reason why I have HDR turned off for video recording, even though I might miss it later. To make matters worse, it also applies to third-party views.
The current implementation means that only the occasional image/video behaves that way, and only if it were crafted that way.
My first reaction when I saw the launch/gameplay video was why does this look so washed out? Now I kinda know why - thank you!
You want low latency and long battery life; HDR has an impact on both.
Have people forgotten what a handheld is supposed to be? A portable device on a battery.
Come ON.
The game perhaps started development when they had a different screen planned for the console?
According to rumors, the console is at least a year late.
I'm just pointing out a fact; I'm not saying that everything they do makes sense.
It might be true that an HDR-capable panel was cheaper than one that was only SDR-capable.
That doesn't mean that Nintendo was obligated to incur the (probably very modest) increase in energy drain from additional processing load and (probably quite a bit less modest) increase in energy drain from making the screen far brighter on average than when running in SDR mode. They could have just run the panel in SDR mode and limited HDR mode to when one was running on AC power in a dock. Or even never enabled HDR mode at all.
NOTE: I've not looked into whether or not enabling HDR in a given game has a significant effect on battery life. I'm also a big fan of manufacturers and game developers creating good HDR-capable hardware and publishing good HDR-capable software whenever they reasonably can. Higher contrast ratios and (-IMO- more importantly) wider color gamuts are just nice.
There are about a thousand other things in any given game that matter more to me than HDR tone mapping, and I'm happy for developers to focus on those things. The one exception might be a game where you spend a lot of time in the dark - like Resident Evil or Luigi's mansion.
Looking at his example video where he compares Godfall Ultimate footage to Mario Kart - I quite dislike the HDR in Godfall Ultimate. Certain elements like health bars, red crystals, and sparks are emphasized way too much, to the detraction of character and environment design. I find Mario Kart to be much more tasteful. That's not to say that Mario Kart World couldn't be better looking in HDR, but the author doesn't really do a compelling job showing how. In the side-by-side examples with "real" HDR, I prefer the game as-is.
You need a real HDR display (800 nits+), FALD or OLED for contrast, some calibration, and software that uses it well (really hit and miss at least on Windows).
Once all the stars align, the experience is amazing. Doom Eternal has one of the best HDR implementations on PC, and I suggest trying it once on a good display before writing HDR off as a gimmick.
There’s something about how taillights of a car in a dark street in Cyberpunk look, and that just can’t be replicated on an SDR display afaict.
Then you have some games where it’s implemented terribly and it looks washed out and worse than SDR. Some people go through the pain and mod them to look right, or you just disable HDR with those.
I'd vouch for proper HDR any day. That being said, I wouldn't expect it to improve Mario Kart much even with a proper implementation. The art style of the game itself is super bright for that cheery mood, and no consumer display will be able to show 1000 nits with 99% of the frame at full brightness. It'll likely look almost the same as SDR.
One reason for keeping Apple hardware around is a decent display test bench. I do the best I can with image work, but once it leaves your hands it's a total lottery.
So far this is my experience of HDR.
What surprised me is why a new game from a big studio, designed to support "HDR", would not be rendered entirely in linear space to begin with. Because then doing tone mapping correctly for different display technologies becomes easy. However, my knowledge is mostly from the photography world, so perhaps someone with game knowledge can weigh in.
Here's some good examples: https://www.shadertoy.com/view/MslGR8
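To make the idea concrete, here's a minimal sketch of a linear-space pipeline, using extended Reinhard as a stand-in for whatever operator a real engine would pick; only the final tonemap-plus-encode step needs to know about the target display:

    def reinhard_extended(x: float, white: float = 4.0) -> float:
        """Map linear scene luminance [0, inf) into [0, 1]; `white` maps to 1.0."""
        return x * (1.0 + x / (white * white)) / (1.0 + x)

    def linear_to_srgb(c: float) -> float:
        """sRGB transfer function for the final encode step."""
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Lighting accumulates in linear space, where values may exceed 1.0...
    hdr_pixel = 2.5
    # ...and only the last step is display-specific: swap in a PQ encode
    # (for HDR10) or this sRGB encode (for SDR) as needed.
    print(linear_to_srgb(reinhard_extended(hdr_pixel)))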
Somehow I doubt this survey is representative of the typical Mario Kart player. And to those for whom it is a concern, I don't think SDR is high on the list relative to framerate, pop-in, and general "see where I'm going and need to go next" usability.
That really is the joy of Mario Kart. You think you’re going to beat me, kid? You’re 12 and I’ve been playing Mario Kart for 30 years.
(And then they do… oof)