While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be visible, since the pixels are already invisible at 4K.
However, the framerate drop would be very noticeable...
OTOH, afaik for VR headsets you may still want higher resolutions, due to the much larger field of view
I'd even doubt that. In my experience, on a 65" TV, 4K becomes indistinguishable from 1080p beyond 3 meters. I tested this with friends on the Mandalorian show; we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.
Of course YMMV if you have a bigger screen, or a smaller room.
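For what it's worth, that 3 m figure lines up with the common 1-arcminute acuity rule of thumb. A back-of-the-envelope sketch, where the 1-arcminute threshold and the 16:9 geometry are my assumptions rather than anything measured:

```python
import math

def pixel_vanishing_distance_m(diagonal_in, h_pixels):
    """Distance beyond which one pixel subtends less than 1 arcminute,
    a common approximation of 20/20 visual acuity (16:9 panel assumed)."""
    width_in = diagonal_in * math.cos(math.atan(9 / 16))  # horizontal panel size
    pixel_pitch_m = (width_in / h_pixels) * 0.0254        # one pixel, in meters
    return pixel_pitch_m / math.radians(1 / 60)           # small-angle approximation

print(f'65" 4K:    {pixel_vanishing_distance_m(65, 3840):.1f} m')  # ~1.3 m
print(f'65" 1080p: {pixel_vanishing_distance_m(65, 1920):.1f} m')  # ~2.6 m
```

Past ~2.6 m even the 1080p pixels stop being resolvable on a 65" panel, so 4K can't look any sharper from 3 m away.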
Without HDR the differences are negligible or imperceptible at a standard 10' viewing distance.
I'll take it one step further: a well-mastered 1080p Blu-Ray beats 4K streaming hands down every time.
It really isn't.
What you are likely seeing is HDR, which is on most (but not all!) 4K content. HDR is a separate layer, unrelated to the resolution.
4K versions of films are usually newly restored with modern film scanning - as opposed to the aging masters created for the DVD era that were used to churn out 1st generation Blu-Rays.
The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.
The "visible difference" is mostly better source material, and HDR.
Of course people will convince themselves what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and $350 gold-plated videophile Ethernet cable makes the audio and video really "pop".
There was no hope of actual 8k gaming any time soon even before the AI bubble wrecked the PC hardware market.
Attempting to render 33 million pixels per frame seems like utter madness when 1080p is a mere 2 million, and Doom/Quake were great with just 64,000. Let's have more frames instead?
(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
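The raw per-frame pixel counts behind those numbers, for anyone who wants to check (Doom's original 320x200 mode assumed):

```python
resolutions = {
    "Doom (320x200)": 320 * 200,    #     64,000
    "1080p":          1920 * 1080,  #  2,073,600
    "4K UHD":         3840 * 2160,  #  8,294,400
    "8K UHD":         7680 * 4320,  # 33,177,600
}
for name, px in resolutions.items():
    print(f"{name:>15}: {px:>12,} pixels ({px / 64000:8,.1f}x Doom)")
```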
So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games that are a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like someone lowering AA settings or whatever: something only done by very niche players, like the CS community does today, where some play on 4:3 resolutions and lower AA settings for maximum visibility, not fidelity.
Basically 400MB for 12 bytes/pixel (64bit HDR RGBA + 32bit depth/stencil)
vs the 64000 bytes that Doom had to fill...
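A sanity check on those figures (the 12 bytes/pixel layout is as stated above; Doom's one byte per pixel comes from its 8-bit palettized mode):

```python
pixels_8k = 7680 * 4320          # 33,177,600 pixels per frame
bytes_per_pixel = 8 + 4          # 64-bit HDR RGBA + 32-bit depth/stencil
print(f"8K framebuffer: {pixels_8k * bytes_per_pixel / 1e6:.0f} MB")  # ~398 MB
print(f"Doom (320x200, 8-bit palette): {320 * 200:,} bytes")          # 64,000 bytes
```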
I doubt I’m unique.
I got a 65-inch TV recently, and up close HD looks pretty bad, but at about 3 m away it's fine.
The last ones I saw for sale were under 600 USD in physical stores, from name brands (LG). Mine was just under 1000 when I got it.
Why we can’t buy the same panels as monitors is a mystery to me.
But if you have it wall-mounted at eye level, or on a deep desk, you're likely okay.
Personally, I'd consider that large a screen a bad working area.
My 55 is borderline too big already, and the main issue is actually the height. Tilting your head or rolling your eyes back to see the top gets noticeably uncomfortable pretty quickly.
I made a special mount so the lower edge basically rests on the desk surface, which solved that issue, but I don't think I could have made it work if it was any bigger.
Also, at 65" the pixel density is much lower, so you'd probably want it mounted further away. But if you do, the monitor will cover the same FOV as a smaller monitor anyway.
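That FOV point is just geometry: scale the screen size and the viewing distance by the same factor and the subtended angle is unchanged. A quick sketch with made-up distances:

```python
import math

def horizontal_fov_deg(diagonal_in, distance_m):
    """Horizontal angle a 16:9 screen subtends at a given viewing distance."""
    width_m = diagonal_in * 0.0254 * math.cos(math.atan(9 / 16))
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

print(f'55" at 0.80 m: {horizontal_fov_deg(55, 0.80):.0f} deg')  # ~74 deg
print(f'65" at 0.95 m: {horizontal_fov_deg(65, 0.95):.0f} deg')  # ~74 deg, same FOV
```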
My dream is that someone starts making 8K 50" monitors with DisplayPort inputs (HDMI is a mess) and sells them for what these TVs used to cost!
For my computer monitor, I ended up going with a cartoonishly large 85-inch 8K. It was somewhat of an adventure getting it into my house, but once it was set up I absolutely loved it.
I don't really see the point of 8K for anything but computer monitors, but for that it's absolutely great.
I'm sure there's plenty of content (especially streaming content in mediocre bitrate) where people would be hard-pressed to tell the difference.
But I think if people went back to 1080p _panels_, they'd actually notice rather quickly how much worse the text rendering is, and that the UI looks off to them.
Moving up to 8K would definitely be a smaller step-change in clarity than 1080p->4K, and many people wouldn't feel it's worth spending extra, but I don't think it would be literally indistinguishable.
Whether you want a TV experience sitting further back, or "Cinema is Coming Home" as Sony's tagline goes, I believe there is room for something greater than 4K. Especially when the TV industry trend suggests TV purchase sizes are increasing every year: 40" used to be big, then it became entry level; now top-of-the-line TVs don't even offer anything below 50", and the median is moving closer to 65". 80"+ prices will come down in the next 5 years as LCD takes over again from OLED. I don't understand why, but I also won't be surprised if the median size moves past 70".
In 2015 I wrote on AVSForum how 8K made zero sense from the codec, computation, network, transport, and TV perspectives. However, I never imagined the median TV size would move up so quickly, and at the time I couldn't see how we could afford 100"+ TVs. Turns out I was dead wrong. TCL/CSOT will produce its first 130" TV in two years' time. The ultra-wealthy can already afford 160" to 220" MicroLED made out of many panels. There will be 10% of the population who can afford ultra-large screen sizes. And I am sure there is a market for premium 4K+ content.
There is definitely a future for 4K+ content and panels. I just hope we don't give up too early.
The median (and, for that matter, roughly 99th-percentile) TV, as well as being 65", is being used with Netflix et al. though, and that content already looks worse than what you can buy on disc.
8K doesn't need to wait for TV sizes anymore, sure, but now it needs to wait for home internet speeds (and streaming providers' infrastructure/egress costs) to make sense.
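As a very rough order-of-magnitude sketch of what that means for bandwidth (every figure here is an assumption of mine: ~20 Mbps for a premium 4K stream today, and bitrate scaling sublinearly with pixel count at a guessed 0.75 exponent):

```python
bitrate_4k_mbps = 20                          # assumed premium 4K stream today
pixel_ratio = (7680 * 4320) / (3840 * 2160)   # 8K has 4x the pixels of 4K
# Bitrate typically grows slower than pixel count; 0.75 is a guess.
bitrate_8k_mbps = bitrate_4k_mbps * pixel_ratio ** 0.75
print(f"~{bitrate_8k_mbps:.0f} Mbps for 8K at comparable quality")  # ~57 Mbps
```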
I mean my local cable TV is sending crap that's way worse than 720p YouTube videos and most people don't care at all.
I guess the primary benefit of an 8K display is that stuck or dead pixels are much less annoying than on a 4K panel of the same size.
I'm fine with 4K for my living room. Give me more HDR, less chroma subsampling, and less banding.