I know the answer is that users have used HDMI longer.
As you mention, the HDMI Consortium prohibits TV manufacturers from using DisplayPort. Many of the things that CEC and friends do aren't really needed in PC land. And if the Consortium is going to prohibit TV land from using DisplayPort, why go to the trouble of implementing and standardizing the parts of CEC and the like that are only really useful for TVs, home entertainment centers, and so on?
IANAL, but this seems anti-trust-ish.
nic-cage-you-don't-say.gif
US antitrust/consumer-protection people have been asleep at the wheel for decades now. I'm doing my (tiny, tiny) part by avoiding HDMI wherever it's at all reasonably possible and recommending to folks I know that they consider doing the same.
Is it? Let's read on.
> The pc market is a tiny minority of customers. Most of the real volume is sadly just a soundbar plus a tv plus some device like Apple TV. ... I wish [TVs] had DP but the anti consumer licensing does not allow it.
You're aware of the fact that HDMI licensing seems to prohibit the installation of DisplayPort ports on TVs. Good. That's what I said, upthread:
> [T]he HDMI Consortium prohibits TV manufacturers from using DisplayPort. Many of the things that CEC and friends does aren't really needed in PC land.
So. Given that review of the previously-presented information:
Why would the DisplayPort folks (and/or manufacturers of equipment using DisplayPort) go to the trouble to standardize device control protocols that are only really useful in TVs and the like? As I said, that doesn't make any sense for a port that doesn't get used on TVs and associated TV support hardware... does it?
A 2025 monitor with DP 1.4 from 2016. Shame.
Sure, that's fuckin stupid... if HBR3 can't handle the monitor's native resolution, refresh rate, and bit depth.
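For concreteness, a rough capacity check (ignoring blanking intervals and DSC, so real limits are somewhat tighter than this):

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbps, minus 8b/10b encoding overhead.
HBR3_EFFECTIVE_GBPS = 25.92

def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbps for an uncompressed 8 bpc mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K 120 Hz at 8 bpc squeaks under the HBR3 ceiling...
print(uncompressed_gbps(3840, 2160, 120))  # ~23.9 Gbps
# ...but 4K 144 Hz at 8 bpc already exceeds it, even before blanking.
print(uncompressed_gbps(3840, 2160, 144))  # ~28.7 Gbps
```

So for plenty of monitors HBR3 really is fine uncompressed, and DSC covers a good chunk of the rest.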
But, on the gripping hand (and with the greatest of respect) why the hell did you buy the crappy thing? There are many DP 2 monitors out there. This is a little glib... but if someone's selling something that's bad, don't buy it. The dire video card and monitor situation has kicked me off of my regular upgrade cycle for at least five years. I'm not happy about it, but it's better than getting something that's not fit for purpose (and signaling to the manufacturers that it's okay to manufacture unfit products).
That feels like a bit of a stretch. There are a handful of options now, but most were released within the past year[1] and have only recently become available for purchase, and most of them are ~$1k or higher[2]. The cheapest one I could find right now is $714, but it's only 1440p (at 480 Hz: the Sony INZONE M10S[3]). If snarfy needed something before then, and/or didn't have a huge budget to devote to a monitor, you can't really blame them.
You can get a perfectly serviceable 4k 160hz monitor with DP 1.4 for ~$300 right now[4], and that makes a lot more sense for most people.
Monitor manufacturers are generally stingy with DP ports, often including more HDMI ports, even when they can't support the full resolution and refresh rate of the monitor. It is frustrating.
[1]: https://www.rtings.com/monitor/tools/table/171238
[2]: https://www.newegg.com/p/pl?N=601469993%20601420164%20601438...
[3]: https://www.amazon.com/gp/product/B0D9R7HCVG
[4]: https://www.amazon.com/GIGABYTE-Monitor-3840x2160-160Hz-Free...
You appear to have missed this part of snarfly's post:
> A 2025 monitor with DP 1.4 from 2016. Shame.
Had snarfly not purchased that monitor within the past year, I would not have said what I said. I would have said something that took into account the state of available monitors at that point in time.
For the curious - https://rog.asus.com/us/monitors/27-to-31-5-inches/rog-swift...
[0] https://www.unigraf.fi/resource/introducing-displayport-2-0/
* "GPMI set to deliver up to 192Gbps and 480W through a single USB cable": https://news.ycombinator.com/item?id=43607155
* "China launches HDMI and DisplayPort alternative – GPMI up to 192 Gbps, 480W": https://news.ycombinator.com/item?id=43602154
Or maybe UniPipe?
(One connection for all electrical things, since it does power besides media.)
I just want a damn widescreen hidpi display (I don't care about HDR or HRR), but I've yet to see one, let alone one that seems any good.
At least we can get some decent speed at FHD now (1920 Hz). I doubt any manufacturer will bite though, sadly, even though OLEDs should be capable of keeping up with that refresh rate. 4K@480 is still a nice improvement at least, even if a fairly incremental one; I do expect those to appear on the market.
120 Hz+ at lower resolutions first seems a lot more useful, unless one is doing some 360-degree video thing with a relatively low number of pixels per degree of angle. Doubling the framerate only doubles the cost, while doubling width and height quadruples it.
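The arithmetic, for anyone who wants it spelled out (uncompressed 8 bpc, blanking ignored):

```python
# Raw data rate in Gbps for an uncompressed 8 bpc (24 bits/pixel) mode.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

base = gbps(1920, 1080, 240)
print(gbps(1920, 1080, 480) / base)  # 2.0: doubling refresh doubles the data rate
print(gbps(3840, 2160, 240) / base)  # 4.0: doubling width and height quadruples it
```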
Luckily the M4 Mac mini comes with an HDMI 2.1 port allowing a ~42 Gbps data rate after deducting overhead, and that's the one I'm currently using to attach the 6K display. Only the M4 Pro/Max-equipped Macs offer Thunderbolt 5 with DisplayPort 2.0/2.1 (~77 Gbps data rate).
And people are asking if the 6K display can do high refresh rate like ProMotion at 120Hz…
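Back-of-the-envelope numbers for that 6K panel (6144x3456; the 10 bpc and no-blanking assumptions are mine):

```python
# Raw data rate for an uncompressed 10 bpc (30 bits/pixel) mode, blanking ignored.
def gbps(w, h, hz, bits_per_pixel=30):
    return w * h * hz * bits_per_pixel / 1e9

HDMI21_DATA_GBPS = 42.0  # HDMI 2.1 payload after FRL overhead, per the parent comment
TB5_DP_DATA_GBPS = 77.0  # approx. DP 2.0/2.1 payload over Thunderbolt 5

print(gbps(6144, 3456, 60))   # ~38.2 Gbps: fits in HDMI 2.1 uncompressed
print(gbps(6144, 3456, 120))  # ~76.4 Gbps: needs DSC on HDMI 2.1; right at
                              # the Thunderbolt 5 ceiling even uncompressed
```

So 6K at 120 Hz isn't happening over this HDMI port without compression.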
I just wish the TV and monitor industries could stop fighting and get a unified ultra-high-bandwidth standard to work. I'm so fed up with the hard choice we're forced to make between DP and HDMI.
How are you finding the scaling on that?
I currently have an aging 27" 5K iMac and want/need to upgrade. I'm not really interested in going >27" because of desk space issues, but there are a limited (though finally growing) number of 27" 5K monitors—besides the more-than-I-want-to-spend Apple Studio Display.
32" may be sufficiently not-large for my wants/tastes, so does 32" 6K work with macOS 'properly'?
Actually it has 4% more pixels than the 6K Pro Display XDR (6144x3456 vs 6016x3384) and 44% more pixels than the 5K Studio Display (5120x2880). It's the highest-resolution computer monitor I can get at the moment. (8K and higher resolutions are mostly TV sets or cinema equipment, not optimized for computer use.)
I was even trying to attach two of these 6K babies to the M4 Mac mini. According to Apple's spec it should work, but no matter how I tried, the second attached display only gets a max 4K signal. Maybe it needs an M4 Pro.
> 8K and higher resolutions are mostly TV sets or cinema equipment, not optimized for computer use.
Does this mean "not offering true RGB 8K processing even internally, only compressed YCbCr 4:2:0"?
Another reason is PPI. macOS works best at ~220 PPI for 2x HiDPI mode, so 8K would be about 40". But the current batch of 8K TVs comes in sizes over 60", which is too big for most desks.
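The 40" figure checks out with a quick calculation (8K = 7680x4320; 220 PPI target as above):

```python
import math

# Diagonal size in inches at which a panel of given resolution hits a target PPI.
def diagonal_inches(w, h, ppi):
    return math.hypot(w, h) / ppi

print(diagonal_inches(7680, 4320, 220))  # ~40.1 inches
```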
There are also convenience factors like DDC control (where you can control brightness and volume using software on your computer), which most modern computer monitors support but I've never seen any TV support. Without HDMI CEC on the computer side, you can't even auto-wake the TV when you wake the computer.
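On Linux, `ddcutil` exposes exactly this, for what it's worth. A sketch, assuming the display implements the standard VCP codes 0x10 (brightness) and 0x62 (volume); support varies by model, so treat these as illustrative:

```shell
# List DDC/CI-capable displays on the I2C bus
ddcutil detect

# Read, then set, brightness (VCP feature 0x10)
ddcutil getvcp 10
ddcutil setvcp 10 70

# Set speaker volume (VCP feature 0x62), if the display implements it
ddcutil setvcp 62 40
```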
Multiple plugs is also an option, as seen on the 32" 8K Dells 8 years ago.
It's the same LG panel (LM315STA-SSA1) used by Dell's ugly-as-hell U3224KB, but in an all-aluminum case and stand weighing about 9 kg. It's assembled in China at half the cost of the U3224KB, and various Chinese brands are selling it with their logos etched on the back.
Crikey, CAD 4200:
* https://www.dell.com/en-ca/shop/dell-ultrasharp-32-6k-monito...
Though more real estate and resolution, not sure I really want to go there (even for half the cost). I think I'll stick with 27" 5K:
> In just the past few months, we've taken a look at the ASUS ProArt Display 5K, the BenQ PD2730S, and the Alogic Clarity 5K Touch with its unique touchscreen capabilities, and most recently I've been testing out another new option, the $950 ViewSonic VP2788-5K, to see how it stacks up.
* https://www.macrumors.com/review/viewsonic-vp2788-5k-display...
Not an Apple guy, so this confuses the bejeezus out of me. The standard PPI on macOS is 72, and the Pro Display XDR promptly clocks in at 216 PPI, exactly 3x the macOS standard. That would suggest a scaling factor of 3x, not 2x. What's going on?
I do vaguely recall them "raising the standard PPI to double the original", but then we're looking at 1.5x scaling, and according to other vague recollections, they don't do fractional scaling. Another vague recollection is a claim that they abandoned 72 as the standard PPI (in favor of 96? or something else?), which would potentially check out (192 PPI would be 2x), but then it's 12.5% more dense than ideal.
[citation needed] ?
Where did you get that number from? Not saying you're wrong (or right), I'm just curious where you came up with it. Perhaps 'back in the day' Mac OS 9-based hardware did that? [1][2]
Per [3], it seems that non-Retina displays should be 100-120 ppi and Retina should be 200-230.
[1] https://www.photoshopessentials.com/essentials/the-72-ppi-we...
My information does come from historical sources, although I'd be hesitant to conclude it doesn't apply beyond them. The sources you cite detail, for example, how Apple moved on from defining a reference constant to just speaking in terms of logical-to-physical pixel mappings. This took a while for me to wrap my head around, since I considered it basically nonsensical, but it means they have taken exclusive ownership of the reference PPI, preventing third parties from strongly relying on it or expecting physically precise sizes.

This doesn't necessarily mean such a constant (or constants per device class) doesn't exist; it just means it's under wraps. 1x has to mean something in terms of density, otherwise apparent sizes of screen elements wouldn't be transferable across devices. Unless they really are just eyeballing it or "letting jesus take the wheel", which would be entertaining, but I find that unlikely.
> Per [3], it seems that non-Retina displays should be 100-120 ppi and Retina should be 200-230.
Maybe. I'd be cautious about just accepting it; some person writing that on their blog is not any more credible than me writing anything here as a comment. Random bits of Apple documentation still reference 72 PPI as the 1x reference point, for example. [0] [1]
It's possible, though, that in the spirit of the aforementioned they did "secretly" move to 110 PPI or thereabouts as their reference constant, so that 2x is indeed 220, giving GP what they see; that just strikes me as strange. But if you open the display settings on a Pro Display XDR or similar and it says the scaling factor is 2x, and you can reproduce that kind of scaling selection across a wide array of devices all hinting at 110 PPI being 1x, then there we go: apparently their current reference PPI is 110.
[0] https://developer.apple.com/design/human-interface-guideline...
> Point size based on image resolution of 72 ppi for @1x and 144 ppi for @2x designs.
> Point size based on image resolution of 144 ppi for @2x and 216 ppi for @3x designs.
[1] https://developer.apple.com/documentation/appkit/nstouch/dev...
> The range of the touch device in points, such as 72 ppi.
(admittedly a pretty random source to cite this last one, doesn't explicitly support anything)
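For what it's worth, the arithmetic in this subthread works out like this (using the nominal 32" diagonal for the Pro Display XDR; the actual viewable diagonal differs slightly, so these are approximate):

```python
import math

# Pro Display XDR: 6016x3384 panel, nominal 32" diagonal.
ppi = math.hypot(6016, 3384) / 32
print(ppi)        # ~215.7, i.e. the ~216 PPI figure upthread
print(ppi / 72)   # ~3.0x against a 72 PPI reference
print(ppi / 110)  # ~1.96x against a ~110 PPI reference
```

Which is to say: 2x only lands cleanly if the 1x reference really is around 110 PPI.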
Apple had been experimenting with the idea of Resolution Independence during the transition to “high resolution” LCD displays (1280/1440/1680/1920 horizontal pixels) before settling on integer 2x scaling (and later 3x on the iPhone, to overcome deficiencies of OLED subpixels) on what they now call Retina displays. This was the rapid transition to 100-110 PPI for non-Retina and 220 PPI for Retina on the desktop, and over 300 PPI on mobile.
Modern Macs can also do non-integer scaling (e.g. 1.2x, 1.5x) well enough on Retina displays if you need extra working area. For example, all current Apple Silicon MacBook Airs default to non-integer scaling on their internal screens.
https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected
DisplayPort is the better technology in every way possible.
Are 90% of features still optional? You can be HDMI 2.1 compliant without VRR, QMS, etc.