I remember having to work hard years ago to make my non-Apple display look 'right' on an Intel-based Mac, due to weirdness with scaling and resolutions that a Windows laptop didn't even flinch at. It was a mix of hardware limitations and a lack of options for the resolutions and refresh rates available over a Thunderbolt dock, things I shouldn't have had to think about.
I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.
You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!
I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.
(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)
Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.
Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
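For anyone wondering what "aligning drawing to round points" actually involves, here's a minimal sketch (the helper is my own illustration, not any particular app's code): snap coordinates to whole device pixels using the display's backing scale factor so strokes and glyph origins don't straddle pixel boundaries.

```swift
import AppKit

// Rough sketch: round a rect's edges to the nearest device pixel, then
// convert back to points. backingScale is 2.0 on a 2x HiDPI screen.
func pixelAligned(_ rect: CGRect, backingScale: CGFloat) -> CGRect {
    let minX = (rect.minX * backingScale).rounded() / backingScale
    let minY = (rect.minY * backingScale).rounded() / backingScale
    let maxX = (rect.maxX * backingScale).rounded() / backingScale
    let maxY = (rect.maxY * backingScale).rounded() / backingScale
    return CGRect(x: minX, y: minY, width: maxX - minX, height: maxY - minY)
}

// AppKit has a built-in version of this on NSView:
// let aligned = someView.backingAlignedRect(rect, options: .alignAllEdgesNearest)
```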
Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.
Actually, I don't even think it's possible to run HiDPI mode at the native resolution scale from within the macOS Settings app; you'd need something like `BetterDisplay` to turn it on explicitly.
It's also why, when you put your Mac into a "More Space" resolution on the built-in or first-party displays, it warns you that this could hurt performance: that's exactly what the OS is going to do to give you more space without turning text into unreadable aliased fuzz. It renders the "apparent" resolution pixel-doubled, then scales it down, which provides a modicum of the effect of sub-pixel anti-aliasing. Apple removed subpixel antialiasing a while back, and this is the norm now.
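Concretely, the "More Space" arithmetic works out to something like this (a simplified sketch of the idea described above, not Apple's actual implementation):

```swift
import CoreGraphics

// Sketch of the pixel-doubled render-then-downscale path: the OS renders the
// "looks like" resolution at 2x into a backing framebuffer, then scales that
// buffer down to the panel's native pixel grid.
func hiDPIFramebuffer(looksLike: CGSize, panel: CGSize) -> (buffer: CGSize, downscale: CGFloat) {
    let buffer = CGSize(width: looksLike.width * 2, height: looksLike.height * 2)
    let downscale = panel.width / buffer.width   // 1.0 is clean 2x; values like 0.75 mean non-integer resampling
    return (buffer, downscale)
}

// "Looks like 2560x1440" on a 4K (3840x2160) panel:
// buffer = 5120x2880, downscale = 0.75, the setup mentioned elsewhere in the thread.
let moreSpace = hiDPIFramebuffer(looksLike: CGSize(width: 2560, height: 1440),
                                 panel: CGSize(width: 3840, height: 2160))
```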
I have a 4K portable display (stupidly high density, but still not quite the "Retina" 218 ppi) on a monitor arm that I run, as you suggest, at 1080p at 2x. It looks OK but everything is still a bit small. If you have a 4K display and want to use all of that space, you have the crappy choice between making everything look terrible or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.
I'm actually dealing with this right now on my TV (1080p, which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else the way Windows and KDE do. I have to run it at 1600x900, and even then I have to scale every website I visit to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: on the built-in display or one of Apple's overpriced externals, sitting at a "retina appropriate" distance for 218 ppi to work.
Apple support tortured me with all kinds of diagnostics, then closed it as WontFix a few weeks later. I wrote an email and it got fixed in Sonoma :)
https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...
Fucking with DP 1.4 was how they managed to drive the Pro Display XDR.
If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4 (mine could do 95Hz SDR / 60Hz HDR over 1.4, but if it claimed it could only do 1.2, that went to 120/95 on Big Sur and above; on Catalina the same monitor could do 144Hz HDR).
I would be absolutely unsurprised if their fix was to lie to non-Apple monitors during negotiation and say the GPU only supports 1.2, and I would be equally unsurprised to learn that this is related to the current issue.
People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?", and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The Pro Display XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us". Which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for so long, saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones".
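The back-of-the-envelope math people were doing at the time looks roughly like this (my own rough numbers: active pixels only, ignoring blanking and protocol overhead):

```swift
// How much raw pixel data does 6K HDR actually need?
let width = 6016.0, height = 3384.0   // Pro Display XDR native resolution
let refreshHz = 60.0
let bitsPerPixel = 30.0               // 10 bits per channel, RGB

let uncompressedGbps = width * height * refreshHz * bitsPerPixel / 1e9
// about 36.6 Gbit/s of pixel data before any blanking overhead

let dp14PayloadGbps = 25.92           // usable payload of DP 1.4 HBR3 over 4 lanes
let compressionRatioNeeded = uncompressedGbps / dp14PayloadGbps
// roughly 1.4x even in this best case, which is why DSC is mandatory to drive it
```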
You could always try calling, too! I cold called Marc Benioff at Salesforce and he actually picked up the phone.
Tim Apple's Apple has been fu#$%&ing me again...
As an article it is not 100% coherent, but there is valid data and a real, clearly identifiable problem.
There might be a problem but it’s hard to know what to trust with these LLM generated reports.
I might be jaded from reading one too many Claude-generated GitHub issues that look exactly like this that turned out to be something else.
They’re likely all on Studio Displays.
Fractional-scaling displays (and lately, even 1x "normal" displays) really are not much of a consideration for them, even if they're popular. Integer-scaled 2x+ HiDPI is the main target.
The article could just be AI slop, since it consists of hyper-in-depth debugging without ever articulating what the problem is.
https://bjango.com/articles/macexternaldisplays/
- 24" you need 4k
- 27" you need 5K.
- 32" you need 6k.
Windows' subpixel antialiasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really. Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "HiDPI everywhere", i.e. Retina or Retina-like displays. Which would be great... except "everywhere" turned out to be "very specific monitors supporting specific resolutions".
The article doesn't mention it.
They've got a good thing going, but they keep finding ways to alienate people.
Anyway I will run the diagnostic commands and see what I get.
- 24 inch 1080p
- 24 inch 4K (2x scaling)
- 27 inch 1440p
- 27 inch 5K (2x scaling)
- 32 inch 6K (2x scaling)
Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.
Given that 4K is common at 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to accept that 27-inch 4K is as bad as people say and got one myself, only to regret buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
I really like my 1440p monitors at home more than the 4K monitors at work. At work, I'm always dealing with scaling and font size issues, but at home everything looks perfect. So I think you're onto something here: 1440p just seems to be a better resolution on a 27" panel.
If the theory about the framebuffer pre-allocation strategy is to hold any water, I would think that 5K and 6K displays would suffer too, maybe even more. Given that you can attach two 5K monitors, the pre-allocation strategy as described would need to account for that.
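For a sense of scale, the raw buffer sizes involved look something like this (a rough sketch assuming a plain 4-byte-per-pixel backing store; the real WindowServer pixel format and buffering strategy are my assumptions, not documented facts):

```swift
import CoreGraphics

// Bytes needed for one full-size backing framebuffer at 4 bytes per pixel.
func framebufferMB(_ size: CGSize, bytesPerPixel: Double = 4) -> Double {
    Double(size.width) * Double(size.height) * bytesPerPixel / 1_048_576
}

let one5K = framebufferMB(CGSize(width: 5120, height: 2880))   // ~56 MB per buffer
let two5K = 2 * one5K                                          // ~113 MB for a dual 5K setup
// A 4K panel running "looks like 2560x1440" needs the same 5120x2880 buffer
// before the downscale, so it is in the same ballpark per display.
let scaled4K = framebufferMB(CGSize(width: 5120, height: 2880))
```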
Just another case of Apple intentionally going against established open standards to price gouge their users.
I wouldn't mind it as much if I didn't have to hear said users constantly moaning in ecstasy about just how much better "Apple's way" is.
High-quality desktop Linux has been made real by KDE, and the AI-fueled FOSS development boom is accelerating the eclipse of proprietary nonsense like this.
If you're a developer, you should be using a system that isn't maintained by a company that intentionally stabs developers in the back at every turn. (Unless you're into that. U do u.)
That one also wasn't a hardware limitation, as it ran my displays just fine in Boot Camp, but macOS would just produce fuzzy output all the way.
It's infuriating.