> Rosetta was designed to make the transition to Apple silicon easier, and we plan to make it available for the next two major macOS releases – through macOS 27 – as a general-purpose tool for Intel apps to help developers complete the migration of their apps. Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
https://developer.apple.com/documentation/apple-silicon/abou...
On the other side of the fence in Windows land, adoption of features much more trivial than an architecture change is absolutely glacial, because devs know they can lean on backward compatibility in perpetuity.
Apple’s approach is perhaps too aggressive but Microsoft’s is far too lax. I don’t think it’s realistic to dump a binary and expect it to work forever with no maintenance.
The moment Windows has non-x86 hardware that people actually want to buy, I bet native app support comes pretty quick.
Snapdragon X laptops are here already, and reviews are generally positive (of course, use and workload dependent, nobody should get one to game, but for web browsing and long battery life it's perfect).
Looking at the M1, there was no "if", "but", "for certain", etc wording. It was just better all around, even comparing gaming performance vs older MacBooks. Even running Windows ARM in a VM was better than buying a Windows laptop (x86 or ARM) in terms of both performance and battery. Based on your description, I don't think the Snapdragon X laptops are the same story?
macOS x86 games + macOS arm64 games == same number of games. What's the loss angle here?
My guess is that Mac users simply don’t buy games for their macs
1. Metal precedes Vulkan
2. Hardly anyone supports Vulkan https://www.carette.xyz/posts/state_of_vulkan_2024/
Vulkan is the primary rendering API on Android and Linux these days and is also well supported on Windows by GPU vendors. Applications that don't use it yet generally don't because they don't need it and still use OpenGL (ES) (or, for Windows applications, D3D instead).
It's fascinating that history is changing so fast now that even events of 10 years ago can be claimed to be something they weren't.
- 2013: AMD starts developing Mantle, and it is only available as part of AMD Catalyst, and only on Windows.
- 2013 (at least): Apple starts working on their own graphics API.
- June 2014: Metal 1.0 on iPhones is announced by Apple (it means it had already been in development much earlier, that's why I wrote "2013 at least" above)
- July 2014: Khronos starts work on Vulkan
- August 2014: The announcement of the Vulkan project and the call for participation.
- June 2015: Apple announced Metal on Macs.
- Sometime in 2015: AMD discontinues Mantle and donates it to Khronos.
- December 2015: Vulkan 1.0 specification announced by Khronos group
- February 2016: The full spec and SDK for Vulkan 1.0 released
So, reality: Metal had been released about a year and a half before Vulkan even had a specification.
> also well supported on Windows by GPU vendors.
Keyword: GPU vendors. Not by Microsoft.
> Vulkan is the primary rendering API on Android and Linux these days
But not on Macs, iPhones, Windows, XBox, and Playstation.
And yet, "omg why doesn't Apple support this late-to-the-scene, quite shitty API that it must support because we say it must".
Wouldn't it make sense to simply implement the technically best solution everywhere?
Also, the question of the "technically best solution" is fraught with subjective definitions of "better". You also want competition in approaches. E.g. Vulkan might not have happened (and we would be stuck with OpenGL) without the newer, more modern GPU APIs like Metal and DX12.
2. I can easily list 5x as many AAA games, engines, or libraries supporting a Vulkan backend vs a Metal backend, does that mean Apple shouldn't support Metal? I'd say no, but maybe you're seeing this kind of logic differently. I'd be fully in support of macOS/iOS/iPadOS/visionOS adding native DirectX support, if that's what you mean. It just seems less likely than Vulkan (for several reasons).
I think the only game I've put serious time into on my Mac was Hades 1, which I pretty much finished before the console ports happened.
Actually, I think the last time I installed Steam on a Mac I did so in a CrossOver Games bottle, so even there Steam would have seen that install as a non-Mac user.
EDIT: Also just because I think it's interesting to note this, Steam Deck accounts for most of Linux's headroom there, if you subtract the SteamOS numbers from the rest of Linux it's at 2.07% instead, which is much closer to Mac.
Exactly this. As a piece of anecdata, I play World of Warcraft every day on my Mac, but I don't have steam installed.
I remember first implementing this in Planimeter Game Engine 2D; we got a massive resolution list from SDL (through LÖVE, which is what we're built on).
If I remember correctly, we filtered the list ourselves by allowing users to explicitly select supported display ratios first, then showing the narrowed list from there. Not great. Technically there's a 683:384 ratio in there.[1]
But it did enough of the job that users who knew what resolution they wanted to pick in the first place didn't have to scroll a gargantuan list!
[1]: https://github.com/Planimeter/game-engine-2d/blob/v9.0.1/eng...
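For illustration, the same ratio-filtering idea as a rough Swift sketch (the actual code is Lua on LÖVE; the types and the example ratio here are made up, not taken from the engine):

    struct DisplayMode { let width: Int; let height: Int }

    // The user picks a supported ratio first (e.g. 16:9); only modes matching
    // it get shown, so oddballs like 683:384 never reach the visible list.
    func modes(_ all: [DisplayMode], matching ratio: (w: Int, h: Int)) -> [DisplayMode] {
        all.filter { $0.width * ratio.h == $0.height * ratio.w }
    }

    // e.g. modes(sdlModeList, matching: (w: 16, h: 9))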
WoW on Linux with wine used to be a superior experience to WoW on Windows. Because the system didn't lock up for half a minute when I alt-tabbed.
[Haven't played WoW in a long time though, maybe in Windows 10 and up you have an easier time alt-tabbing between heavy 3d apps.]
Also, I really can't think of any way the Mac's mouse settings in WoW work better than the Windows ones. Scrolling and mouse movement both have more settings to customize under Windows compared to the experience on the Mac (even with all the tools such as Mos to help with the Mac's bad external mouse handling).
So what exactly is so superior? I play WoW on both platforms and when I have a choice (that is, when playing at my home desktop - at my summer cabin I only have the Mac option) it's always the Windows one I select. It's nice that it works on the Mac so I can play everywhere, as I only have a Mac laptop, but the occasional weird graphics bugs, worse performance and odd rendering issues displacing my WAs and other stuff just don't inspire confidence in Mac gaming compared to using Windows. And the mouse movement... ugh.
I know why they do it though. Apple can't take their undeserved 30% cut of Mac games the same way they take their iOS cuts.
We had a teacher years ago who said something that remains true until today: "everything is about money, especially the ones that appear to have non-monetary reasons."
Even if they did, 100 W should be room enough to play relatively recent titles, especially indie ones. Nothing really excuses the contempt Apple has for the gaming market.
Considering how the typical self-identifying "gamer" conducts themselves online, I think Apple might be on to something...
The Steam Deck runs on 45W, and that's plenty of power to have fun.
You swallow the taste of the Steam Deck's terrible hardware for the support of Valve/Proton/devs and the ecosystem.
Now, handhelds with newer hardware are definitely going to trounce the steam deck without having to trade away battery life, but I think they did the best they could at the price point and time.
In general it is assumed that documentation is the boring part and paying attention to it is a sign of quality. But where do you put people who prefer writing and/or teaching to thinking long and deep about product issues?
You're right though.
In a sense, this is far worse than what most tech companies do since it is a life-and-death issue and Boeing has made a very conscious effort to hide details here, rather than just being lazy.
There's something of a history of aerospace vendors omitting "implementation details" that end up contributing to serious accidents (e.g. if you get an Airbus far enough out of the normal envelope protections, you lose stall warning), and an equally sordid history of flight and maintenance crews improvising procedures to the observed (rather than designed/specified) behavior of aircraft systems.
Arguably, the single biggest systematic risk in the current pilot training system is that crews overlearn the implementation details of their training, rather than the actual principles and flight manuals (e.g. training that inadvertently conditions crews toward quick engine shutdowns, when the consequences of shutting down the wrong engine in reality are much more serious).
And Airbus control laws and protections are well defined and studied by pilots training for them.
it isn't, it's just ISO/NADCAP conforming.
https://learn.microsoft.com/en-us/windows-hardware/drivers/u...
"The characteristics of the endpoint determine the size of each packet is fixed and determined by the characteristics of the endpoint."
https://web.archive.org/web/20221214171028/https://learn.mic...
Sometimes they actually have examples of how to use it, but most are just Javadoc level, and of minimal use.
They now tell people to watch WWDC videos as if those relatively short videos contained the same amount of information a proper API documentation does.
Libraries like SDL, GLFW and Sokol handle this for you, and I had to poke inside them to actually figure out how to get the same performance as other games.
In a nutshell, the trick is simple: you call [NSApplication finishLaunching] and run a while loop yourself instead of calling [NSApplication run].
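For illustration, a minimal Swift sketch of that pattern (the per-frame call is a placeholder; real engines like the ones above also deal with autorelease pools, timers and app activation):

    import AppKit

    let app = NSApplication.shared
    app.finishLaunching()                 // instead of app.run()

    var running = true                    // flip to false from your quit handling
    while running {
        // Drain all pending events without blocking, then do one frame of work.
        while let event = app.nextEvent(matching: .any,
                                        until: Date.distantPast,
                                        inMode: .default,
                                        dequeue: true) {
            app.sendEvent(event)
        }
        // updateAndRender()              // hypothetical per-frame game update/draw
    }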
Why would Apple be deliberately sabotaging the experience? They would gain nothing from it. That argument makes even less sense when you consider most of the games mentioned in the article are on the Mac App Store, Apple can take their cut.
https://apps.apple.com/us/app/control-ultimate-edition/id650...
https://apps.apple.com/us/app/shadow-of-the-tomb-raider/id14...
https://apps.apple.com/us/app/riven/id1437437535
https://apps.apple.com/us/app/cyberpunk-2077-ultimate/id6633...
https://apps.apple.com/us/app/stray/id6451498949
This is an obvious case of Hanlon’s Razor. Anyone who develops for Apple platforms and has had to file Feedbacks is aware of Apple’s incompetence and lack of care.
On iOS there is no choice.
I am a very casual gamer and sometimes weeks go by without playing. My first experience with Steam was with Civ VI. Long flight, no internet, great I’ll play some Civ! But instead of opening up Civ, I was forced to open Steam which would then not allow me to play my own game because I hadn’t authenticated recently enough for them. Or I would try to play and Steam would say, oh first you need to download some huge update before you’re allowed to play your single player entirely offline game.
I know theoretically GoG is supposed to solve this issue but no Mac game I wanted was available there. Finally Cyberpunk 2077 launched on multiple stores and I bought it on GoG. Even then, the default option became to use the GoG launcher. If I wanted a DRM-free download, there was some immediately complicated, off-putting set of instructions, downloading something like 20+ files, etc.
The App Store experience: I click download, it downloads. I open it, and it opens the app, not some launcher. Everything just works.
You can also right-click the game and 'Browse local files' and the game's regular executable is usually right there.
I'm currently playing the Oblivion remake, and launch that through a mod manager rather than Steam (though on Windows), even though the game was installed via Steam.
It depends on the game, they do offer some kind of DRM, which requires Steam to be open when launching the game, but it's optional for the developer to use it or not. See https://www.pcgamingwiki.com/wiki/Digital_rights_management_...
PCGamingWiki also usually has information on whether the game is DRM-free or not, e.g.: https://www.pcgamingwiki.com/wiki/Hades#Availability
* Steam is constantly updating, every time you open it, and until recently videos would almost always fail to play on my Mac.
* The Epic Games launcher is so atrocious that calling it “feature-rich” feels like a bad joke. I find it so bad that I don’t even open it to get the free games, opting instead for the website, and even then I am super selective about any game I get (fewer than 10%) because I always think I’ll have to deal with that app. In its current state, there is zero chance I’ll ever buy a game on there, all because of the app. My “favourite feature” is how if you queue a bunch of games to install and then set a few others to uninstall, those are added to the same queue and you have to wait for the installs to finish before the uninstalls get a chance. So if you are low on disk space, now you have to, one by one, cancel each of the installs and tell them to start again, so they are added to the bottom of the queue.
* GOG Galaxy was the biggest disappointment. I was expecting to like it but it only lasted an hour on my machine before I trashed it. It felt old and incomplete.
Compared to the MAS, Epic is a good launcher for games. Take a look at CP2077. On the MAS you can't just download the language you need, you have to get all of them. This increases the download by 60GB. No other platform has this issue. So it ends up being 160GB which is nuts and more than half the storage on a base model M4 Mac. It's insanely barebones and half assed for gaming.
Two things people think can't be done, but are:
https://www.xbox.com/en-us/play
Save the Xbox one to home screen for full screen experience, turning your iPhone or even iPad 13" into a Logitech gCloud if you add https://www.amazon.com/dp/B0D7CHHB42
I mean... yeah why would they change?
The profit maximising trade-off for Apple might however be where they are right now. Not sure.
The difference in power consumption is insane. The NUC was 67 watts. The mini is 4-10. I don’t have enough data for a long-term average on the mini yet, but it’s ludicrously efficient.
The good thing is that the Ryzen boards can be upgraded on RAM and Storage but that seems to be changing with soldered DDR5 on many mini PCs now.
My raspberry pi is sitting at 3 watts, and runs linux software just fine.
A Mac Mini uses multiple times that, and can barely run any software I care about (i.e. linux).
They're both consumer computers, and yes there's a massive difference in their capabilities, but "most efficient ever" is a really strong claim.
But I wouldn't compare the experience to a Mac Mini, and I wouldn't call a raspberry pi a "consumer computer". It solidly falls into the amateur dev board category. Heck, it doesn't even have a power button.
In your comparison you talk about software compatibility, but the Mac can run Linux and many open source tools & apps found on it. So the Pi is only ahead if you arbitrarily limit the comparison to a specific distro that's not Asahi or an application like sxiv, and in addition are the kind of user who wants that software and doesn't care about Mac exclusive programs. Which is valid, but has nothing to do with a metric like power efficiency.
It works well. Quite why networking from VM is more reliable than networking from the host is a question I’d like to ask Apple. SMB does not reliably start at boot.
Imagine if you could turn on a Mac mini, have docker containers start and then have them access network shares reliably.
The Ubuntu VM on the same host can. That’s ridiculous.
You’re both right, and wrong.
The Mac uses 3-4W most the time. The reason mine is using more is that it has an Ubuntu VM running Pihole (so it is running Linux). This doubles/triples the power usage.
If I wasn’t such a chicken, I’d run it on the host in docker, but I’ve learned the pain of having Pihole on the same host as other services. Dead DNS is not good.
I really wish I could run Ubuntu on the metal. Something as simple as having docker containers run on boot is unreliable, and the god awful hack I have made to get it to work is so disgusting it’s shameful.
Both are on 10GbE, and I suspect the card in the NUC was a bit of a power draw.
I have a few bits of hardware on them so that I can force a hard reboot remotely if needed. Crude, but effective.
https://www.tp-link.com/us/home-networking/smart-plug/hs110/
Well, Apple doesn't sell 'em with RGB LEDs built-in, so I dunno.
...more seriously though: I don't know how you can say that with a straight-face. On paper, the Apple Silicon M3 and M4 seem comparable to NVIDIA's mid-tier offerings from 3-4 years ago... when rendering to a 1080p-sized display; assuming you'd like to render your games at native Retina or even 5K resolution (given that Apple would really rather you used their display hardware for the best experience, right?) then you'll have to limit yourself to games from the last decade (assuming they've even been ported to macOS in the first place) or convince yourself that this year's AAA titles are fully playable as a slideshow.
...and that's assuming your Mac doesn't start thermal-throttling itself too.
Also, you'll need a very non-Apple mouse: I'm flabbergasted that even in 2025 you still cannot independently left-click and right-click on Apple's Magic Mouse[1]. Ergonomically, a multitouch mouse is great for many things, but navigating a wheel-driven UI (e.g. weapon wheels) is not one of them.
[1] https://superuser.com/questions/188431/apples-magic-mouse-an...
If all I can't play is 2025 AAA games... I'm not missing out on much. It ran BG3 solidly anyways! Obviously not maxed out, but it was playable and enjoyable.
now i have a sudden urge to buy a mac mini and install a tiny case window with LEDs and an itty-bitty LCD display for power/thermal monitoring.
Introducing the notch creates this "here but not really usable" area that is specific to some displays, and Apple touts it as something that will just work thanks to their software layer.
Yet every app has to deal with it one way or another, and most apps that care about screen real estate or resolution will need hacks.
It will be the same for every feature that pushes the boundaries and offers an abstraction layer as an answer to how the issues will be solved.
The problem here is 100% that Apple, for some reason, provides both 16:10 "safe" resolutions and the actual whole screen resolutions. All those non-16:10 resolutions are a trap, because they cause the compositor to downscale the window to fit below the notch anyway -- they serve literally no point. There is no situation where a user would want a window that's 16:10.39 or whatever and then downscaled to fit within the bounds of the 16:10 safe area, so there is no reason to provide those 16:10.39 resolutions in the list of display modes.
Apple botched the implementation, but the hardware would've been perfectly reasonable and caused no issues or complexities for either users or developers if Apple hadn't botched the software.
https://www.reddit.com/r/MacOS/comments/qveb0u/force_true_fu...
Or a (user-configurable) setting per app, and if the user configures the app to run below the notch, the OS should return the 16:10 resolutions and make the app draw below the notch; and if the user configures the app to take up the whole screen, the OS should return the full resolutions and make the app draw to the whole screen.
There should be no case where the app gets told about the 16:10.34 resolutions through normal display mode query APIs, but then gets scaled by the OS to fit in the 16:10 area below the notch.
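For what it's worth, a rough sketch of the kind of filtering being argued for, using the CoreGraphics display-mode API (the hard-coded 16:10 ratio and the tolerance are illustrative assumptions, not what macOS actually does):

    import CoreGraphics

    // Keep only the modes whose aspect ratio matches the notch-free 16:10 area,
    // so a game never picks a mode that macOS will silently downscale to fit
    // below the notch.
    func notchSafeModes(for display: CGDirectDisplayID) -> [CGDisplayMode] {
        guard let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] else {
            return []
        }
        return modes.filter { mode in
            let ratio = Double(mode.width) / Double(mode.height)
            return abs(ratio - 16.0 / 10.0) < 0.001   // arbitrary tolerance
        }
    }

    // e.g. notchSafeModes(for: CGMainDisplayID())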
White-hot take: you’re allowed to own a PC and a Mac. They aren’t like matter and antimatter, you won’t collapse the galaxy or anything.
Well that's where you're wrong then. It's perfectly possible to game on a laptop. Over the last decade, with the development of the Thunderbolt ecosystem and docks, it's become a very low-friction endeavor as well.
There are a myriad of quality "gaming" laptops out there, but obviously that's not for everybody, and that's one of many reasons why workstation laptops exist (basically the same thing - ability to run at high loads for extended periods of time - without the gamer aesthetic).
There are many casual gamers out there who don't want to shell out for a dedicated gaming device, nor should they when a laptop is a perfectly adequate device to game on. It's not that hard to comprehend.
> There are a myriad of quality "gaming" laptops out there, but obviously that's not for everybody, and that's one of many reasons why workstation laptops exist (basically the same thing - ability to run at high loads for extended periods of time - without the gamer aesthetic).
For the last few years, I thought 'gaming' was more GPU heavy and 'workstations' were heavier on the CPU and RAM?
Though I wonder how much that has changed recently or will change soon: after all with the rise of (local) AI, perhaps more work will move to the GPU?
On workstations you can get both. The Dell Precision 7XXX, HP Zbook Fury, and Lenovo P5x/P7x series are the primary high-end notebook workstations, and they are all nearly infinitely configurable with a myriad of CPU, GPU, memory, storage, display, and connectivity options.
I myself have a Precision 7560 from 2021 that has a Xeon W 11955M, and an RTX 3080 Laptop GPU that's roughly between a desktop 3060 and 3070 in performance due to the power limit of 90 W.
Before that I had a Precision 7530 from 2018 that had a Xeon E 2176M.
Both Precisions have 4 DDR4 slots for up to 128 GB, 3 M.2 slots (and an additional one for the WiFi module), and full repair/service manuals online.
If you're talking x86, only if you're okay with hearing loss :)
Or really have no choice. Student who has space/funds for one device and it has to be portable, for example.
Push this logic one or two notches further and people should write and build code on desktop only with an e-ink portrait monitor.
Specialization has its place but asking a generic computing device to gracefully handle most applications isn't a big ask IMHO.
They're not running the latest top notch games or anything exotic, it should be fine with no tinkering.
And this is not a matter of laptop vs desktop, because most of these issues (not the notch) will be present on a Mac Studio as well.
In fact, Apple itself has started advertising gaming on Macs at recent WWDCs. So it is only fair that people complain about it.
But in general, gaming on Macs is perfectly feasible (maybe not the latest, most demanding game at max graphics settings, but most of the stuff). You may miss certain graphics features that aren't available on Macs, but otherwise performance is not bad. Even playing Windows versions of games through Whisky or CrossOver is perfectly feasible.
Just like people like the hand held form factor of the Nintendo Switch, but it's still entirely fine for them to complain about the system's low specs. Especially when the Steam Deck and the Rog Ally show that you can do better.
Additionally, laptops are more than capable of playing demanding games these days.
> White-hot take: you’re allowed to own a PC and a Mac.
I do own a console, a PC and two Macs. But for what I've paid for the Macs, Apple should make it easy for me to play games on their hardware if I so choose.
There is some hope: Lies of P is actually very playable on the minimum requirements M2/16G ram. And you get the Mac version on Steam with the Windows version so you're not limited to a single platform.
That's more of an aspiration than a statement of fact.
E.g. I'd happily have organised a crowdfunding campaign to give Vladimir Putin a few dozen billion dollars to bribe him away from starting a war. And if you look at Russia's finances, that's effectively what happened: he (and even more so his oligarchs) predictably lost untold billions in revenue and profit for this war.
Also, migration from poor to rich countries increases workers' pay so much, that you could charge them a hefty sum for the privilege and they would still benefit from coming. However voters by and large don't like it, even if you were to distribute the proceeds amongst them (as a "citizen's dividend" or whatever).
They have non-monetary reasons.
See https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d... for an interesting collection of historical cases drawn from such diverse sources as Apartheid and Post-Apartheid South Africa, Malaysia, Nazi Germany.
There are countless other incentives as tangible as money including meaning, status, security, fame, etc.
If you spend time around people with money you will find they will happily trade it to achieve these.
What this belief signals most strongly to me is your class.
What this "classism" mentality signals is your sense of superiority due to the amount of $ you have in a bank.
You literally said "What this belief signals most strongly to me is your class.". So that wasn't personal?
In the US of A, money is the ultimate status symbol and is often used as the measuring stick for the value of one's ideas, personhood, and everything in between.
I cannot stand it but it is what it is.
Do you remember the TV show Duck Dynasty, where a rural family from the South became millionaires selling sporting goods?
How would you compare their social status to that of a Berkeley undergrad whose parents work in the Bay Area?
Steve Jobs was dismissive about gaming, even when PC gaming was on the rise, that's the reason.
Games with app purchases are just a lucky turn of events for Apple because they still don't understand what games are. Hence the "oh, we now have desktop-level graphics on iPhone" or some other drivel they pronounce from time to time at WWDC.
From time to time someone in the company manages to push something forward (see the Game Porting Toolkit), but those efforts remain very few and far between.
[1] To be honest, I doubt any of the top brass even uses computers these days looking at the decisions they make wrt Mac OS.
I had a similar thought about the Microsoft manager who just claimed that in 2030 no one will use a mouse and keyboard anymore and people won't want to be "mousing around". If he actually believes that it says a lot about the way he interacts with computers. And how rarely he uses it outside of a quiet office.
That's nothing new: Microsoft still has an institutional obsession with pen-computing (going right back to 1992's "Windows for Pen Computing"); it re-emerges every few years: sometimes it's a major initiative like Windows XP TabletPCs[1], or re-launching OneNote as a "free" Windows feature, or adding "inking" support anywhere it doesn't belong (so if there's one good thing about the invasive MS Copilot awareness campaign, it's that it made me forget about how useless "inking" is and the shareholder value pissed away on it).
-----
[1] Speaking from personal experience: I had a Tecra M4 myself, and despite everything - a very high-resolution display, Wacom digitizer and stylus, NVIDIA's best laptop GPU, the best OS and Office integration yet (the 2005 version of the TabletPC Input Panel in Windows XP is a work of art, honestly), and the launch of a brand new product line, Office OneNote - the whole experience was really just still... a noticeable downgrade compared to using a good pen on good lined paper, and especially a significant downgrade from using a mouse and keyboard.
The main reason it was such a disappointment comes down to a small number of intractable problems that probably won't be solved on Windows for a long while still:
1. The time-latency between moving the stylus and the on-screen cursor (a hardware scanout sprite!) moving with it.
2. The time-latency between writing with the stylus and the on-screen ink being drawn (which is done in software, through multiple layers of buffers, making it easily 3-5x more laggy than the bufferless hardware cursor sprite).
3. Insufficient display resolution: even with a 125dpi display (compared to the 96dpi most laptops had at the time) of 1400x1050 @ 14 inches, natural handwriting with a pen on paper really needs to be rendered at 300dpi or maybe 600dpi.
4. That awkward gap between the actual LCD display screen and the plastic-y stylus-safe screen covering - this was long before iPad-style laminated displays existed too; which means there's parallax error - making it impossible to take the stylus off the screen and reposition it exactly where you expect: it would always land off-target by 3-5px if you're lucky. This alone kills cursive handwriting because you won't be able to accurately dot-your-Is-and-cross-your-Ts even if you have the best fine motor-control possible.
...and there were other issues too: handwriting recognition remains an evergreen joke going back decades[2]; while my Tecra was better at it than my PocketPC, it was still far short of the MS PDC demos showing a full Letter-sized paper of handwriting transforming into rich-formatted text with zero spelling mistakes. There were plenty of glitches in Windows GDI when you used the screen-rotation feature too (e.g. it disabled double-buffering for most Win32 controls; and everything would awkwardly lock-up for a few seconds after the rotation; iPad OS this surely was not).
(I'm sorry for writing obsessively about this; it's one of my... uh... autistic hangups)
[2] https://www.youtube.com/watch?v=FiorZ4yrbtk
-----
Apple gets a lot of things wrong; but when Apple get something right, the world knows. And what Apple got right was their understanding that pen-computing for text input is just plain unworkable; even on the iPad with Apple Pencil, it's clear they only intend it to be used for tasks like art/drawing or for simple shape-like annotations - neither of which require fine application of the stylus point on-screen. Whereas Microsoft just won't let the dream die; and I've no idea why.
Ah, the usual "inking is useless" hyperbolic drivel. Pens worked fine (enough) for me as longtime Wacom EMR user, including for handwriting, e. g. classroom notes, sketching, and the like, especially via dedicated third-party solutions. And, obviously, there's always room for improvement. Just as it's obvious that for that to happen you have to "keep the dream alive".
And Steve "Who needs a stylus" Jobs is, as overrated as he was, a dead businessman, while the stylus is, as underrated a tool, still kicking. Even in the castrated and enshittified pen-computing product line of the company he helped to build.
I've been doing pen computing since it began. It's an indispensable tool in my fields of work. The most problematic thing? Ultramobile and modular general-purpose computing is an underserved market; penabled UMGPCs (or UMPCs) are practically nonexistent. Sad.
Not knowing much about Macs, I would have thought games were supposed to render full screen around the notch for immersion but respect the safe area for UI and gameplay. Are they supposed to leave a menu bar on macOS?
> Control gets around the issue by just making up its own resolutions.
That's hilarious. I wonder if they had trouble enumerating resolutions and gave up, or if they simply couldn't be bothered.
Depends also on how the specific game is implemented, but often that area is just black and inaccessible, as in you cannot even move the cursor there. It is as if the screen no longer includes the area above the notch, i.e. how the screen would be if there were no notch, like on the M1 Air.
So I guess rendering around the notch is the intended experience, but devs don't seem to know or care enough to opt-out, and the bug here is that enumerating display resolutions doesn't take this compatibility mode into account.
[1]: https://developer.apple.com/documentation/bundleresources/in...
Not sure how much of that is still around but it was rampant for many years and likely a key to Windows success in gaming.
Or, even simpler (and AFAIK modern Windows does that too): if the running application doesn't say in its application manifest "I'm a modern application which understands the new things from operating system versions A, B, C and features X, Y, Z", you know it's a legacy application and you can activate the relevant compatibility shims.
They just need to adapt the API to also filter the list of resolutions when the compatibility mode is on.
I put "downside" in quotes because the alternative is just not having those vertical pixels at all, and having the screen end below the sensors.
The notch is wider than need be. As far as I can tell the main reason is aesthetics. But I agree, it is better than nothing
Notch: Camera, ambient light sensor, and a camera status LED
Dynamic island: camera, infrared camera and flood illuminator, proximity sensor, and an ambient light sensor... also half the width and height
Most applications do not handle this well, and I'm willing to bet that will continue to be the case for the foreseeable future.
I can see the allure. If you're a default macOS user with the menu bar not set to slide up when not in use, that space nicely fits. Problem still remains for fullscreen apps though.
And it's not even really a hardware problem. Apple could fix this in software. Just make using the unsafe region an opt-in for developers instead of the default. You want to paint in the unsafe region, call NSAllowUnsafeRegionResolution(). Boom done. By default, when the menu bar isn't shown, expose a rectangle to the viewport.
I was watching someone code in vim on a new Mac the other day and the notch covered part of the first line of their editor! Like how does this not completely break you as a human. Maybe I'm a little OCD, idk.
Isn't there just like an option to change the resolution and fix this? Apple should really have a "disable notch" mode.
iPhones have the same problem. Ever played a fullscreen game and had part of the UI cut off into the notch or island. Yep, it's a problem.
Having just the rectangular screen means not having the option of making those things visible, or having a smaller screen.
Think of them as something that allows the overall screen area to increase, as bevels shrink.
And then when the corners of the screen are so close to the corner of the laptop, and the corner of the laptop is rounded, it looks weird if the corner of the screen isn't rounded. Like a square peg in a round hole.
Fortunately, it's all context-dependent. So when you watch a video that is inherently rectangular on a Mac, that takes precedence. It's only shown below the notch, and the rounded corners on the bottom disappear.
So it's kind of the best of all worlds. Bigger screen with round corners for actual work (the notch is not particularly objectionable in the menu bar), slightly smaller screen with rectangular corners for video (and games too I assume?).
And I don't know about you, but I don't hold the screen half of my laptop. Not unless the laptop is closed. So I don't see how that's a concern.
Is the similarity of the word the only reason they often get mixed up? I think another factor is that flat surfaces surrounded by an angled bevel are fairly common. For example, I just noticed that the bezel of one of my monitors is also beveled.
And the other comment about wasting screen space is funny. Yeah we need 1mm of extra phone screen space when 60% of most webpages are covered with ads (separate problem but still amusing in combination).
You want more bezel. Screens with less bezel have less overall edge protection and will break easier due to opening or closing forces or impacts.
Bezel-less screens are fragile pieces of junk.
Another was when my kitten bit the edge of the screen. It didn't even leave a mark but it was just enough concentrated pressure at the edge of the screen that it killed it.
Now I'm a lot more careful, but thank goodness for Applecare.
The rounded laptop corners are a similar design decision. And it doesn't look weird to me at all compared to rounded corners on windows, especially when those introduce visible desktop garbage in the corner areas their no-longer-perfect rectangles don't cover.
It's an actual objective fact of life.
You don't get a larger 15"/16"/17" inch screen. You get a screen that size minus the notch because of a psychotic obsession with thinness. And then they struggle to compensate for that with barely working workarounds in software that don't cover even half of cases.
The notch makes for a smaller menu bar but without the notch there would be no menu bar there, it would take the space underneath instead.
Yup. We never had camera before the notch.
> The notch makes for a smaller menu bar but without the notch there would be no menu bar there
Yup. Before the notch we neither had a menu bar that could comfortably fit most menu items even in professional apps, nor did we have a camera.
BTW, you literally are saying "The notch makes for a smaller menu bar". Imagine if I wrote that as the first sentence in my comment, then there would be no misunderstanding
Also my response here: https://news.ycombinator.com/item?id=44909996
Personally, I've never run out of space in my menu bar, so the notch gives me 1cm of extra screen space.
It's nothing to do with thinness. It's about packing the largest possible display into the laptop's width/height. Sure, you could argue to just make the laptop 1cm higher for that bezel, but then why not add a notch and get 2cm of extra screen height?
That perceived 1cm is largely meaningless for content. And you get less space in the top menu bar.
> Personally, I've never run out of space in my menu bar
I have 27 icons in my menu bar. Not because I collect them, but because quite a few apps add their icons there and I use a few of them.
On the laptop screen it manages to show 10.
IntelliJ idea has 12 top-level menus (I swear they had more). On a laptop the top menu bar manages to show 10 items on the left of the notch, and has to move two more to the right. This both splits the menu for no reason, and reduces the space for icons even further.
The notch has been around for 4 years now, and Apple still hasn't provided a solution for the problem they introduced.
And, of course, when you want to truly take advantage of "more content" you can't, because the "safe screen space" without the notch is still squarely below the notch, and apps have to be very careful to actually use that, or the notch will get in the way.
> It's nothing to do with thinness.
Yes, it does. In this case with thinness of bezels.
> Sure, you could argue to just make the laptop 1cm higher for that bezel
Yes, you could do that if you didn't have an institutional psychosis about thinness everywhere.
Luckily I have mostly used Macs on external screens the last few years. But it ticks me off every time I actually use a MacBook as a laptop.
It's actually optional; the functionality is built into macOS (just check "show all resolutions" in Display settings and pick the notchless resolution). You get that bezel you want, along with the full menu bar.
> Yes, it does. In this case with thinness of bezels.
Device thickness is not the same thing as bezel thinness. If you make a phone/laptop thinner, all you get is less battery life. If you make bezels thinner, you get more display area.
As a lot of people told you, you can just disable it. I've been doing that for 4 years, just set your resolution to a 16:10 ratio and you're good to go. The resolution is exactly the same as it was before they introduced the notch
Personally I like the fact that Apple gives us the choice. I dislike the notch and prefer my menu bar below because I use apps like intellij. My wife likes the notch and keeps it. So, both of us can have what we want.
Maybe Apple could have made it slightly easier to disable it by having an option instead of choosing a 16:10 resolution but, to be honest, most of the people who dislike it tend to be power users who can figure it out.
If you want to avoid the extra space, it's as easy as using a 16:10 resolution size. The menubar will drop down to the 16:10 space.
To change it you have to first display the hidden list by enabling a “display resolutions list” toggle.
That is not something an “I love Apple because it just works” person can figure out.
Now for the bezels, mind that an equidistant circle (or squircle) radius converges to zero. So to avoid having a large inner radius, avoid a large outer radius. The Macbook is not a tablet, you do not hold its corners in your hand.
However, at some point Apple must have decided that the squircle is its entire visual identity and that hard corners on the XY plane are bad. That creates design problems it would not otherwise have.
The fact that developers are led into the trap of not pixel matching to the display however just shows a lack of attention to detail.
They are on my M4 MBA. They're software-rounded rather than in the hardware. You can move the pointer over the lower rounded corners; you can't over the upper ones.
Also, it's worth noting that notches are hardware-specific. I haven't tried any of the MBAs with notches, but the MBPs have good-enough black levels that the notch just kind of vanishes into the black background when using full-screen apps.
IMO the notch is pointless, but they need space for the front camera. With OLED they can just turn the pixels off when it suits the application and it becomes like a big bevel, which was the alternative anyway.
Bezel not bevel FYI.
You have to jump through hoops with janky tools[1] that actually let you see and access the icons silently hidden because they overshot the notch.
Apple chose not to have hinting so as to make the text more dimensionally accurate. That did look blurry ... so Apple then doubled the screen resolution.
obligatory: https://www.folklore.org/Round_Rects_Are_Everywhere.html
I feel like the only one that's still amazed by tech sometimes. I still look at planes and feel amazed we puny humans did that. It's never good enough.
Guess it's just a matter of taste. I'm never bothered by it.
1. Don't have a front-facing camera on your laptop. Your actual laptop screen can be as close to the edge of the laptop case as technology will allow, with no notch.
2. Have a front-facing camera on your laptop, with no notch. Your actual laptop screen is now a plain rectangle, but has to be 5mm shorter now. There's a 5mm strip at the top of your laptop that can't be used for anything.
3. Have a front-facing camera on your laptop, with a notch. Now the actual laptop screen has a "dead space" in the middle at the top, but can again be as close to the edge of the laptop case as technology will allow. There's a narrow strip at the top that can't be used for anything, but the area to the left and right of the notch can.
(A number 4 would be to somehow have a front-facing camera that operates without needing to displace screen area. Not clear how this would work without some complicated mechanism to extend the camera out the top of the screen, which would come with its own problems.)
Now, the vast majority of the time, you're going to be using your Mac in a windowed environment, with menus on the left, indicators on the right, and absolutely nothing in the middle.
In the case of #2, this menu bar has to take real estate from the top of your shorter screen, meaning that your windows are all 5mm shorter. #3 allows the menu bar and indicators to take up that space which on #2 is completely dead, freeing up extra space for your actual applications.
And the key thing is this: for modern full-screen games, in #3, you (apparently) can't use the areas to the side of the notch; but this is the same situation you're in with #2.
IOW, as another commenter has said: the notch design doesn't take away screen in the middle; it adds screen space to the sides of the notch.
That said, the API here seems obviously mad: What's the point of giving you a resolution if it's going to silently resize it behind your back? It should either give you the resolution which it won't resize, or throw an error when you try to make one higher.
> There's a 5mm strip at the top of your laptop that can't be used for anything.
Well, it can - on mine I have a physical switch that allows me to block the lens, and it saved me a few times. (I just wish I had a similar one for the microphone.)
I don't know, Lenovo figured it out, and it's no more ridiculous than Apple's solution. Their notch just goes the other way. It's actually a little bit better, because it gives me a space to put my thumb when I'm opening the laptop to, you know, use it.
Sometimes it seems to me like Apple engineers do not use Apple products, because if they did, there's no way we'd have problems like disappearing icons without a way to get them back (Hell, Windows has had this ability for DECADES, why can't Apple "invent" this technology?). We wouldn't have edge-to-edge glass screens which seem to exist solely to get fingerprints on the screen. We wouldn't have closed lids touching the keyboard, leaving weird patterns on the smudgy glass.
The only other option is that design at Apple is not driven by designers or engineers, but by executives who have no clue. And that's worse.
When you look at the remaining bezel along with the camera unit, the lens is the same size. They really could have built a more or less bezel-less Mac if they had just stretched out the components in that camera unit.
These software-rounded corners disappeared when CRTs gave way to flat panels, but they are indisputably part of the Apple design aesthetic.
I have no idea why people praise these laptops, for a cpu I guess. Their design is a joke and feels like a Playskool laptop toy.
At least when the old Apple forced “phablets” as the standard phone size they included a software work around for reaching the top of the screen.
Not adding a dropdown menu for the toolbar is brain dead.
https://github.com/jordanbaird/Ice
https://github.com/FelixKratz/SketchyBar
I think there's a native solution (finally) coming in the next MacOS.
It's a valid complaint, don't get me wrong... but insisting on not fixing it as some sort of moral stance is only hurting yourself.
People are weird, man.
I can do a lot of things instead of writing that comment. And installing workarounds for a trillion-dollar company with an attitude problem is not one of them. Give attitude, get attitude.
What’s weird is being a complacent drone even though you understand the issue.
(Upholding HN tradition here)
How is vague "safety" better than a simple, descriptive rect_below_notch?
Your container views can extend the safe areas for their children as well. In our apps, which allow users to run their own custom projects, we increase the safe area for our UI so that users can avoid it in their own rendering.
Safe area is a fairly neat and functional API. The unfortunate thing is the older `CGDisplayCopyAllDisplayModes` API is just lumping all resolutions together.
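For reference, a minimal AppKit sketch of those two halves (assumes macOS 12+ for NSScreen.safeAreaInsets; the 40-point extra inset is a made-up example of the "extend the safe area for children" idea, not our actual code):

    import AppKit

    // Screen-level: how much of the top is taken by the notch, if any.
    if let screen = NSScreen.main {
        let insets = screen.safeAreaInsets            // .top > 0 on notched MacBooks
        print("usable height:", screen.frame.height - insets.top)
    }

    // View-level: a container can grow the safe area its children see, e.g. to
    // keep user-rendered content clear of an overlay UI.
    let container = NSView(frame: .zero)
    container.additionalSafeAreaInsets = NSEdgeInsets(top: 40, left: 0, bottom: 0, right: 0)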
But also, you don't need to degrade the Mac dev experience for games just to accommodate iPads that the games will never be developed on; aliases exist.
I assume they also wanted to choose something that would still be named appropriately if they had other kinds of notches or things in the future (and that even now it already accounts for, say, the curved corners where things also can't display).
Your suggestion here and in the responses would be bucking industry standard terminology for content creation.
Your suggestion for unobstructed area is also incorrect because you cannot guarantee it’s unobstructed depending on the display it’s on. Different displays and configurations might have differing levels of overscan causing content to be clipped.
> The distances from the screen’s edges at which content isn’t obscured.
> because you cannot guarantee
That's a condition you've added without justification. Nothing in the term requires it. The term's core benefit is that it's clearer. Nothing more (in terms of extra guarantees), but nothing less.
But also, you can guarantee, read the API docs again:
> Content in the safe area is guaranteed to be unobscured
Like,
> “safe area to display in”.
So it's not a "safe area", it's "potentially a safe area".
> bucking industry standard terminology
yes, bad standards should be bucked, preferably before they become standards, but the next best time is immediately after that or any time later.
What is an absolute safe area or an unobstructed area in video then to you? How do you guarantee such a thing?
Please define one as such that accounts for all display types with appropriate verbiage. I’ll personally advocate for it in the film industry if you can come up with a term that is better than what the entirety of cinematic history has been able to think of.
Apple only claim that it’s unobstructed by their specific display edges and overlays but it doesn’t account for other display concerns. Hence safe area is still the most intuitive name.
Would you also like to go and argue with people about alpha channels?
I'm not as bad at this as you are: I've explained why it's bad several times without a reference to my understanding, but you fail to understand that primitive explanation, so now you think others also can't.
> Lots of things have historical names, especially in computer graphics, that are now standard.
So? Lots of things also change. You forgot to finish your thought.
> How do you guarantee such a thing?
I don't, I use an API that does
> Please define one as such that accounts for all display types with appropriate verbiage.
Your favorite term doesn't do that.
> I’ll personally advocate for it in the film industry if you can come up with a term that is better than what the entirety of cinematic history has been able to think of.
Oh no, that's a recipe for disaster, please stick to worshipping historic standards, don't ruin a tiny chance of progress!
> Apple only claim that it’s unobstructed by their specific display edges and overlays but it doesn’t account for other display concerns. Hence safe area is still the most intuitive name.
Apple only claim that unobstructed API is unobstructed by their specific display edges and overlays but it doesn’t account for other display concerns. Hence unobstructed area is still the most intuitive name because it still is the closest to the reality, and because "safe" is still not safe, so offers no benefit.
> Would you also like to go and argue with people about alpha channels?
What, are they sacred?
None of your suggestions actually improve the situation. They’d just not scale the way you think they would to other display types when you query the safe area.
For example, Apple specifically calls out why they use the safe area term to correspond to different display types
> In tvOS, the safe area also includes the screen’s overscan insets, which represent the area covered by the screen’s bezel.
Unobstructed is longer, and provides no extra information of value. Go work in a visual industry and see how long before you get tired of saying title unobstructed and action unobstructed.
Again, you're coming at this from the perspective of a layperson in the realm of displays and getting angered that the term isn't immediately intuitive to you personally, rather than to the entire industry, which has a standard term that it has used for many decades.
Well, yes.
I have both a powerful gaming PC with Windows and all that, and a PS5, but some games I just like on my MacBook.
A few months ago I would have assumed you're correct.
But recently I've played some games on my laptop with Lutris on Ubuntu which uses Wine or Proton under the hood.
The performance and stability are excellent. Although I haven't done any testing, subjectively it feels superior or at least equal to Windows. I've played several intense titles which push my laptop's GPU (Nvidia 3070m) to its limit, most recently the new Indiana Jones game (which is excellent FYI).
It is easier for me to reboot into Windows.
Mine is a lovely machine for the most part but overheats a lot when gaming, so much so that it worries me. So I don't bother.
There's a kind of entitlement among developers coming from PC/Windows, expecting Apple to give them everything on a silver platter, yet they gladly bend over when porting to consoles without a peep.
I mean yeah Apple documentation sucks ass sometimes but sheesh you know the notch is fucking there, you have to account for it. You gotta do some specific shit for PS/XBox/Switch too, can't always just expect a one-click port (yet). Hell, there isn't even any public/3rd-party documentation or articles or YouTube tutorials for consoles but you don't see anyone bitching about that.
Group all returned resolutions that are within ±5% of each other and choose the lowest one in the desired bucket.
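Something like this, presumably (a Swift sketch; the Mode type and the 5% tolerance are just placeholders for whatever the windowing API actually returns):

    struct Mode { let width: Int; let height: Int }

    // Collapse near-duplicate resolutions: walk modes from smallest to largest
    // and drop any mode within ~5% of one we've already kept.
    func collapse(_ modes: [Mode]) -> [Mode] {
        var kept: [Mode] = []
        for mode in modes.sorted(by: { $0.width * $0.height < $1.width * $1.height }) {
            let nearDuplicate = kept.contains { k in
                abs(Double(k.width) - Double(mode.width)) / Double(k.width) < 0.05 &&
                abs(Double(k.height) - Double(mode.height)) / Double(k.height) < 0.05
            }
            if !nearDuplicate { kept.append(mode) }   // the smallest of each bucket wins
        }
        return kept
    }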
reactordev•5mo ago
This was an issue I also discovered on Xbox 360 in 2008. TVs have overscan and, depending on that setting, your resolutions will be off.
However, at the time, we couldn’t create render targets that matched the overscan safe area. XNA added a Screen SafeArea rect to help guide people but it was still an issue that you had to consciously develop for.
Now, we can create any back buffer size we want. It’s best to create one 1:1 or use DLSS with a target of 1:1 to the safe area for best results. I’m glad the author went and reported it but ultimately it’s up to developers to know Screen Resolution != Render Resolution.
Anyone using wgpu/vulkan/AppKit/SDL/glfw/etc needs to know this.
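On the Mac side, that might look something like the sketch below: sizing a render target to the notch-free area instead of the raw screen size (assumes macOS 12+ for NSScreen.safeAreaInsets; the Metal-layer usage in the comment is just one way to consume it):

    import AppKit

    // Render-target size matching the usable (notch-free) area, in pixels.
    func renderSize(for screen: NSScreen) -> CGSize {
        let insets = screen.safeAreaInsets             // .top > 0 on notched MacBooks
        let scale = screen.backingScaleFactor
        return CGSize(width: (screen.frame.width - insets.left - insets.right) * scale,
                      height: (screen.frame.height - insets.top - insets.bottom) * scale)
    }

    // e.g. metalLayer.drawableSize = renderSize(for: someScreen)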
DaiPlusPlus•5mo ago
(Besides, TV overscan is a solved problem: instead of specifically rendering a smaller frame games should let users set a custom FoV and custom HUD/GUI size - thus solving 3 problems at once without having to compromise anything).
rustystump•5mo ago
Many games do let users set the things you mention but it is not always so simple. For example, handling rounded edges and notches is a huge pain.
reactordev•5mo ago
Basically, if you rendered an avatar image in the top left of the screen, perfectly placed on your monitor, on the TV its head would be cut off. So you change to safe area resolution and it’s perfect again (but on your monitor safe area and screen resolution are the same, except Apple apparently). Make sense?
You can see how if your screen says it’s 4k, but really it’s short 40 pixels, you render at 4k - the screen will shrink it by 40 pixels and introduce nasty pixel artifacts. TV overscan goes the other way. Interesting find by the author about the notch.