Also just noticed this:
"And now with M5, the new 14-inch MacBook Pro and iPad Pro benefit from dramatically accelerated processing for AI-driven workflows, such as running diffusion models in apps like Draw Things, or running large language models locally using platforms like webAI."
First time I've ever heard of webAI - I wonder how they got themselves that mention?
I wondered the same. Went into Crunchbase and found out Crunchbase are now fully paywalled (!), well saw that coming... Anyway, hit the webAI blog, apparently they were showcased at the M4 Macbook Air event in 2024 [1] [2]:
> During a demonstration, a 15-inch Air ran a webAI’s 22 billion parameter Companion large language model, rendered a 4K image using the Blender app, opened several productivity apps, and ran the game Wuthering Waves without any kind of slowdown.
My guess is this was the best LLM use case Apple could dig up for their local-first AI strategy. And Apple Silicon is the best hardware use case webAI could dig up for their local-first AI strategy. As for Apple, other examples would look too hacky, purely dev-oriented, or dependent on LLM behemoths from the US or China. I.e., "try your brand-new performant M5 chip with LM Studio loaded with China's DeepSeek or Meta's Llama" is an Apple exec no-go.
1. https://www.webai.com/blog/why-apples-m4-macbook-air-is-a-mi...
2. https://finance.yahoo.com/news/apple-updates-bestselling-mac...
Now that they own the SoC design pipeline, they’re really able to flex these muscles.
Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.
They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.
Curiously, they managed to figure this out exactly when it became their own silicon instead (M1 MacBook Pros were notably thicker and had more cooling capacity than the outgoing Intel ones).
And this would eventually evolve into MacOS.
But I have had two iMac power supplies die on me, the grounding problem on an MBP, and a major annoyance with power noise leaking from a Mac Mini (makes for some nasty audio output, hilarious when you consider they supposedly target creatives who clearly need good audio output).
You always find people raving about Apple's engineering prowess but my experience is that it's mostly smoke and mirrors: they make things look good, miniaturise/oversimplify beyond what is reasonable, and you often end up with major hardware flaws that are just a pain to deal with.
They always managed to have good performance and a premium feeling package but I don't think their engineering tradeoffs are actually very good most of the time.
As far as I can tell, the new Mac Mini design still has grounding issues, and you will get humming issues, which is beyond stupid for a product of that caliber. At this point I don't care about having the power supply inside the damn box; just use a brick if you must to prevent that sort of problem. This is particularly infuriating since they made the iMac PSU external, which is beyond stupid for an AiO.
But common sense left Apple a long time ago and now they just chase spec benchmarks and fashionable UIs above everything.
That is probably the least of reasons why people buy Apple - to many it's just a status symbol, and the OS is a secondary consideration.
https://www.google.com/search?q=apple+products+as+status+sym...
EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:
https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...
[1] They used that exact term, and it has stuck with me ever since.
Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.
Their services revenue is $80B with $60B gross margin.
https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...
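Quick sanity check on the split those numbers imply (just arithmetic on the figures above):

    # Gross-margin split implied by the FY2025 figures quoted above, in $B.
    product_margin = 86   # products (hardware)
    services_margin = 60  # services
    hardware_share = product_margin / (product_margin + services_margin)
    print(f"{hardware_share:.0%}")  # ~59% of gross margin comes from hardware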
Look, I totally understand making an off-hand comment like you did based on a gut feeling. Nobody can fact-check everything they write, and everyone is wrong sometimes. But it is pretty lazy to demand a source when you were just making things up. When challenged with specific and verifiable numbers, you should have checked the single obvious source for the financials of any public company: their quarterly statements.
Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.
Watch this and maybe you might change your mind:
Modern Apple is also quite a bit more integrated. A company designing their own highly competitive CPUs is more hardware-oriented than one that gets their CPUs off the shelf from Intel.
Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.
We should be comparing profit on those departments not revenue. Do you have those figures?
It is well known that companies often sell the physical devices at a loss, in order to make the real money from the services on top.
Apple is and always has been a HW company first.
Steve Jobs consistently made the point that Apple's hardware is the same as everyone else's; what makes them different is they make the best software, which enables the best user experience.
Here see this quote from Steve Jobs which shows that his attitude is the complete opposite of what you wrote.
The above link is a video where he mentions that.
It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he said in a different interview I can’t find right now so I will paraphrase) is that he felt Apple made the best hardware and software in the world so he wanted Apple’s customers to experience the best software on the best hardware possible which he felt only Apple could provide. (I wish I could find the exact quote.)
Anyway according to Steve Jobs Apple is a software first company.
If you care about software you have to make your own hardware.
I'll allow that perhaps Apple considers hardware a means to an end. But what an end.
It was coherent, (relatively) bug free, and lacked the idiot level iOSification and nagging that is creeping in all over MacOS today.
I hadn't had to restart Finder until recently, but now even that has trouble with things like network drives.
I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.
It shouldn't surprise you I think that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.
Programs absolutely could have had much more controllable auto-save before, for the cases where it made sense.
Speaking of security, it didn't have app sandboxing either.
This is what I mean about iOSification - it's trending towards being a non-serious OS. Linux gets more attractive by the day, and it really is the absence of proper support for hardware in the class of the M series that prevents a critical mass of devs from jumping ship.
Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going through to boycott/battle the enshittification.
At least it's open source and free, I guess.
Adding extra features that aren't necessarily needed is enshittification, and very not-unix.
This, and while in this case it is specifically unwise in security terms, there are plenty of other examples where the features are completely cosmetic and deviate from the core user requirements/scenarios.
That would be the end of open source, hobbyists and startup companies because you'd have to pay up just to have a basic C library (or hope some companies would have reasonable licensing and support fees).
Remember one of the first GNU projects was GCC because a compiler was an expensive, optional piece of software on the UNIX systems in those days.
It's not even about open source or closed source at this point. It's about feature creep.
Why parse whatever is in the logs, at all?
Imagine the same stuff in your SSH client: it would parse the content before sending it over because some functionality requires it to talk to a server somewhere. It's insanity.
There's also Krita, which artists love.
That this comment keeps oscillating between upvoted and downvoted (with significant spikes in both directions) is an interesting insight into the span of opinions on HN between the hustler types who hate the idea of software that doesn't turn a quick buck, and the crafters :-)
> Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.
https://daringfireball.net/2009/11/the_os_opportunity
At the time I'd only been a Mac user for a few years and I would have strongly agreed. But definitely things have shifted— I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than I am the software, and it's not even really close.
It's a server or developer box first and a non-technical user's machine second.
On Linux there is variety and choice, which some folks dislike.
But on the Mac I get whatever Apple gives me, and that is often subject to the limitations of corporate attention spans and development budgets.
And arbitrary turf wars like their war against web apis/apps causing more friction for devs and end users.
Should Emacs and Vim both be called "Editor" then?
To me, this is actually a great example of the problems with Linux as a community: GUI applications seem to just be treated as placeholders (e.g., all word processors are the same?), but then it's inconsistent by celebrating the unique differences between editors like Vim and Emacs. Photoshop, Excel, Logic Pro, and Final Cut Pro are, in my opinion, crown jewels of what we've accomplished in computing, and by extension some of the greatest creations of the human race, democratizing tasks that in some cases would have cost millions of dollars before (e.g., a recording studio in your home). Relegating these to generic names like "spreadsheet" makes them sound interchangeable, when in my opinion they're each individual creations of great beauty that should wear their names with pride. They've helped improve the trajectory of the human race by facilitating many individuals to perform actions they never would have had the resources to do otherwise.
I've used some distributions in which they were. Tooltips and icons were provided to disambiguate. Worked for me.
Other distributions name applications explicitly, some place them in a folder together named "Editors".
None of the distributions I've used place either in a corporate branded subfolder as is typical on Windows and Mac.
Freedom of choice is wonderful.
But, to your point, even I'll admit the fact that Photoshop is called "Adobe Photoshop 2025" is annoying lol.
If I close my laptop for a few days, I don't want significant battery drain. If I don't use it for two weeks, I want it to still have life left. And I don't want to write tens of gigabytes to disk every time I close the lid, either!
If you're talking about hardware interaction from the command line, that's very different and I don't think there's a fix.
I want good window management. Linux gives me a huge number of options. MacOS - not as much.
One can just hand wave "Apple must support Linux and all" but that is not going to get anything done.
Edit: Hard to call intentionally preventing support for web apis a power user thing. This creates more friction for basic users trying to use any web app.
Edit2: lol Apple PR must be all over this, went from +5 to -1 in a single refresh. Flagged for even criticizing what they intentionally break.
I understand that this post is about MacOS, but yes, we are forced to support Safari for iOS. Many of these corporate decisions to prevent web apps from functioning properly spill over from MacOS Safari to iOS Safari.
On iOS you cannot even keep a web app running in the background. The second they multitask, even with audio/microphone active, Apple kills it. Are they truly adding battery life or are they cheating by creating restrictions that prevent apps from working?
Being able to conduct a voice call through the browser seems like a pretty basic use case to me.
For a simple example, no app remembers the last directory you were working in. The keys each app uses are completely inconsistent from app to app. And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor. Then there’s the Windows 95-style dialog boxes mixed in with the Windows 11-style dialog boxes; what a UI mess. I spoke with one vendor the other day who was actually proud they’d adopted a ribbon interface in their UI “just like Office” and I verbally laughed.
From a hardware perspective, I still don’t understand why Windows and laptop manufacturers can’t get sleep working right. My Intel MacBook Pro with an old battery still sleeps and wakes and lasts for several hours, while my new Windows laptop lasts about an hour and won’t wake from hibernate half the time without a hard reboot.
I think Windows is the “good enough” for most people.
While overall I may say MacOS is better, I would not say it's better in every way.
Believe it or not, I had a better experience with 3rd party window managers in Windows than on MacOS.
I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).
And for corporate work, the integration with Windows is much better than anything I've seen on MacOS.
Mac HW is great. The OS is in that uncanny valley where it's UNIX, but not as good as Linux.
Did you try Keyboard Maestro? https://www.keyboardmaestro.com/main/ (I've never used AutoHotKey and I'd be super curious if there are deficiencies in KM relative to it, but Keyboard Maestro is, from my perspective, a masterpiece; it's hard to imagine it being any better.)
Also I think this statement needs a stronger defense given macOS includes Shortcuts, Automator, and AppleScript. I don't know much about Windows automation, but I've never heard of it having something like AppleScript (that can, say, migrate data between applications without using GUI scripting [e.g., iterate through open browser tabs and create todos from each of them, operating directly on the application data rather than scripting the UI]).
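To make the tabs-to-todos example concrete, here's a minimal sketch driving the stock osascript tool from Python; the specific Safari/Reminders script is my own illustration (and assumes the usual Automation permission prompts), not anything official:

    # Illustrative only: collect every tab URL from Safari's front window and
    # file each one as a reminder, without touching either app's UI.
    import subprocess

    script = '''
    tell application "Safari"
        set tabURLs to URL of every tab of front window
    end tell
    tell application "Reminders"
        repeat with theURL in tabURLs
            make new reminder with properties {name:(theURL as text)}
        end repeat
    end tell
    '''

    subprocess.run(["osascript", "-e", script], check=True)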
So, Windows' saving grace is being able to run a different operating system inside it? Damning with faint praise if I ever heard it...
You can't, really. Almost everyone resorts to buying an HDMI dongle to fake a display. Apple solved the problem at such a low level, the flexibility to run something in clamshell mode is broken, even when using caffeine/amphetamine/etc etc etc.
So, tradeoffs. They made their laptops go to sleep very well, but broke functionality in the process. You can argue it's a good tradeoff, just acknowledge that there WAS a tradeoff made.
If I’m wrong, someone tell me how to do it! On an M4 MacBook Air running latest OSX release.
Oh god, I'm going to have to bite the bullet and switch to 11, huh?
The one thing that has been saving me from throwing my PC out the window in rage has been the monitor I have that supports a "keep alive" mode where switching inputs is transparent to the computers connected to it. So when switching inputs between my PC and laptop neither one thinks the monitor is being disconnected/reconnected. If it wasn't for that, I'd be screaming "WHY ARE YOU MOVING ALL MY WINDOWS?" on a regular basis. (Seriously, why are you moving all my windows? Sure, if they're on the display that was just disconnected, I get you. But when I connect a new display, Windows 10 seems to throw a dart at the display space for every window and shuffle them to new locations. Windows that live in a specific place on a specific display 100% of the time just fly around for no reason. Please god just stop.)
A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail). A number of others were affected by the same issue. There have been show-stopper bugs in the core functionality of Photos as well. I don't get the impression that the basics are Apple's focus with respect to software.
But I’ve certainly never struggled with getting WiFi to work on a Mac, or struggled with getting it to sleep/wake, or a host of other problems you routinely have on both Windows and Linux.
It’s not even close.
To compare Apples to apples, you'd have to look at a Framework computer and agree that wifi is going to work out of the box... but here I'm meeting you on a much weaker argument: "Apple's software basics are /not/ rock solid, but other platforms have issues too"
I don't find your original anecdote convincing:
> A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail).
E.g., what does this mean? They lost mail messages? How did they verify they had those messages before and after? E.g., file-system operations? GUI search? How much do they know about how the Mail app stores messages (e.g., I used to try to understand this decades ago, but I expect today messages aren't even necessarily always stored locally)? How are you syncing mail messages, e.g., using native IMAP, or whatever Gmail uses, or Exchange? What's the email backend?
E.g., without deeper evidence this sounds more like a mail-message-indexing issue rather than a mail-messages-stored-on-disk issue (in 2025, I'd personally have zero expectations about how Mail manages messages on disk; e.g., I'd expect local storage of messages to be dynamically managed, like most applications that aren't document-based, using a combination of cloud functionality and local caching, e.g., found this in a quick search https://apple.stackexchange.com/questions/471801/ensure-maco...), but if you have stronger evidence I'd love to hear it. As presented, you're extrapolating much stronger conclusions than are warranted by the anecdote, in my opinion.
I want to be able to set different networking options (manual DNS, etc) for different wifi networks, but as far as I can tell, I can only set them per network interface.
There's something like "locations" but last time I tried using that, the entire System Settings.app slowed to a crawl / beachballed until I managed to turn it back off.
> or struggled with getting it to sleep/wake
My m1 MBP uses something like 3-5% of its battery per hour while sleeping, because something keeps waking it up. I tried some app that is designed to help you diagnose the issue but came up empty-handed.
... but yes on both counts, it's light years better than my last experience with Linux, even on hardware that's supposed to have fantastic support (thinkpads).
In my case it works roughly ~50% of the time. Probably because of the Thunderbolt monitor connected to power it, idk.
> the basics are still rock solid
The basics like the OS flat out refusing to provide you any debugging information on anything going wrong? It's rock solid alright. I had an issue where occasionally I would get an error: "a USB device is using too much power, try unplugging it and replugging it." Which device? Why the hell would Apple tell you that, where is the fun in that?
Key remapping requires installing what amounts to a keylogger, and you can't have a different scroll direction for the mouse and the touchpad. There still isn't proper window management, which for the sizes of modern monitors is quite constraining.
> still has UNIX underneath
A very constrained UNIX. A couple of weeks ago I wanted to test something (pkcs11-tool signing with a software HSM), and turns out that Apple has decided that libraries can only be loaded from a number of authorised locations which can only be accessed while installing an application. You can't just use a dynamic library you're linking to, it has to be part of a wider install.
You can remap with config files: https://hidutil-generator.netlify.app
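For the curious, what that generator produces boils down to a hidutil UserKeyMapping call. A minimal sketch (wrapped in Python just for illustration) that remaps Caps Lock to Escape; the usage IDs come from the HID keyboard usage table, and the mapping doesn't survive a reboot unless you also install the LaunchAgent the linked site generates:

    # Illustrative sketch: remap Caps Lock (0x700000039) to Escape (0x700000029)
    # by handing a UserKeyMapping property to the built-in hidutil tool.
    import subprocess

    mapping = (
        '{"UserKeyMapping":[{'
        '"HIDKeyboardModifierMappingSrc":0x700000039,'
        '"HIDKeyboardModifierMappingDst":0x700000029}]}'
    )

    subprocess.run(["hidutil", "property", "--set", mapping], check=True)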
Long story short, I was very happy with the "it just works" of ChromeOS, and only let down by the lack of support for some installed apps I truly needed in my personal life. I tried a Mac back in 2015 but couldn't get used to how different it was, and it felt very bulky compared to ChromeOS and much slower than the Linux machine I'd had, so I switched to a Pixelbook as was pretty content.
Fast forward to 2023 when I needed to purchase a new personal laptop. I'd bought my daughter a Pixelbook Go in 2021 and my son a Lenovo x1 Carbon at the same time. Windows was such a dumpster fire I absolutely ruled it out, and since I could run all the apps I needed on ChromeOS it was between Linux & Mac. I decided to try a Mac again, for both work & personal, and I've been a very happy convert ever since.
My M2 Pro has been rock solid, and although I regret choosing to upgrade to Sequoia recently, it still makes me feel better than using Windows. M4 Pro for work is amazingly performant and I still can't get over the battery efficiency. The nicest thing, imho, is that the platform has been around long enough for a mature & vibrant ecosystem of quality-of-life utilities to exist at this point, so even little niggles (like why do I need the Scroll Reverser app at all?) are easy to deal with, and all my media editing apps are natively available.
They got away with pushing ads, online and enterprise services, Copilot, etc. to every desktop user.
Not once in 10 years have I had to troubleshoot while she uses her personal macOS, but a Dell Latitude laptop in 2025 still can't just "open lid, work, close lid".
And it’s slower. And eats more battery.
Quoth the Tao of Programming:
8.4
Hardware met Software on the road to Changtse. Software said: "You are Yin and I am Yang. If we travel together, we will become famous and earn vast sums of money." And so they set forth together, thinking to conquer the world.
Presently, they met Firmware, who was dressed in tattered rags and hobbled along propped on a thorny stick. Firmware said to them: "The Tao lies beyond Yin and Yang. It is silent and still as a pool of water. It does not seek fame; therefore, nobody knows its presence. It does not seek fortune, for it is complete within itself. It exists beyond space and time."
Software and Hardware, ashamed, returned to their homes.
AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, which so many people were not in favor of just 5-7 years ago.
https://arstechnica.com/gadgets/2023/08/report-apple-is-savi...
Apple's chip engineering is top tier, but money also buys them a big head start.
Hardware is naturally limited in scope due to manufacturing costs, and doesn't "grow" in the same way. You replace features and components rather than constantly add to them.
Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.
macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?
(I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)
Furthermore, they do also engage in the traffic and sale of digital programmes wrought by the hands of other, independent artisans.
But this is the exception.
There aren't a lot of tangible gains left to be made by the software teams. The OS is fine, the office suite is fine, the entertainment apps are fine.
If "performance" is shoving AI crap into software that was already doing what I wanted it to do, I'd rather the devs take a vacation.
Who knows, maybe the era of "exciting computing" is over, and iteration will be a more pleasant and subtle gradient curve of improvements, over the earth-shattering announcements of yore (such as the advent of popular cellular phones).
Maybe Steve was right. We don't know what we want until someone shows it to us.
The UI itself is supposedly intensive to render to some degree. That's crazy because most of the time it looks like an Android skin from 2012.
And on top of all this -- absolutely nobody asked for it. No one asked for some silly new UI that is transparent or whatever.
Apple (post Apple II) has always been a systems company, which is much different. Dell is a hardware company.
Hopefully that will bring whatever they’re doing right to other teams.
Biggest gripes with MacOS software:
- Finder is very mediocre compared to even File Explorer in Windows
- Scrollbar and other UI issues
Unfortunately I don't think Asahi is going to catch up, and MacBooks are so expensive, so I'll probably keep buying second-hand Dell/Lenovo laptops and dumping Linux on top of them.
I still agree that second hand Thinkpads are ridiculously better in terms of price/quality ratio, and also more environmentally sustainable.
But I could be wrong. Maybe the earlier Macs didn't have great software either -- but at least the UI is better.
I do miss window shading from MacOS 8 or 9, though. I think a whimsical skin for MacOS would be nice, too. The system error bomb icon was classic, the sad-Mac boot-failure icon was at least consolation. Now everything is cold and professional, but at least it stays out of my way and looks decent.
Unfortunately, backwards-compatibility requirements prevented the addition of process memory isolation before OS X. One result of not having this protection was that an application with a memory bug could overwrite memory location zero (the beginning of a critical OS-only area), or any other memory area, and then all bets were off. Some third-party utilities, such as Optimem RAMCharger, gave partial protection from this by using the processor's protected mode, and also removed the occasional need for users to manually set the amount of memory allocated to a program. However, many programs were not compatible with these utilities.
It really is awful. Why the hell is there no key to delete a file? Where's the "cut" option for moving a file? Why is there no option for showing ALL folders (ie, /bin, /etc) without having to memorize some esoteric key combination?
For fuck's sake, even my home directory is hidden by default.
> - Scrollbar and other UI issues
Disappearing scrollbars make sense on mobile where screen real estate is at a premium and people don't typically interact with them. It does not make sense on any screen that you'd use a mouse to navigate.
For years, you couldn't even disable mouse acceleration without either an esoteric command line or using 3rd party software. Even now, you can't disable scroll wheel acceleration. I hate that I can't just make a consistent "one click = ~2 lines of text" behavior.
I could go on and on about the just outright dumb decisions regarding UX in MacOS. So many things just don't make sense, and I feel like they were done for the sole purpose of being different from everyone else, rather than because of a sense of being better.
Command + backspace.
Cmd+delete? I don't really want it to be a single key as it's too easy to accidentally trigger (say I try to delete some text in a filename but accidentally bump my mouse and lose focus on the name)
MacOS doesn't have enough 'openness' to it. There's no debug information, lack of tools, etc. To this day I could still daily drive an XP or 98/2000 machine (if they supported the modern web) because all the essentials are still intact. You can look around system files, customize them, edit them. I could modify game files to change their behaviour. I could modify the Windows registry in tons of ways to customize my experience and experiment with lots of things.
As a 'Pro' user my first expectation is options, options in everything I do, which MacOS lacks severely.
All the random hardware that we see launching from time to time have drivers for windows but not for Mac. Even linux has tons of terminal tools and customisation.
MacOS is like a glorified phone OS. It's weirdly locked down at certain places that drive you crazy. Tons of things do not have context menus(windows is filled with it).
Window management sucks, there's no device manager! Not even cli tools! (Or maybe I'm not aware?) Why can't I simply cut and paste?
There's no API/way to control system elements via scripting; Windows and Linux are filled to the brim with these! Even though the UI is good looking I just cannot switch to an Apple device (both Mac and iPhone) for these reasons. I bought an iPad Pro and I'm regretting it. There's no Termux equivalent on iPadOS/iOS; there are some terminal tools but they can't use the full processing power, they can't multi-thread, and they can't run in the background. It's just ridiculous. The iPad Pro is just a glorified iPhone. Hardware doesn't make a device 'Pro', software does. Video editing isn't a 'Pro' workflow in the sense that it can be done on any machine that has sufficient oomph. An iPad Pro from 5 years ago will be slower than an iPad Air of today; does that make the Air a 'Pro' device? No!
It's a bad idea to add an option entirely for the purpose of making the product not work anymore.
> Window management sucks
I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.
> there's no device manager! Not even cli tools!
`ioreg -l` or `system_profiler`. Why does this matter?
> There's no API/way to control system elements via scripting
https://developer.apple.com/library/archive/documentation/Ac...
https://developer.apple.com/documentation/XCUIAutomation
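And both of those tools will give you structured output if you want it; a small illustrative sketch, assuming the -json flag available on recent macOS:

    # Illustrative only: pull hardware and USB device info the way a "device
    # manager" would, via system_profiler's JSON output, and pretty-print it.
    import json
    import subprocess

    out = subprocess.run(
        ["system_profiler", "-json", "SPHardwareDataType", "SPUSBDataType"],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(out.stdout)
    print(json.dumps(data.get("SPHardwareDataType", []), indent=2))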
> I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.
For me, not so much the window management, but task management. I very strongly believe that the task bar (I guess the Dock bar in MacOS) should have a separate item for each open window of an app. If I have 3 Firefox windows open, that should be 3 entries in the task/dock bar so I can switch between them in a single click. I can do this in Windows, can't do it in MacOS.
One of the problems I have with MacOS is that it's not obvious how to start a second instance of an app. Sure, some apps will have a "New Window" option. But what about apps that don't, like Burp Suite? If I bring up the launcher, then click Burp Suite when one is already loaded, it just shows me the existing one.
A weakness of this is you can duplicate apps and launch the duplicate, even though they have the same bundle ID, so they might still fight over things.
Don't both of those exist now?
The reason the Mac is more "app-centric" is Conway's law; developers own apps so it's thought if you tried breaking apart an app it would fail, since previous "document-centric" efforts like OpenDoc failed.
PowerShell is now a lot more powerful than anything Apple can dream of offering. MacOS is an opinionated OS for people who want to do simple tasks. Apart from good looks, MacOS offers nothing else.
Sums up how I feel about MacOS perfectly.
Which is why I'm so utterly baffled that it's become so popular among tech workers.
The tiling window manager thing is epidemic on Hacker News, and I think the explanation is twofold: Hacker News obviously leans towards programmers, programmers in general don't like the mouse, and tiling window managers, as a general rule, are about avoiding the need to manage windows with the mouse.
The problem with that viewpoint, to me, is that programming is literally the only complex modern computing task I can think of that isn't mouse-centric. E.g., if you're doing CAD, spreadsheet work, media editing, 3D, or audio editing, all of those tasks are mouse-centric and the tiling thing just feels silly to me in that context (like I'm going to put Cinema 4D in a tile?). So it solves a problem I don't have (managing, what, my IDE and terminal windows? this isn't even something I think about) and seems like it would make things I find hard today even harder (arranging the Cinema 4D Redshift material graph, render preview, object manager, and geometry view where I can see the important parts of each all at the same time, which I do by arranging overlapping windows carefully).
Command+Backspace.
You don't cut, you move: copy the file (Cmd+C), then paste with Option held down (Option+Cmd+V) to move it instead of copying.
Apple's Hardware Chief, John Ternus, seems to be next in line for succession to Tim Cook's position.
I remember using iTunes when fixing the name of an album was a modal blocking function that had to write to each and every MP3, one by one, in the slowest write I have ever experienced in updating file metadata. Give me a magnetised needle and a steady hand and I could have done it faster.
A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.
Curious if I'm missing something though, is there another entity with a stronger suite than that? Or some other angle to look at this? (E.g., it seems silly to me to use an MP3 metadata example when you're talking about the same company that makes Logic Pro.)
It's not exactly clear to me what niche Apple occupies in this market. It doesn't feel like "native Mac UI" is a must-have feature for DAWs or IDEs alike, but maybe that's just my perspective.
> It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live.
This is an obviously silly statement, not only is Logic Pro competitively priced ($200, relative to $100-$400 for Bitwig, $99-$750 for Live), but those applications obviously have different focuses than Logic Pro (sound design and electronic music, versus the more general-purpose and recording focus of Logic Pro, also you'd be hard pressed to find anyone who doesn't think Logic Pro comes with the best suite of stock plugins of any DAW, so the value prop angle is a particularly odd argument to make [i.e., Logic Pro is pretty obviously under priced]).
But all this isn't that important because many of these applications are great. DAWs are one of the most competitive software categories around and there are several applications folks will vehemently defend as the best and Logic Pro is unequivocally one of them.
> Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition.
This is old, but curious if you have a better source for your statement https://blog.robenkleene.com/2019/06/10/2015-digital-audio-w...
Found a more recent survey https://www.production-expert.com/production-expert-1/2024-d...
> We can see that Pro Tools for music is the most popular choice, with Logic for music second and Pro Tools for post coming third.
Note that I'd say Logic Pro's popularity is actually particularly notable since it's not cross-platform, so the addressable market is far smaller than the other big players'. It's phenomenally popular software, both in terms of raw popularity and fans who rave about it. E.g., note the contrast in how people talk about Pro Tools vs. Logic Pro. Logic Pro has some of the happiest users around, but Pro Tools customers talk like they feel they're hostages to the software. That difference is where the quality argument comes in.
I'm sorry to say it, but I genuinely think you're detached from the way professionals evaluate software. While I enjoyed my time on macOS when Apple treated it like a professional platform, I have no regrets leaving it behind, or its "quality" software. Apple Mail fucking sucks, iCloud is annoying as sin, the Settings app only got worse year-over-year, and the default Music app is somehow slower than iTunes from 2011. Ads pop up everywhere, codecs and filesystems go unsupported due to greed, and hardware you own gets randomly deprecated because you didn't buy a replacement fast enough.
If that's your life, go crazy. People like you helped me realize that Macs aren't made for people like me.
I definitely didn't say this. Pro Tools likely has higher marketshare than Logic Pro, but I don't think anyone would conflate that with quality. I only brought up marketshare because you framed Logic Pro as being unpopular, which is just objectively not true.
> I'm sorry to say it, but I genuinely think you're detached from the way professionals evaluate software.
I literally think I've spent more time trying to understand this than practically anyone else e.g., https://blog.robenkleene.com/2023/06/19/software-transitions... but also my blog archives https://blog.robenkleene.com/archive/, it's one of the main subjects I think about and write about.
Note that how professionals evaluate software is tangential to what "quality" means in the context of software. E.g., I don't think anyone would argue Adobe is the paragon of software quality, but they're arguably the most important GUI software there is for creative professionals.
Both topics are very interesting to me, what software professionals use and why, and what constitutes quality in software.
> In the other response, you're telling off a perfectly valid criticism of Apple software because they won't fulfill your arbitrary demand for a better-looking DAW. Are you even engaging with the point they're trying to make?
I'm not sure what this means, who's talking about a "better-looking DAW" and which point am I not engaging with?
I also think you're confusing what I wrote. It's not a competition.
I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).
[1] and now from a sibling comment I hear that perhaps people regard that tool as bad, so there you go, the jury is clearly out
E.g., I'd rank something like VS Code "lower quality" because when I launch VS Code, I can see each layer of the UI pop in as it's created, e.g., first I see a blank window, then I see window chrome being loaded, then I see a row of icons being loaded on the left. This gives an impression of the software not being solid, because it feels like the application is struggling just to display the UI.
> I also think you're confusing what I wrote. It's not a competition.
> I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).
I disagree with this; the only way to make an argument that Apple has deficiencies in their software is to demonstrate that other software is higher quality than Apple's. Otherwise it could just be that Apple's quality level is the maximum feasible level of quality.
> unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote).
This is laughable, Notes is unremarkable? Give me a break, and Keynote is awkward? Have you ever Google'd how people feel about these applications?
I'd argue a critic only has value if they're willing to offer their own taste for judgement.
Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clock-work MBA driven trickle of slightly better over-hyped future eWaste.
To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "yeah but the RAM connections are fiddly" Great, now that sounds like a challenge worth solving.
But you are right about the software. Installing Asahi makes me feel like I own my computer again.
"Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."
Why the "®" after Linux? I think this is the first time I've seen this.
Aside from that, I think of Apple as a hardware company that must write software to sell their devices, maybe this isn't true anymore but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task either.
The PowerBook from the mid 1990’s were hugely successful, especially the first ones, which were notable for what we now take for granted: pushing the keyboard back allowing space for palm rests. Wikipedia says at one time Apple had captured 40% of the laptop market. All the while the ’90s roared on, Apple was languishing, looking for a modern OS.
But just once, I'd love to hear someone reply to this and say they really love something like OneNote, and list out why they think it's a "higher quality" piece of software than Apple Notes. Personally, while I observe a lot of bugs in Apple's software, really that's true of all the (GUI in particular) software I use. If I go across all the software I use, Apple's offerings are almost universally on the top end by the metrics I'd measure for quality compared to similar offerings (e.g., something like OneNote is directly comparable to Apple Notes, whereas a custom built notes app that doesn't sync across devices most certainly is not). Apple's apps are usually well-designed, performant, bug free (relatively speaking, there are always bugs in software, but if I put, say, OmniFocus and Reminders next to each other [two apps that have the same purpose that I use every day, Reminders overall has less bugs than OmniFocus]), and they're mostly consistent with each other.
Putting all that together, the breadth of Apple's software offerings and their consistent high quality relative to similar offerings from other companies makes Apple seem to me like the best company in the world today at making GUI software! Which doesn't mean they're perfect, and doesn't mean they can't do better, but it is still super impressive.
Software (iOS26), services (Music/Tv/Cloud/Apple Intelligence) and marketing (just keep screaming Apple Intelligence for 3 months and then scream Liquid Glass) ---- on the other hand seem like they are losing steam or very reactive.
No wonder John Ternus is widely anticipated to replace Tim Cook (and not Craig).
So give the software some slack.
Edit: gigabits indeed. Confusing, my old M2 Max has 400 GB/s (3200 gigabits per second) bandwidth. I guess it's some sort of baseline figure for the lowest end configuration?
Edit 2: 1,224 Gbps equals 153 GB/s. Perhaps M5 Max will have 153 GB/s * 4 = 612 GB/s memory bandwidth. Ultra double that. If anyone knows better, please share.
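For what it's worth, the conversion checks out; the 4x Max scaling is pure speculation as noted:

    # 1,224 gigabits per second -> gigabytes per second, plus the speculative
    # "Max = 4x base" scaling from the comment above.
    gbps = 1224
    gb_per_s = gbps / 8            # 153.0 GB/s for the base M5
    print(gb_per_s, gb_per_s * 4)  # 153.0 612.0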
Edit: Apparently 100GB/s, so a 1.5x improvement over the M3 and a 1.25x improvement over the M4. That seems impressive if it scales to Pro, Max and Ultra.
The advantage of the unified architecture is that you can use all of the memory on the GPU. The unified memory architecture wins where your dataset exceeds the size of what you can fit in a GPU, but a high end gaming GPU is far faster if the data fits in VRAM.
That’s true for the on-GPU memory but I think there is some subtlety here. MoE models have slimmed the difference considerably in my opinion, because not all experts might fit into the GPU memory, but with a fast enough bus you can stream them into place when necessary.
But the key difference is the type of memory. While NVIDIA's data-center GPUs have shipped with HBM for a while now (gaming cards use GDDR), the DGX Spark and the M4 use LPDDR5X, which is the main source of their memory bottleneck. And unified memory chips with HBM are definitely possible (GH200, GB200); they are just less power efficient at low/idle load.
NVIDIA Grace sidestep: They actually use both HBM3e (GPU) and LPDDR5X (CPU) for that reason (load characteristics).
The moat of the memory makers is just so underrated…
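To put rough numbers on why the memory system, not compute, usually sets the ceiling for local LLM decoding, here's a back-of-envelope sketch; the bandwidth figures, parameter counts, and quantization width are illustrative assumptions, not measurements:

    # Memory-bound decode: each generated token streams (roughly) the active
    # weights once, so tokens/s ~= bandwidth / bytes touched per token.
    def tokens_per_second(bandwidth_gb_s, active_params_billion, bytes_per_param):
        bytes_per_token = active_params_billion * 1e9 * bytes_per_param
        return bandwidth_gb_s * 1e9 / bytes_per_token

    # 70B dense model at ~4-bit quantization (~0.5 bytes/param):
    print(tokens_per_second(400, 70, 0.5))   # ~11 tok/s at 400 GB/s (M2 Max-class)
    print(tokens_per_second(1000, 70, 0.5))  # ~29 tok/s at ~1 TB/s (HBM/GDDR-class)

    # An MoE with ~13B active params narrows the gap, per the comment above:
    print(tokens_per_second(400, 13, 0.5))   # ~62 tok/s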
Guessing that's their base tier and it'll increase on the higher spec/more mem models.
I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.
This has been by far the best setup until Apple can take gaming seriously, which may never happen.
No one who was forced to write a statement like this one (https://help.steampowered.com/en/faqs/view/5E0D-522A-4E62-B6...) is going to be enthusiastic about continuing to work with Apple.
1. When is the next transition on bits? Is Apple going to suddenly move to 128-bit? No.
2. When is the next transition on architecture? Is Apple going to suddenly move back to x86? No.
3. When is the next API transition? Is Apple suddenly going to add Vulkan or reinvigorate OpenGL? No. They've been clear it's Metal since 2014, 11 years ago. That's plenty of time for the industry to follow if they cared, and mobile gaming has adopted it without issue.
We might as well complain that the PlayStation 4 was completely incompatible with the PlayStation 3.
> What would it even be? 128 bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?
Sure, I can think of lots of things. Every macOS update when I worked in this space broke something that we had to go fix. Code signature requirements change a bit in almost every release, not hard to imagine a 10-year-old game finally running afoul of some new requirement. I can easily see them removing old, unmaintained APIs. OpenGL is actively unmaintained and I would guess a massive attack vector, not hard to see that going away. Have you ever seen their controller force feedback APIs? Lol, they're so bad, it's a miracle they haven't removed those already.
You see, the existence of that "almost" already means less confidence than developers get on every game console, as well as on Linux and Windows.
The attitude in the apple developer ecosystem is that apple tells you to jump, and you ask how high.
You could complain that Playstation 4 software is incompatible with Playstation 3. This is the PC gaming industry, there are higher standards for the compatibility of software that only a couple companies can ignore.
"This is the PC gaming industry"
Who said Apple needed to present themselves as a PC gaming alternative over a console alternative?
Macs are personal computers, whether or not they come from some official IBM Personal Computer compatibility bloodline.
Sega Saturn - 9 million
Wii U - 13 million
PlayStation 5 - 80 million
Nintendo Switch - 150 million
Nintendo Switch 2 opening weekend - 4 million in 3 days
Sure.
https://en.wikipedia.org/wiki/List_of_best-selling_mobile_ph...
https://store.steampowered.com/stats/stats/
If you consider time zones (not every PC gamer is online at the same time), the fact that it's not the weekend, and other factors, I'd estimate the PC gaming audience is at least 100M.
Unfortunately, there's no possible way to get an exact number. There are multiple gaming PC manufacturers, not to mention how many gaming PCs are going to be built by hand. I'm part of a PC gaming community, and nearly 90% of us have a PC built by either themselves or a friend/family. https://pdxlan.net/lan-stats/
I mean, it's at least partially true. I used to play BioShock Infinite on my MacBook in high school, there was a full port. Unfortunately it's 32 bit and doesn't run anymore and there hasn't been a remaster yet.
Anyway, the whole situation was quite bad. Many games were still 32-bit, even if macOS itself had been mainly 64-bit for almost 10 years or more. And Valve didn't help either, the Steam store is full of 64-bit mislabeled as 32-bit. They could have written a simple script to check whether a game is actually 64-bit or not, instead they decided to do nothing and keep their chaos.
The best solution would have been a lightweight VM to run old 32-bit games; nowadays computers are powerful enough to do so.
You don't buy Apple to use your computer they way you want to use it. You buy it to use it the way they tell you to. E.g. "you're holding it wrong" fiasco.
In some ways this is good for general consumers (and even developers; with limited config comes less unpredictability)... However, this is generally bad for power users or "niche" users like Mac gamers.
Not to mention many subscription services on iOS that don’t allow you to subscribe through the App Store.
That is true, but now they are in a position where their hardware is actually more affordable and powerful than its Windows/x86 counterparts - and Win 11 is a shitload of adware and an annoyance in itself, layered on top of an OS. They could massively expand their hardware sales into the gaming sector.
I'm eyeing a Framework Desktop with an AMD AI 395 APU for gaming (I am happy with just 1080p@60) and am looking at 2000€ to spend, because I want a small form factor. Don't quote me on the benchmarks, but a Mac Mini with an M4 Pro is probably cheaper and more powerful for gaming - IF it had proper software support.
Why would I do anything bespoke at all for such a tiny market? Much less an entirely unique GPU API?
Apple refusing to support OpenGL and Vulkan absolutely hurt their gaming market. It increased the porting costs for a market that was already tiny.
Because there is a huge potential here to increase market share.
Note that games with anticheat don't work on Linux with Proton either. Everything else does, though.
Of course some anticheats aren't supported at all, like EA Javelin.
https://forums.ea.com/blog/apex-legends-game-info-hub-en/dev...
I just redid my windows machine to get at TPM2.0 and secure boot for Battlefield 6. I did use massgrave this time because I've definitely paid enough Microsoft taxes over the last decade. I thought I would hate this new stuff but it runs much better than the old CSM bios mode.
Anything not protected by kernel level anti cheats I play on my steam deck now. Proton is incredible. I am shocked that games like Elden Ring run this well on a linux handheld.
In my case, for software development, I'd be happy with an entry-level MacBook Air (now with a minimum of 16GB) for $999.
1. Implementing PR_SET_SYSCALL_USER_DISPATCH
2. Implementing ntsync
3. Implementing OpenGL 4.6 support (currently only OpenGL 4.1 is supported)
4. Implementing Vulkan 1.4 with various extensions used by DXVK and vkd3d-proton.
That said, there are alternatives to those things. 1. Not implementing this would just break games like Jurassic World where DRM hard codes Windows syscalls. I do not believe that there are many of these, although I could be wrong.
2. There is https://github.com/marzent/wine-msync, although implementing ntsync in the XNU kernel would be better.
3. The latest OpenGL isn't that important these days now that Vulkan has been widely adopted, although having the latest version would be nice to have for parity. Not many things would suffer if it were omitted.
4. They could add the things needed for MoltenVK to support Vulkan 1.4 with those extensions on top of Metal:
https://github.com/KhronosGroup/MoltenVK/issues/203
It is a shame that they do not work with Valve on these things. If they did, Proton likely would be supported for MacOS from within Steam and the GPTK would benefit.
Since I am playing mostly MSFS 2024 these days I currently use GeForce Now, which is fine, but cloud gaming still isn't quite there yet…
Death Stranding is a great looking game to be sure, but it's also kinda hard to get excited about a 5 year old game achieving rtx 2060 performance on a $2000+ system. And that was apparently worthy of a keynote feature...
Codeweavers?
I've been trying to get Unreal Engine to work on my Macbook but Unity is an order of magnitude easier to run. So I'm also stuck doing game development on my PC. The Metal APIs exist and apparently they're quite good... it's a shame that more engines don't support it.
edit: for now I'll get that win 10 ESU
Inference speed and fast feedback matter a lot more than perfect generation to me.
I personally wish they would learn from the failure of Metal.
Also unleashes? Really? The marketing madness has to stop at some point.
I think Metal's ergonomics advantage is a much slimmer lead when you consider the other high-level APIs it competes with.
An API that has a dependency on the Objective-C runtime doesn't sound very good.
M4: May 2024
M4 pro/max: Oct 2024
https://www.apple.com/newsroom/2024/05/apple-introduces-m4-c...
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
My hope is that they are taking longer because of a memory system upgrade that will make running significantly more powerful LLMs locally more feasible.
All in all, Apple is doing some incredible things with hardware.
Software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most things most people do on their computers. Tahoe however makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional from Apple to make me upgrade. That would be a big letdown.
An infinitesimally small percentage of people can take advantage of 320 MHz. It's fine.
https://support.apple.com/en-gb/guide/deployment/dep268652e6...
I was at a Wi-Fi vendor presentation a while back and they said that 160 MHz is pretty improbable unless you're living alone with no wireless networks around you. And 320 MHz even less so.
In real life probably the best you can get is 80 MHz in a really good wireless environment.
WiFi doesn't currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.
For 6 GHz? Yeah, not uncommon.
2.4 GHz is pretty much only used by IoT; you generally don't care about channel width there. When your client device (laptop, phone) downgrades to 2.4 GHz it might as well disconnect because it's unusable.
5 GHz gets stopped by drywall, so unless your walls are just right to bounce the signal off, you need an AP in every room. Ceiling mounting is pretty much required and you're pretty much free to use channels as wide as your devices support and local laws allow.
6 GHz gets stopped by a piece of paper, so the same as 5 GHz except you won't get 6 GHz at all unless you have direct line of sight to the AP.
https://support.apple.com/guide/deployment/wi-fi-ethernet-sp...
No Apple devices support 320 MHz channel widths, and 160 MHz is only supported on the 6 GHz band on MacBooks and iPads. Some iPhones support 160 MHz on 5 GHz as well.
Reducing Broadcom's influence over the WiFi ecosystem alone would be a large benefit.
I have a work provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs at the house. It is the only one that I can see the mouse teleporting sometimes when I move it fast. This is after disabling transparency in Accessibility settings mind you, it was even worse before.
I just got one example while passing the mouse quickly through my dock (I still use the magnify animation) and I can clearly see it dropping a few frames. This never happened in macOS 15.
Electron used to override a private function, which makes macOS sluggish on Tahoe, and apparently no one at Apple uses Electron apps while doing testing.
What I can say is that while the situation is much better than at Day 1, the whole Tahoe experience is not as fluid as Sequoia.
Also, it doesn't really matter to me if this was a private function or not, if this was Windows or Gnome/KDE people would blame the developers of the desktop instead.
Even if you disqualify the devs from being mad, everyone else gets to be mad.
In an ideal situation, they would have noticed the widespread use of this private function a long time ago, put a note on the bug report that it works around, and after they fixed the bug they would have reached out to electron to have them remove that access.
If you owe the bank $100 and don't pay, that's your problem: you'll get in trouble for it, and the bank isn't going to be unduly harmed.
If you owe the bank $100 million and don't pay, that's the bank's problem: the loss of that $100 million is going to hit the bank hard, whether or not they're the ones who are in the right and regardless of how much trouble you get in over it.
Likewise, if you're a small time app developer and you use a private method that gets yanked and your app breaks, that's your problem: your users are going to be pissed at you, you'll take the reputational damage, and even if your users are also pissed at the OS vendor they represent such a small group of individuals that the OS vendor isn't going to be unduly harmed by that.
If, on the other hand, you develop one of the most widely used frameworks and you use a private method that gets yanked and your app breaks, that's the OS vendor's problem: the number of people who are pissed off at them (rightly or wrongly) is now much larger and they're going to take some reputational damage over it, whether or not they're the ones who have the moral high ground and regardless of how much reputational damage you also take.
And that's exactly what we're seeing here: it doesn't matter that Electron used an API they weren't supposed to, people are pissed at Apple about this and Apple, rightly or wrongly, has to contend with that reputational damage if they don't take steps to prevent this sort of thing before it happens (like letting the developers know that private-on-paper API is going to be yanked in advance, or making it mechanically impossible for anyone outside of Apple's own code to invoke that API long before someone depends on it).
> has to contend with that reputational damage if they don't take steps to prevent this sort of thing before it happens (like letting the developers know that private-on-paper API is going to be yanked in advance, or making it mechanically impossible for anyone outside of Apple's own code to invoke that API long before someone depends on it).
Again, that is what dev builds are for. Developers had months to verify their software still works on an OS that has a confirmed release date and a very high proportion of users who install the latest and greatest.
That's why this quote is relevant to this situation: it's totally Electron's fault for not adequately testing their framework against Apple's latest developer builds, but Apple could have absolutely done more to minimize the chance that Electron would make a mistake like this and cause lots of folks to be mad at Apple over it.
Should Apple be required to? No. Will they still suffer reputational damage if they don't and something like this happens? Yes.
Welp
https://leb.fbi.gov/articles/featured-articles/fifty-years-o...
Apple's private API situation was also much more nuanced; back in the day, if Adobe was using an API, private or not, it probably wouldn't be degraded in any way until the core applications moved forward. Current Apple might not give a damn though.
What portion of, say, Slack devs actually run a MacOS beta at work? Are they regular devs, or are they in QA/test? It seems to me like the latter is the far more appropriate team for this.
This is 100% on electron, they didn’t do their due diligence that every Mac & iOS dev goes through every summer before the next release. It’s been two decades of the same song and dance every year. There’s just no excuse.
My simple guess is that it slipped QA or wasn't escalated from Apple's feedback system.
Considering the number of Electron apps, expecting all developers and all users to update all their apps (and guess which ones are Electron-based) isn't good user experience.
Let's say the change is needed in the OS; you'd still expect a transition period. Also, good OS UX would be to notify the user that an app is using some API in a way that could degrade the experience. Putting the guessing and the expectations only on the developer and user parties, with nothing from the OS side, makes less sense imho.
This happens in pretty much every Electron app as far as I know, and lots of Electron apps like Spotify, VSCode or Slack are very likely to be in the Top 10, or at least Top 100, most-used apps. And yes, I would expect Apple to test at least the most popular apps before releasing a new version of their OS.
> Maybe the thought is that an email from Apple to the dev saying fix your code would more compelling???
Of course not. Apple controls the SDK; they could work around this in many different ways. For example, instead of changing how this function was implemented they could introduce a new method (they're both private so it doesn't matter) and effectively ignore the old one (maybe also adding a message for developers building their application that the method was removed). It would draw ugly borders in the affected apps, but it wouldn't cause this issue at least.
Why do we think this would solve anything, given the devs clearly ignored the previous message about not using a private method?
If anything the fact that devs can actually access private symbols is an issue with how Apple designed their APIs, because they could make this so annoying to do that nobody would try (for example, stripping symbols).
Also, the fact that devs need to access private symbols to do what they need shows that the public API is lacking at least some features.
Another thing: if this only affected the app itself that would be fine, but it slows the whole system to a crawl.
So while devs share some of the blame here (and I am not saying they don't), I still think this whole situation is mostly Apple's fault.
I think the failures here are that Apple should have tested this themselves and the Electron devs should have tested and resolved this during the beta period.
I don't think it's that clear cut. It looks like it was a workaround for a MacOS rendering bug going back to at least 2017, landed in 2019 and had no apparent downsides for six years[1].
The PR removing the private API code also included someone verifying that Apple had fixed the original bug some time in the intervening years[2].
I probably wouldn't have taken this approach personally (at the very least, file the original rendering issue with Apple and note it in the code, though everyone knows the likelihood of getting even a response on an issue like that), but it wasn't some cargo-culted fix.
[1] https://github.com/electron/electron/pull/20360
[2] https://github.com/electron/electron/pull/48376#issuecomment...
Imagine now that you're a non-tech-savvy user who probably doesn't update apps as often; you're probably wondering why "my laptop is so slow after updating". But like I said in another thread, maybe this is on purpose to make people upgrade.
If they stripped symbols, they’d get flak for not having good stack traces. I think it boils down to “if you’re huge, you’re never doing it right”.
Many of Apple's OS frameworks are managed code (ObjC/Swift); and in ObjC, calling across a library boundary is always done with a message-send — i.e. a dynamic-dispatch, necessarily symbol-table-based call into an entrypoint of the target library. So anything Apple did to "strip symbols" would make it impossible for them to have one of their libraries call into another of their libraries.
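The flip side of that same mechanism is why "private" never means unreachable: dispatch is by selector name at runtime, so any code in the process can send the message if it knows the name. A rough Swift sketch, illustration only; _cornerMask is the undocumented NSWindow selector discussed in this thread, and whether it still exists on a given macOS release is entirely up to Apple:

    import AppKit

    // Illustration: Objective-C dispatch is by selector name at runtime, so a
    // "private" method is callable by anything that knows (or guesses) the name.
    // Run on the main thread of an AppKit process.
    let window = NSWindow()
    let privateSelector = NSSelectorFromString("_cornerMask")
    if window.responds(to: privateSelector) {
        let result = window.perform(privateSelector)?.takeUnretainedValue()
        print("Private selector answered: \(String(describing: result))")
    } else {
        print("Selector not implemented on this macOS version.")
    }

Which is also part of why "just strip the symbols" wouldn't really close the door: the selectors have to stay discoverable for dispatch to work at all.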
https://www.pcworld.com/article/2816273/how-microsofts-windo...
Sure, people in Hacker News now know that the issue is "that Electron bug", but I am sure lots of other people that are less tech savvy just kept questioning what the hell is happening and maybe even considered an upgrade. But maybe that is the whole point.
1. Apple should test every (common?) app, and any change to the OS that makes an app worse shouldn't be done, regardless of why they wanted to make that change.
2. Even though Apple tells people not to use private APIs, if a program uses a private API anyway, Apple should build a workaround into their OS instead of letting apps suffer their own repercussions.
3. Apple should test everything ahead of time and then go around telling all the app developers that there's a problem, as if those app developers are going to do anything about it.
No matter what Apple did here, their actual choices boiled down to:
1. Add workarounds for misbehaving broken apps, giving those apps no incentive to fix their issues, and forcing Apple to support those workarounds indefinitely; this also undermines their "don't use private APIs, they could break later" position. This is the kind of thing that made Windows into an unmaintainable sack of cruft.
2. Do what they did, which is change the API and let broken apps be broken to the user's detriment. Everyone blames Apple even though it's objectively not their fault.
3. Add some kind of non-workaround that caused problems for the app and not the user; e.g. have this private API rate limited or something so that the app ends up blocking in the call. This could cause problems for legitimate consumers of the API, and people would still blame Apple, but in this case it would be more their fault than option 2.
In the end, Apple can't spend their time fretting over what bad developers do wrong; they spend their time on their OS and software and if a developer writes bad software and causes problems then so be it.
Then the bugs could be reported to the various app developers, and they would have been able to get some notice. Many would have acted on it. Many of the top apps have dedicated Apple contacts already. Seems like a completely reasonable expectation?
But very few users seem to care about performance or polish, so why not save a few bucks and build your desktop software with some cheap JavaScript devs?
If electron didn't exist, it would be QT, or we'd only see native apps on Windows like the old days, and nothing at all on macOS and Linux (or just web apps).
It's not a tech issue but a cultural/management problem.
Personally I try to avoid Electron apps as much as possible, but it's pretty much unavoidable now. Docker Desktop, Bitwarden, 1password, slack, VSCode, dropbox, GitHub Desktop, Obsidian, Notion, Signal, Discord, etc. All the major apps are electron. Even in the Windows world Microsoft stopped making native and makes heavy use of their own version of Electron (EdgeWebView2) for their own apps. The freaking start menu is react native ffs.
The industry has lost its collective mind in favor of being able to hire cheap javascript talent
Swift itself is great and stable enough now. I really like the language. SwiftUI, though, still needs work and is still missing functionality that you have to fall back on AppKit for, so there are tons of bridges to embed AppKit views in your SwiftUI hierarchy (NSTextView still relies on AppKit, as does some drag-and-drop functionality), and at a certain point you might as well just continue using AppKit.
To these companies, a "native app" is just "a web app with its own start-menu icon, no browser chrome, and access to OS APIs."
(In other words: if PWAs had access to all the same OS APIs that Electron does, then these companies wouldn't ship native apps at all!)
Given how high profile the impacted apps are, yes, it's their responsibility to test it. Even Microsoft does better there (or at least used to). Contacting Electron and finding a solution would have been an easy step to take.
https://github.com/tkafka/detect-electron-apps-on-mac
About half of the apps I use regularly have been fixed. Some might never be fixed, though...
From: https://www.reddit.com/r/MacOS/comments/1nvoirl/i_made_a_scr...
Hence I am not surprised that they ignore their existence.
Or the other way around: nobody who develops Electron apps cared to test their app on macOS in the beta releases (the developer betas had been out for a long time afaik).
Except if it was like that JIT JVM bug that caused apps to crash and was not in the beta release.
You have it the other way around. It should be, apparently no one making Electron bothered to test on the numerous developer and public betas to make sure their hacky override of undocumented APIs (which Apple explicitly says not to use) didn't break.
They ship-of-Theseus the crap out of their OS, but replace parts with ones that need these new hardware features and run slow on older chips due to software-only implementations.
I got the first generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV, it cannot even display the virtual keyboard without stuttering like crazy, it lags switching apps, there's a delay for everything, this thing was smooth as butter on release.
I was considering just replacing the battery and keeping it for several more years but now I feel forced to upgrade which has me considering whether I still want/need an iPad since I'd also have to buy a new magic keyboard since they redesigned it, and they bumped the price ($1299 now vs. $999 when I got the 4th gen) so I'd be looking at $1700. Trying to hold out for an iPad Air with ProMotion.
I may be in the minority here, but I think 5 years is too short a lifespan for these devices at this point. Early days when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding).
What in the actual world of software engineering?
The iPad Air 13 with a M3 is a really nice experience. Very fast device.
> You cannot restore to any iOS versions other than signed ones. All SHSH blobs are currently useless.
So, anything newer than iPhone X can’t be downgraded
If this is true, then apples engineers or managers made a huge mistake in their implementation.
You’re inexperienced in the world not your knowledge.
I also have an M2 Pro with 32GB of memory. When I A/B test with Electron apps running vs without, the lag disappears when all the unpatched Electron apps are closed out.
1. https://avarayr.github.io/shamelectron/
Here's a script I got from somewhere that shows unpatched Electron apps on your system:
Edit: HN nerfed the script. Found a direct link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
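For the curious, the basic idea is just to walk /Applications and flag anything bundling an Electron framework. A rough Swift sketch of that idea (not the linked gist, which per its description goes further and checks whether the bundled Electron build is patched - the part you actually care about):

    import Foundation

    // List apps in /Applications that bundle an Electron framework, so you know
    // which ones to check for updates. The framework path is the usual layout
    // for Electron apps; adjust if your apps live elsewhere.
    let fm = FileManager.default
    let apps = (try? fm.contentsOfDirectory(atPath: "/Applications")) ?? []

    for app in apps.sorted() where app.hasSuffix(".app") {
        let frameworkPath =
            "/Applications/\(app)/Contents/Frameworks/Electron Framework.framework"
        if fm.fileExists(atPath: frameworkPath) {
            print("\(app) bundles Electron")
        }
    }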
My macOS experience has been that first-party software is getting worse.
Look, if userspace apps can break system functionality to the level that the simple mouse cursor is not responsive, it suggests that there is something fundamentally broken in the OS.
Yes, everyone should blame and shame Electron, but here the bug is firmly in the OS.
Electron is most likely using a whole ton more. Apple is sending a message. "Fix your crap or expect more."
Don't really care who is to blame, but they should have identified this and either warned developers or warned users. Or provided a tool for identifying guilty apps on your machine, and let users decide how to proceed.
Like I said, *this* is their warning.
- Open a GitHub issue explaining those private APIs shouldn't be used.
- Even better, open a PR fixing their use.
- Make those API calls a no-op if they come from an Electron app.
- Fix those API calls not to grind the OS to a halt for a seemingly simple visual effect.
- Create a public API allowing the same visual effect on a tested and documented API.
Choosing to (apparently violently) downgrade the user experience of all Electron app users, without the possibility of an update by launch day, if it was a deliberate decision and not an overlooked bug, is a rather shitty and user-hostile move, don't you think?
The _cornerMask override was a hack that shouldn't ever have existed in the first place, and it's not the only use of private APIs in the electron code base.
Apple is very clear about how they want you to make software for their OSes. It's 100% on electron that they choose to do it this way regardless.
I'd go as far as to say Electron itself is a hack that shouldn't exist, but sadly everyone has decided it's the only way they are going to make desktop software now.
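For anyone who hasn't looked at it, the pattern in question is roughly this - a Swift sketch of the shape of the hack, not Electron's actual Objective-C++ code; the NSImage? signature and the effect of returning nil are assumptions based on this thread:

    import AppKit

    // Shadow AppKit's private _cornerMask selector so the runtime calls this
    // implementation instead. Nothing here is covered by any API contract,
    // which is exactly why it can fall over in a new macOS release.
    final class SquareCornerWindow: NSWindow {
        @objc private func _cornerMask() -> NSImage? {
            return nil // assumed effect: square corners on a frameless window
        }
    }

It works right up until it very much doesn't, which is the whole argument against shipping it in a framework half the desktop runs on.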
(And shit like this is exactly why runtimes like the JVM or the .NET CLR are designed to install separately from any particular software that uses them. Each of their minor [client-facing-ABI compatible] versions can then be independently updated to their latest OS-facing-bugfix version without waiting for the software itself to ship that update.)
Long-term, this is a maintenance nightmare. These hacks can stick around for decades, because there's no backpressure on downstream to actually fix things. It's not about "team velocity", it's about keeping yourself sane.
> - Open a GitHub issue explaining those private APIs shouldn't be used.
> - Even better, open a PR fixing their use.
Apple has a history/culture of secrecy. Whenever they provide public source code, it's a dump thrown over the fence. There is most likely some team inside that actually cares, but they can't "just" open an issue. My guess is that this is their message.
> [...] is a rather shitty and user-hostile move, don't you think?
Yes, I agree, the general direction they've been taking has been increasingly user-hostile for a very long time; let alone the developer story.
But sometimes there's also a perfectly reasonable excuse, from both "technical" and "organizational" POV. That's just my take, a skunkworks effort to get Electron to fix their crap. I would do the same.
The most effective way would be for Apple to actually seek feedback on requirements and then actually implement public APIs for functionality that people need.
If it is accessible from userspace it is by no means private.
Does it mean the API is private in the sense of an "unstable" interface? It could very well break a userspace app relying on undocumented behavior; however, crucially here, anything that is exposed to userland WILL at some point be used by some application, be it legitimate or malicious, and it should not break the OS in any way. That's basic hygiene, not even security.
inb4: yes, a userspace app could trigger e.g. millions of IO operations and millions of number-crunching threads and thus cripple the rest of userspace (or at least the rest of userspace at a given priority level), yet the system part should still run within its performance envelope. Insert "Task Manager (Not Responding)" meme.
So we are talking about public/private access specifiers in source code, which only matter in a cooperative setting. But that's, IMO, a highly naive view, as compute, especially an OS, is objectively an adversarial environment. Some actors, at some point, WILL figure out the memory layout and use that in an attack. There have been literally decades of whack-a-mole against bad actors.
I maintain my stance that any fields/members/methods loaded into a userspace program should not be capable of breaking the system.
I'd point fingers towards the electron core devs for this one, and not devs building apps on top of electron (since they likely didn't know that's how electron was doing it).
There are cases where OS companies noticed the use of private APIs and made cleaner public ones (the most obvious was the file system syncing stuff used by Dropbox and others, which used to use private APIs until Apple provided a public one).
And either way, applications shouldn't be able to break the system like this. You can reasonably expect error handling to catch this, even if the error is "a private API was called".
This is on Apple. 90% at least. Maybe 10% on Electron.
If I can walk on land that says private property, it's not private. I'll remember to use that argument when I get ticketed for trespassing.
There are APIs that are explicitly declared verboten for third-parties to use because they aren't intended for outside use. That doesn't make them magically inaccessible, but it does mean that when their unanticipated use breaks things, that's on the people who ignored the warnings.
I agree that this shouldn't be able to have the huge impact that it does and that Apple ought to have made their OS more resilient, but your logic is weak.
> Even in legal matters, it's already the case that laws that aren't enforced are worthless, cf. driving 5-10 mph over the speed limit being normal.
Just because all but one cop on the force ignore people driving over the speed limit doesn't mean the one who pulls you over isn't able to write you a speeding ticket. Try that with a judge. It might work, but the law is very much still enforceable. This isn't like failing to protect a trademark.
Dude. Dudette. Duderino. Did you think this through before you hit post? I'm talking about enforcement. If you're getting a ticket, it's literally being enforced. And if it isn't, you get squatters! Thanks for the point in support, I guess?
I think this is the most braindead knee-jerk HN comment I've ever gotten as a reply, congratulations.
[Ed.: god, please, this genuinely hurts my brain.]
> but it does mean that when their unanticipated use breaks things, that's on the people who ignored the warnings.
Yeah. When it breaks things for them. Not when it breaks the entire OS' UI.
Let's stay with your analogy. Things change, Electron apps break? That's analogous to finally getting around to calling the cops on squatters after dozing on it. Things change, your UI goes belly up due to Electron? That's you deciding to pay the bill for electricity and indoor plumbing for the squatters. No, wait, even better: you decided you finally want to build a new house on your plot, and now have to deal with getting the squatters out first. It'll happen, but you'll have to unnecessarily sink time and money into that. Apple's dealing with evicting Electron off their private APIs. What a nice analogy.
Of course the squatters are technically wrong. But why did you leave your front door open, and neglected and didn't check in for years? The part where you're making it hard for yourself is on you, mate. You're not going to get your lost time back. Why didn't you grab a lock at home depot?
> Just because all but one cop of the force ignore people driving over the speed limit
This is generally policy, not individual cops' discretion.
Wax idealistic all you want, but just imagine the discussion we’d be having if Apple had sigabort-ed all these misbehaving electron apps on day one. “No userland APIs, private or otherwise, should be able to crash your app!!!” Is the argument I would be responding to right now.
> > userspace app could trigger <...> and thus cripple the rest of userspace (or at least the rest of userspace at given priority level), yet the system part should still run within performance envelope
If userspace triggers what is effectively a DoS attack and you cannot log in to a root terminal and get things sorted out, that's a system not designed for an adversarial userspace.
> but just imagine the discussion we’d be having if Apple had sigabort-ed all these misbehaving electron apps on day one
A more general context we are discussing here is resource exhaustion. There are myriads of scenarios where userspace can cause some form of resource exhaustion. My argument is that a 1) well designed 2) general use system should implement resource management in a way (e.g. priority queues) that userspace-inflicted resource exhaustion does not affect performance envelope of the system itself. Otherwise the system is open to unprivileged DoS attacks with only recourse being power cycling.
If your userspace app overcommits memory to some monstrous degree, what should the system do?
1. Enter a swap feedback loop, crippling the system down to unusability.
2. OOM-kill a process based on some heuristics.
3. Freeze userspace, leaving privileged space functional.
The OS got a little slower, that’s it. It was never in some unrecoverable state. One could soft close the offending processes at anytime and regain the lost perf. I’m willing to bet you could hide or minimize the window to mitigate the issue, because the bug is very specific to the layout and render run loop, which auto-pauses on lost visibility by default.
That said, I haven't even noticed the slowdown on my work machine, but I only use Teams. It's always been dog shit slow, just par for the course.
I too have a work-provided laptop and a personal one bought the same month, with identical specs (the only difference is the US vs UK keyboard layout). The work-provided one is at least an order of magnitude slower to do anything thanks to enterprise crapware.
I’ve been debating making a Tumblr-style blog, something like “dumbapple.com,” to catalogue all the dumb crap I notice.
But, like, man - why can't I just use the arrow keys to select my WiFi network anymore? I was able to for a decade.
And the answer, of course, is the same for so much of macOS' present rough edges. Apple took some iPadOS interface elements, rammed them into the macOS UI, and still have yet to sand the welds. For how much we complain on HN about Electron, we really need to be pissed about Catalyst/Marzipan.
Why does the iCloud sign in field have me type on the right side of an input? Why does that field have an iPadOS cursor? Why can't I use Esc to close its help sheet? Why aren't that sheet's buttons focusable?
Why does the Stocks app have a Done button appear when I focus its search field? Why does its focus ring lag behind the search field's animated size?
Where in the HIG does it sign off on unfocusable text-only bolded buttons, like Maps uses? https://imgur.com/a/e7PB5jm
...Anyway.
1. I won't focus on a bunch of Siri items, but one example that always bugs me: I cannot ask Siri to give me directions to my next meeting. The latest OS introduces an answer for the first time, though. It tells me to open the calendar app on my Apple watch, and tap on the meeting, and tap the address. (I don't have an Apple watch.)
2. Mail.app on iOS does not have a "share sheet." This makes it impossible to "do" anything with an email message, like send it to a todo app. (The same problem exists with messages in Messages.app)
3. It is impossible to share a contact card from Messages.app (both iOS and MacOS). You have to leave messages, go to contacts and select the contact to share. Contacts should be one of the apps that shows up in the "+" list like photos, camera, cash, and plenty third party apps.
4. You still have to set the default system mail app in MacOS as a setting in the Mail.app, instead of in system settings. Last I checked, I'm pretty sure you couldn't do this, without first setting up an account in the Mail.app. Infuriating.
You can’t directly share the mail message, but you can “share” selected text or you can use the “print” option to generate a PDF of the message and “share” that instead. Not very discoverable but might cover at least some of what you want to do.
Also not sure if it’s new with iOS 26 but for the contacts thing you can at least skip the “leave messages and search for the contact in the contacts app” part. There’s button in the contact info that will take you directly to the contact in the contacts app. It does feel silly that you can’t share direct from the card in messages though.
I'd love to agree that it's comically amateurish, but apparently there's something about settings dialogs that makes them incredibly difficult to search. It takes Android several seconds to search its settings, and the Microsoft start menu is also comically slow if you try to access control panels through it, although it's just comically slow at search in general. Even Brave here visibly chokes for like 200ms if I search in its preferences dialog... which compared to Android or Windows is instant, but still strikes me as a bit on the slow side considering the small space of things being searched. Although it looks like it may be more related to layout than actual searching.
Still. I dunno why but a lot of settings searches are mind-bogglingly slow.
(The only thing I can guess at is that the search is done by essentially fully instantiating the widgets for all screens and doing a full layout pass and extracting the text from them and frankly that's still not really accounting for enough time for these things. Maybe the Android search is blocked until the Storage tab is done crawling over the storage to generate the graphs that are not even going to be rendered? That's about what it would take to match the slowdown I see... but then the Storage tab happily renders almost instantly before that crawl is done and updates later... I dunno.)
Funny that I'm defending them, but this is not even a papercut in my opinion, while they have far bigger issues.
It should be searching, what, a few hundred strings? What is it doing? Is it making a network call? For what?
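For scale, a naive case-insensitive scan over a few hundred strings is far below anything a human could perceive; a quick sketch:

    import Foundation

    // Back-of-the-envelope check: substring search over a few hundred
    // settings-label-sized strings is effectively instant.
    let labels = (0..<500).map { "Example settings label number \($0) with some descriptive text" }
    let query = "label number 4"

    let start = Date()
    let hits = labels.filter { $0.range(of: query, options: .caseInsensitive) != nil }
    let elapsedMs = Date().timeIntervalSince(start) * 1000
    print("\(hits.count) matches in \(elapsedMs) ms") // typically well under a millisecond

So whatever these settings searches are spending their time on, the string matching isn't it.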
Anyway, barely related, but it does bring into question the quality of modern software.
Interested in collaborating on this? Perhaps a simple open-source static blog built with Astro?
But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.
Which is fine, since it’s exclusively used to watch a kids show for a half an hour a day.
…but it’s also super sad to see a once fantastic piece of kit to degrade so much primarily due to software.
It feels very much like how I imagine someone living in the late 1800's might have felt. The advent of electricity, the advent of cars, but can't predict airplanes, even though they're right around the corner and they'll have likely seen them in their lifetime.
Maybe, but for lots of scenarios even M5 could still benefit from being an order of magnitude faster.
AI, dev, some content scenarios, etc…
* My iPhone as a remote for my Apple TV has randomly stopped deciding it can control the volume - despite the "Now Playing" UI offering an audio control that works.
Their auth screens drive me crazy:
* Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work, stop making me wait.
* Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a Pin Code. Why? Just show me the Pin Code screen. If I can approve from my device, I will.
* Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen. I'm literally using my phone. Just auto-approve.
Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use fingerprint. On boot, I am not allowed to enter a password until I fail with fingerprint. If I use fingerprint to log in on boot, I have to enter my password anyways once logged in to unlock my keychain.
I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.
Of course the thin Apple remote has a way of getting lost, but it has a Find Me feature which locates it pretty well.
It seems to have been degrading for a long time, but for me it's this past year where it's crossed into that threshold Android used to live in, where using the phone causes a physiological response from how aggravating it can be sometimes.
I let my guard down and got too deep into the Apple ecosystem - I know better and always avoided getting myself into these situations in the past, but here I am.
The phone sucks right now - super buggy, and they continue to remove/impose features that should be left as an option to the user. Yes, this has always been the knock on Apple, but I typically haven't had an issue with their decisions - it's just so bad now.
Lesson (re)learned and I will stay away from ecosystems - luckily the damage here is only for media
The minute I can get blue bubbles reliably on an android, I’ll give the pixel a shot again - if that sucks too then maybe I’ll go back to my teenage years and start rooting devices again
I am fully bought into the Apple ecosystem. Not sure yet if I regret it. It is annoying to be so tied down to one company that isn’t going the way I want it to.
There are current workarounds, like using your home Mac as a relay, but nothing super elegant that I know of.
this means that i either use ios or i have to be "that guy" always asking everyone to send something in a different format or to please move the conversation to some other app - no one wants to be that guy - apple's got us right where they want us
and to be honest, when texting other people, it makes a huge difference, believe it or not, if your chat bubbles on their screen are blue vs green. it shouldn't matter - people who would care about this aren't people you would want to talk to anyway blah blah - that's all fun and great but it does matter, unfortunately
IMO it's better than Signal as far as UX goes, with enough privacy to be practical for everyday chat. I actively avoid WhatsApp because Facebook. They recently changed their privacy policy, to no one's surprise.
There were many papercuts, but the keyboard being a hundred times worse than Android is what aggravated me every time I had to use the phone, and the straw that broke the camel's back.
I’ve also been unable to get the remote app on my watch to work at all. It’s hard to imagine people working at Apple don’t also run into these issues all the time.
Apple has a higher duty to their shareholders than to their customers.
Not hating on Apple, just stating the hard economic truth.
PS The Earth isn’t flat. We did go to the Moon. Vaccines don’t cause autism.
From the 2005 iPod settlement [0], to the $113 million Batterygate [1], to Flexgate [2], where Apple only escaped settlement due to plausible deniability.
To quote from Batterygate:
> Apple has agreed to pay millions of dollars to 34 states over its controversial previous practice of deliberately slowing down older iPhones to extend their battery life.
> [...]
> Many believed it was an effort to encourage users to buy new iPhones.
I agree on all your "PS" points, where we seem to differ is that reading is a virtue and not knowing something because you haven't heard of it doesn't constitute a conspiracy theory.
0: https://www.cbsnews.com/news/ipod-class-action-suit-settled/
1: https://edition.cnn.com/2020/11/19/tech/apple-battery-settle...
2: https://www.macrumors.com/2021/07/20/flexgate-class-action-l...
Flexgate is a manufacturing error that they handled in a consumer-hostile way.
Batterygate, was an arguably misguided way to support outdated models - prioritising one goal (battery life) over another (speed)
The iPod thing I’ll admit I know nothing about.
It sounds like, for you, planned obsolescence is defined as any instance where a product isn't manufactured perfectly or degrades over time, regardless of whether it was planned. For me, planned obsolescence should contain at least a hint of planning.
Sad if true. I've felt my M1 Max getting sluggish too lately, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades, even for professional use - form factor, screen quality, battery, etc.
I think they bet a lot of hardware money on AI capabilities, but failed to deliver the software, so there was no real reason to upgrade because of AI features in the chip (which is literally what they boast on the first line of the announcement - yet nobody cares about making more cute faces)
It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing", so even if they didn't believe it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.
A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computer, were it not for software bloat...
Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)
I would love to see a ThinkPad with an M5 running Linux.
I kid, I kid.
Is that good? Their cellular modems have been terrible. I'll reserve judgement until trying one out.
>The M1 itself is so powerful
I think this is a bit of a fallacy. Apple Silicon is great for its performance-to-power-consumption ratio, but something like a Ryzen 9 7945HX can do 3x more work than an M1 Max. And a non-laptop chip, like an Intel Core Ultra 7 265K, can do 3.5x.
https://browser.geekbench.com/processors/amd-ryzen-9-7945hx
https://browser.geekbench.com/processors/intel-core-ultra-7-...
https://browser.geekbench.com/macs/macbook-pro-16-inch-2021-...
https://browser.geekbench.com/macs/macbook-pro-16-inch-2024-...
Performance claims:
https://www.ookla.com/articles/iphone-c1-modem-performance-q...
Energy claims:
https://appleinsider.com/articles/25/02/27/apples-c1-modem-b...
I also installed the iOS 26 update recently. The competitive advantage of software polish that Apple had seems totally gone.
Add to that bugs in iCloud, AirDrop... I don't think I will be buying any more Apple devices for myself.
Disclaimer: I actually like a bit of "bling", but both Tahoe and iOS are so filled with glitches and errors, and the UX is so bizarrely inconsistent, that it really is catastrophically bad.
Naturally Apple, as the sole survivor of the platforms that weren't IBM PCs, appears to be the one with the vertical-integration approach.
I've been hearing this since the Intel 486DX times, and
> Nobody will ever need more than 640K of RAM!
Amusingly enough, adding more ports could do it.
Devices get slower for no perceivable reason, when in reality software at all levels makes higher assumptions about how much power you have, and squanders it more readily.
I am undecided in my thoughts about how malicious this is. Do people think that it is something like wanting to cram more features into the operating systems, and they are careless how it affects the earliest supported models? Or do most people think it is planned obsolescence?
Apple generally offer updates longer than Android, so is it more pronounced on iPhones than Android phones? I remember seeing similar slow-downs on Android phones in the past.
Apple generally offer updates for iOS for less time than Windows. I don't really have a feel for the difference between the two in terms of how much new versions slow down older hardware.
Obviously separating feature updates and security updates would be a way to address, and it's not possible that no one at Apple has considered that idea. They are a business and selling new products is unfortunately a disincentive pushing them away from doing that.
They did make a mistake, though: they should have been up-front about it. They should have advertised it rather than hiding it away.
Not going to happen, but I can dream.
I think many IT departments will be thankful for that as Wifi behaviour can be challenging and hopefully will lower ticket counts.
[1] Which I still firmly believe WAS indeed a power-supply design failure that would have forced a massive hardware recall had they not done something (slowing down the OS). I believe it encompassed everything from inaccurate CPU power estimates to something actually incorrect with the PCB design, causing brownouts - and not merely a battery-aging red herring, as is the reported scandalous reason they were "caught". In fact, I think Apple is GLAD that all it amounted to was some philosophical hullabaloo about protecting your poor aging battery.
To clarify, I suspect the "aging battery" merely exposed the real issue - the incorrect PS design - which Apple successfully covered up.
Quit the Dropbox app, it’s electron, and it’s brand spanking new
WatchOS 26 has rendered my Apple Watch almost useless. It's gone from lasting a whole day including 2 cycling 'workouts' for my commute and the occasional lunch time run (or gym session before work) to now being at 40% battery by the time I make my mid-morning coffee and dead before I get home.
I don't use most of the 'smart' features anyway - I'm mostly using the fitness features - so I'll probably switch to a Garmin at some point.
If that's your use case, I can absolutely recommend getting one. I have a Forerunner 745 and it works great for workouts alongside some smart functions like NFC payments, quick-replies to texts, etc. The battery lasts for days as well, which you can't really beat.
The Garmin Instinct 2X's (and 3) battery lasts for 40 days in smartwatch mode, not counting the solar charging.
The Instinct is an "outdoor watch" with a monochrome display, but it has most features the Forerunners have.
- Suunto (20 to 30 days in smartwatch mode for the Verticals, optional solar charging, flashlight on the Vertical 2)
- Coros (2 to 3 weeks depending on the model), no flashlight
- Withings (30 days, looks like a regular watch)
Coros is good for how long they support their watches, and the fact that they don't restrict features in lesser models. Suunto is great for route planning. Polar is renowned for its training metrics (sleep, recovery etc.) but only fetches a week in smartwatch mode.
Edit: Same experience with iPhone X
Edit2: I still remember the feeling when I got them initially - that Apple is on the customer's side - but now I feel totally helpless and I'm being forced to upgrade.
I've got a reference MacBook Air from 2015 which is almost clean; only Zoom, Teams and Chrome (for Meet) are installed, and it's used for calls. And boy, do I regret installing macOS updates. I can believe Teams and Zoom are shitbags of modern software slop and thus started to fail at running simple video calls. But even native macOS apps that have barely been updated for years, like Notes and Calendar, are freezing now. So I can conclude that these anti-backward-compatibility updates are highly intentional, because the hardware is absolutely fine for a decade; I even used this ultra-tiny Air for travel work back in 2022, and it was still capable of doing all the office things and thin-client work. But last year it just turned into a pumpkin.
My question is - maybe installing Linux can help bring it back to life?
That really is a reason for me to skip this upgrade and wait for the next release.
They are so scared about cannibalizing mac/ipad sales - they really really want people to own both.
This gets rid of the slow animations, inconsistent window cornering, and other annoyances.
Then (so menus aren't transparent and unreadable): System Settings > Accessibility > Display > Reduce Transparency
If you do those two things your machine should look and feel normal again. I've been running an M1 Max since 2021 and Tahoe was simply a disaster. Removing the glass layer made everything feel good again.
If for some reason you ever want the bad performance and glass back, you change the YES to NO in the Terminal command. Maybe someday it won't suck.
I really don't like what it does to certain transparent drop downs for certain apps as well as the control center.
But It does seem nice to do it on a per app basis (ie "defaults write com.apple.finder com.apple.SwiftUI.DisableSolarium -bool YES")
https://tidbits.com/2025/10/09/how-to-turn-liquid-glass-into...
I personally have no idea but I seem to recall the golden age of open source/unix embrace was under Serlet
"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."
And by making Apple AI (which is something I do not use, for many reasons, but mainly because of climate change) their focus, I am afraid they are losing focus and making their operating systems worse.
For instance, Liquid Glass, the mess I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess. An alpha release in my opinion, which I feel was a distraction from their lack of a robust AI release.
So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?
I am currently attempting to sell my iPhone 16E and my M1 Macbook Air to move back to Linux because of all of this.
GPT5 is probably cheaper in the sense that gpt5-nano is at least as capable as 3.5 while costing less, but the newer "normal" models are more expensive, and that's what people are generally going to be using.
Are they really doing that? Because if it's the case they have shockingly little to show for it.
Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.
At that point to me, it seems that good SoC and a captive audience in the US are pretty much all they have remaining and competition on the SoC part is becoming fierce.
But I think $500 billion is a lot of money for AI:
Apple accelerates AI investment with $500B for skills, infrastructure
https://www.ciodive.com/news/Apple-AI-infrastructure-investm...
Imagine using that $500 billion for the operating system and squashing bugs or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?
I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.
If I can't search my Apple Mail without AI, why would I trust AI?
Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade, they are not owed the benefit of the doubt
Also, I like researching things old school how I learned in college because I think it leads to unintended discoveries.
I do not trust the source you linked to. It is an organization buried under organizations for which I cannot seem to find their funding source after looking for a good 15 minutes this morning. It led me back to https://ev.org/ where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:
https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...
Besides "Effective Altruism" makes no sense to me. Altruism is Altruism IMO.
Altruism: unselfish regard for or devotion to the welfare of others
There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.
But the organization stinks as some kind of tech propaganda arm to me.
I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57, while the number for the USA is 14.3, so reading this sentence in that article - "The average American uses ~50,000 times as much water every day..." - surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.
[0]: https://ourworldindata.org/grapher/co-emissions-per-capita
Huh. This one baffles me.
Of course, are those same users always running their screens super dim? Are they using pen + paper instead of typing whenever they can?
Maybe they are in the USA - every little thing counts there.
So, to keep this on point: Apple making a faster chip is not a concern on my climate change agenda - it's anything but negative.
There will always be local-first use-cases, but it's also possible that, you know, we're already near the global maximum of those use-cases, and the local AI coprocessors we've had can handle them fine. This would be a severe shock to my perceived value of Apple right now, because my view is: their hardware division is firing on all cylinders and totally killing it. But when you're putting supercomputers into the iPad... maybe that doesn't actually matter. Meanwhile, their software is getting worse every year that goes by.
"The neural engine features a graphic accelerator" probably M6
Gaming on mac is indeed lacking, but that's really not the reason.
If they're studios, you can have stacks of M5 Max Macs.
>Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.
The asterisks are really icing on the cake here.
---
[1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...
Yesterday’s hype is today’s humility.
(Emphasis added)
When a company (or most people) today (now) says “AI”, they are not referring to the area of study traditionally called artificial intelligence. They are talking exclusively about transformers or diffusion.
https://web.archive.org/web/20251010205008/https://www.apple...
[1] https://www.apple.com/us-edu/shop/buy-mac/macbook-pro/14-inc...
I'm sure it's a perfectly fine daily driver, but you have to appreciate the irony of a massive chip loaded to the gills with matrix multiplication units, marketed as an amazing AI machine, and yet so hobbled by mem capacity and bandwidth.
But never, ever, through not shipping incremental hardware bumps every year regardless of whether there's anything really worth shipping.
And it's things like not including a charger, cable, headphones anymore to reduce package size, which sure, will save a little on emissions but it's moot because people will still need those things.
Hardware longevity and quality are probably the least valid criticisms of the current Macbook lineup. Most of the industry produces future landfill at an alarming rate.
Logos is King
https://security.apple.com/blog/memory-integrity-enforcement...
1. CPU, via SIMD/NEON instructions (just dot products)
2. CPU, via the AMX coprocessor (entire matrix multiplies, M1-M3)
3. CPU, via SME (M4)
4. GPU, via Metal (compute shaders + simdgroup_matrix + MPS matrix kernels)
5. Neural Engine, via CoreML (advisory)
Apple also appears to be adding a “Neural Accelerator” to each core on the M5?
A Mac Quadra in 1994 probably had floating point compute all over the place, despite the 1984 Mac having none.
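As a concrete example of path 2 in the list above: a plain BLAS call through Accelerate is all the caller writes, and whether it actually lands on the AMX/SME units is an internal Accelerate decision, not something you opt into (my understanding, at least):

    import Accelerate

    // Single-precision GEMM via Accelerate's BLAS interface. On Apple Silicon,
    // Accelerate decides how to dispatch the work; the caller just sees a
    // standard cblas_sgemm.
    let m = 4, n = 4, k = 4
    let a = [Float](repeating: 1.0, count: m * k)   // m x k
    let b = [Float](repeating: 2.0, count: k * n)   // k x n
    var c = [Float](repeating: 0.0, count: m * n)   // m x n result

    cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                Int32(m), Int32(n), Int32(k),
                1.0, a, Int32(k),
                b, Int32(n),
                0.0, &c, Int32(n))

    print(c) // every entry should be 8.0 (1.0 * 2.0 summed over k = 4)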
I know the multitasking nature of the OS probably makes this situation happen across different programs anyway, but it would nonetheless be pretty cool!
The "neural accelerator" is per GPU core, and is matmul. e.g. "Tensor cores".
- https://machinelearning.apple.com/research/neural-engine-tra...
- https://machinelearning.apple.com/research/vision-transforme...
Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.
I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.
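For what it's worth, the app-facing side is tiny; a minimal sketch with the FoundationModels framework (macOS 26 / iOS 26 - and which hardware block actually runs it, ANE, GPU or CPU, is the OS's call, not the caller's):

    import FoundationModels

    // Minimal use of Apple's on-device foundation model. Where it executes
    // (Neural Engine, GPU, CPU) is decided by the system, not by this code.
    let session = LanguageModelSession()
    let answer = try await session.respond(to: "Summarize why unified memory matters for local models.")
    print(answer.content)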
https://www.google.com/url?sa=t&source=web&rct=j&opi=8997844...
AMD is likely to back away from this IP relatively soon.
The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.
>tensor units on the GPU
The M5 / A19 Pro are the first chips with so-called tensor units. e.g. matmul on the GPU. The ANE used to be the only tensor-like thing on the system, albeit as mentioned designed to be super efficient and for very specific purposes. That doesn't mean Apple is going to abandon the ANE, and instead they made it faster and more capable again.
What work product? Who is running models on Apple hardware in prod?
You don't have to believe this. I could not care less if you don't.
Have a great day.
If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.
You seem set on MLX and apparently on your narrow view of what models are. This discussion was about ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, but that from an inference perspective most deployments are CoreML, which will automatically use ANE if the model or some subset fits (which is actually fairly rare as it's a very limited -- albeit speedy and power efficient -- bit of hardware). These are basic facts.
>how iOS users actually use their phone.
What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.
correct and non-controversial
> An enormous number of people and products [use CoreML on Apple platforms]
non-sequitur
EDIT: i see people are not aware of
That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.
As you said - it won’t help previous generations, though since last year (or two??) all macs start with 16GB of memory. Even entry level macbook airs.
The latter has up to 128GB of memory?
But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries etc. into account. It can't do that as nicely as long as customers opt for only 256GB or 1TB devices due to cost.
That's the only way to speed up MLX 4x compared to M4.
If anything, these refreshes let them get rid of the last old crap on the line for M1 and M2, tie up loose ends with Walmart for the $599 M1 Air they still make for ‘em, and start shipping out the A18 Pro-based Macbooks in November.
M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?
Both the iPad and MBP M5 [2][3] say "delivering up to 3.5x the AI performance". But in all the AI examples (in [3]), they are 1.2-2.3x faster than the M4. So where is this 3.5x coming from? What tests did Apple do to show that?
---
1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...
2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...
3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...
I'll believe the benchmarks, not marketing claims, but an observation and a question.
1. The AMD EPYC 4585PX has ~89GB/s, with pretty good latency, as long as you use 2x DIMMs.
2. How does this compare to the memory bandwidth and latency of M1,M2,M3,M4 in reality with all of the caveats? It seems like M1 was a monumental leap forward, then everything else was a retraction.
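If you want a very rough feel for it on whatever Mac you're holding, a crude single-core probe is easy to write (build with -O; proper STREAM-style benchmarks and multi-core runs will report substantially higher numbers, and those are the ones worth comparing against the EPYC figure):

    import Foundation

    // Crude bandwidth probe: stream-sum a buffer much larger than the last-level
    // cache and divide bytes read by wall time. Single-core, naive loop, so this
    // is a floor, not the headline unified-memory bandwidth.
    let count = 512 * 1024 * 1024 / MemoryLayout<Double>.size   // ~512 MB of doubles
    let buffer = [Double](repeating: 1.0, count: count)

    let start = Date()
    var sum = 0.0
    for value in buffer { sum += value }
    let seconds = Date().timeIntervalSince(start)
    let gbPerSecond = Double(count * MemoryLayout<Double>.size) / seconds / 1e9
    print("sum=\(sum)  ~\(gbPerSecond) GB/s (single core, naive)")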
Snow Leopard still remains the company's crown achievement. 0 bloatware, 0 "mobile features on desktop" (wtf is this even a thing?), tuned for absolute speed and stability.
Ironically I can still run old 32-bit Windows software in Wine on my M1 Mac. Windows software is more stable on a Mac than Mac software.
And the hardware isn't a showstopper anyway. Apple did x86-64 on AS, Windows' WoW64 does x86-32 on ARM-32 or even IA-64, and I'll bet Windows will do x86-32 on x86-64 if Intel ever drops the 32 mode. Wine 32on64 will run x86-32 on AS already.
If you don’t think Windows is a bloated mess, look up all of the different ways you have to represent a “string” depending on the API you are calling.
Every bit of backwards compatibility increases the testing surface and the vulnerabilities. In fact, there was an early bug in Windows NT where you could encode DOS shell commands in the browser URL bar from a client and they would run with admin privileges if the server was running IIS.
Should Apple have also kept 68K emulation around? PPC?
In Windows they took things a bit too far by not only supporting old stuff but also treating it as first-class. If software is too outdated, it's fair to stick it behind some compat layer that makes it slower, as long as it still runs. But that's not even the biggest problem with Windows, it's Microsoft turning it into adware, also not being Unixlike in the first place.
To answer your last question, yes for PPC at least. 68K is too old to matter. Emulation layer doesn't need to hold back the entire system. If it means less dev resources to spend making glass effects and emojis, fine.
Yes? What kind of mercurial clown world do you live in, where you pay for software and then cheer when it's yoinked off your computer in an OTA update?
Even Windows users aren't whipped enough to lick their OEM's boot like that, Jesus. You'd hope Mac users would still have a spine; Apple doesn't maintain macOS as a charity, you're allowed to disagree with them.
- A 68K emulator
- A PPC emulator
- A 32-bit x86 emulator
- A 32-bit ARM emulator (since ARM chips don't have hardware to run 32-bit code)
And to think that Windows is a shining example of good operating system design.
Why not include a 65C02 emulator also so you can run AppleWorks 3.0 from 1986?
However, I will absolutely say Windows users have higher expectations from Microsoft than what Mac customers demand from Apple. Macs would get removed by force from many of the places that rely on Windows in professional settings like render farms, factory automation, and defense. There is absolutely zero tolerance for Apple's shenanigans there, and Apple offers those customers no products to take their needs seriously, unlike Microsoft. It's not a coincidence that Apple has zero buy-in outside the consumer market, not a single professional customer wants what Apple is selling if Nvidia or AMD will do the same thing with less-petty software support. We all know why products like XServe failed, poor Apple had too much pride to support the software that the industry had actual demand for.
While we're talking about software darwinism, I think you need to hear this; Darwin objectively sucks from a systems design standpoint, it's why nobody uses XNU unless they're forced to. It's empirically slow, deliberately neutered for third-parties, the user-exposed runtime is loaded with outdated/unnecessary crap and BSD tooling that won't work with industry-standard software, the IPC model is not secure (fight me), the capabilities are arbitrarily changed per-OS, filesystem security is second-rate like Windows/Bitlocker, the default install is bloated with literal gigabytes of deadweight binaries, both LLB and iBoot are mandatory NSA slopware blobs, and their SDK commitment is more fickle than developers playing Musical Chairs.
None of these kernels are good, but XNU is unique in that it is completely disposable to humanity and possesses no remaining valuable features. If macOS stopped working tomorrow, there would be no disruption to any critical infrastructure around the world. If Linux or Windows had a Y2K moment, we'd be measuring the deaths by the thousands. I'm willing to give Apple their due, but you refuse to admit they're lazy - "since ARM chips don't have hardware" my ass, on "hacker" news of all places...
Consider how shitty the x86 Windows experience is compared to modern Macs - poor battery life, loud, slow and hot - I’m really surprised at how little Windows users expect from their computers.
As far as the Arm based Windows computers, the x86 emulator is slower than Macs running x86 code and the processors are worse.
And are you really saying ARM based Macs, iPhones and iPads are slow?
You seem to want the Mac to be the equivalent of the “HomerMobile”.
No professional is buying Macs? You think that video and audio professionals as well as developers are really saying “we really want Windows computers” or did I miss the “Year of the Linux desktop”?
Apple has had no such culture internally and they sure as heck don't emphasise backward compatibility to their customers (users or otherwise). If anything, Apple prods and nags developers to stick to the latest SDK/platform APIs, shoves the burden of software compatibility and maintenance onto them, and hand-waves away breaking changes as part and parcel of membership in the Apple ecosystem. This attitude can be traced back to the Steve Jobs era at Apple. It's definitely not new, and comparing what Microsoft does with software and backward compatibility and expecting Apple to do the same is not fair - they really are different companies.
Web partially fixed this, but only by accident, because Apple isn't for the web. And if I cared at all about video games or were doing certain fields of work (maybe creative tools now that Apple even lost that hegemony), that'd take me off the Mac. Somehow the Mac 3P software scene is even worse now than in the PPC era. And Microsoft is now testing just how annoying Windows can be without people leaving, answer is a lot.
Apple is limiting their reach so much, for reasons I still can't rationalize. Some basic level of backwards compatibility or at least more stable APIs should be possible without sacrificing the good things. I've done some iPhone and Mac dev too, it sucks in every possible way, and I get why people trust it so little that they'd rather shoehorn a web app into a native shell.
They are not the hardware provider like nvidia, they don’t do the software and services like OpenAI or even Microsoft/oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things but the only way to showcase them is through their phone which ironically enough feels like not the best path for apple.
Apple’s best option is to put llms locally on the phone and claim privacy (which is true) but they may end up in the same Siri vs others situation, where Siri always is the dumber one.
This is interesting to see how it plays out
This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple market on, as they can't monetize it. However, as time goes on and memory and compute power gradually decrease in price, and also maybe as local LLMs continue to increase in ability (?) it may become more and more relevant.
In theory this would be where qualcomm would come in and provide something but in practice they seem to be stuck in qualcomm land where only lawyers matter and actual users and developers can get stuffed.
The only well supported devices are either phones or servers with very little in between.
Even common consumer devices like wifi routers will have ARM SoCs pinned to a specific kernel version that gets supported for 1 to 2 years at most.
And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state).
Curiously I found it a breeze since it didn't require digging out a flashable boot medium and pointing your BIOS to it. Calling a script from your normal desktop environment and having it automatically boot into the installer was really nice.
> with the partitions already in a good state)
What's this about? The script takes care of resizing the macOS partitions and creating new ones for Linux.
In the end I did a factory reset of the whole macbook and then I could reinstall Asahi.
However, I have been disappointed by Apple too many times (they wouldn't replace my keyboard despite their highly-flamed design-faux-pas, had to replace the battery twice by now, etc.)
Two years ago I finally stopped replacing their expensive external keyboards, which I used to buy every year or two (due to broken key hinges), and I have been incredibly positively surprised after getting used to the MX Keys. Much better built, incredible mileage for the price. Plus, I can easily switch and use them on my Windows PC, too.
So, about the Macbook — if I were to switch mobile computing over to Windows, what can I replace it with? My main machine is still a Mac Mini M2 Pro, which is perfect value/price. I like the Surface as a concept (replaceable keyboards are a fantastic idea; the battery, however, is super iffy), and I've got a Surface Pro 6 around, but it's essentially the same gloss-premium I don't need for my use.
Are there any much-cheaper but somewhat comparable laptops (12h+ battery, 1 TB disk, 16-32GB RAM, 2k+ Display) with reasonable build quality? Does bypassing the inherent premium of all the Apple gloss open up any useful options? Or is Apple actually providing the best value here?
Would love to hear from non-Surface, non-Thinkpad (I love it, but) folks who've got some recommendations for sub $1k laptops.
Not my main machine, but something I take along train rides, or when going to clients, or sometimes working offsite for a day.
But its really only capable of high performance in short bursts because of the extremely small thermal mass.
I got a 32GB 15Z90RT for $900 shipped from eBay.
Storage  | CPU
≤ 512GB  | 3 P-cores (and 6 E-cores)
1TB+     | 4 P-cores (and 6 E-cores)
https://www.apple.com/ipad-pro/specs/

So this made at least some sense.
I guess yields might be good enough that they can afford to bin with another core in there as well.
Memory is probably still the main reason for binning in the first place.
For example: https://forums.macrumors.com/threads/m4-mbp-ssd-speeds.24422....
Compare with: https://www.tomshardware.com/pc-components/ssds/crucial-t710....
While it is not exactly an apples-to-oranges comparison, the T710 seems ~80% faster for writing big files, and at $279.99 - $299.99 for 2TB it is still much cheaper than whatever Apple is offering.
If you have a better reference (especially if there is something that is cross-platform), I would be interested.
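Not a proper benchmark tool, but as a rough cross-platform reference you can time a big sequential write yourself. A minimal Python sketch (the file name and sizes are arbitrary placeholders, and the fsync matters so the OS page cache doesn't flatter the result):

    # Rough, cross-platform sequential-write check (not a rigorous benchmark).
    import os, time

    PATH = "testfile.bin"        # placeholder scratch file on the drive under test
    CHUNK = 64 * 1024 * 1024     # 64 MiB per write
    TOTAL = 8 * 1024 ** 3        # 8 GiB total

    buf = os.urandom(CHUNK)      # incompressible data, in case the SSD compresses
    start = time.perf_counter()
    with open(PATH, "wb") as f:
        written = 0
        while written < TOTAL:
            f.write(buf)
            written += CHUNK
        f.flush()
        os.fsync(f.fileno())     # force data onto the device before stopping the clock
    elapsed = time.perf_counter() - start
    print(f"{TOTAL / elapsed / 1e9:.2f} GB/s sequential write")
    os.remove(PATH)

Tools like fio will give far more trustworthy numbers, but something like this at least runs identically on both platforms.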
The existing neural engine's function is to maximize power efficiency, not flexible performance on models of any size.
It's an improvement, nomenclature-wise.
(Perhaps it would be safer to wait for The Next Generation?)
Open up the YouTube app and try to navigate the UI. It’s okay but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it like a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, and the phone won’t notice, and it gets completely desynchronized.
More importantly for games, though, is the awful storage architecture around the TV boxes. Games have to slice themselves up into 2GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on-demand.
It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.
Apple is actively hostile to how you would build for Linux or PC or console.
If you are building your engine/game from scratch, you absolutely do not need to use Xcode
Nonetheless that’s a small fraction of the time spent actually developing the game.
That makes it a continuous headache to keep your Mac builders up.
It means you need to double dev hardware costs or more as you need a gaming PC to target your core audience and Macs handle the mac bugs.
It means your mac build machines are special snowflakes because you can't just use VMs.
The list goes on and on of Mac being actively hostile to the process.
Just Rider running on a Mac is pleasant sure, but that's not the issue.
Having to use xcode "for the final build" is irrelevant to the game development experience.
Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:
/System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext
Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.
macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should still be able to run Xcode 27, though the platform removal may break the usual pattern of supporting the previous year's OS).
I think OP means virtualizing on something that isn't Apple.
You can get an xcode building for arm Macs on PC hardware with this?
- Linux: windows and Linux vm.
- Apple: windows, Linux, Apple VM.
Seems pretty straightforward.
I am being facetious. You'll have a PC for gamedev because that's your biggest platform unless you are primarily switch or PS5, in which case you'll have a devkit as well as a PC. But the cost of an Apple device is insignificant compared to the cost of developing the software for it.
So it really comes down to the market size and _where they are_. The games I play are either on my PS5, or on my Mac, never both. For any specific game, they are on one or the other. Ghost of Tsushima is on the PS5. Factorio is on my Mac. If I were an indie game developer, I'd likely be developing the kind of game that has a good market on the Mac.
- have to pay Apple to have your executable signed
- poor Vulkan support
The hardware has never been an issue, it's Apple's walled garden ecosystem.
As a game developer, I have to literally purchase Apple hardware to test rather than being able to conveniently download a VM
Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows. Especially for consoles like XBOX One or newer, and PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.
Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?
I don’t think anybody does this. I haven’t heard about official emulators for any of the mainstream consoles. Emulation would be prohibitively slow.
Developers usually test on dedicated devkits which are a version of the target console (often with slightly better specs as dev builds need more memory and run more slowly). This is annoying, slow and difficult, but at least you can get these dev kits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.
For testing, I can do a large amount of testing in a VM for my game. Maybe not 100% and not full user testing but nothing beats running on the native hardware and alpha/beta with real users.
Also, since I can pass through hardware to my VM I can get quite good performance by passing through a physical GPU for example. This is possible and quite straightforward to do on a Linux host. I'm not sure if it's possible using Parallels.
i am obviously misunderstanding something, i mean.
Sure, I'm not doing performance benchmarking and it's just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross platform support.
Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.
However, these days it's possible pass-through hardware to your VM so I would be able to pass through a 2nd GPU to MacOS...if it would let me run it as a guest.
so a mac port, even if simple, is additional cost. there you have the classic chicken and egg problem. the cost doesn't seem to be justified by the number of potential sales, so major studios ignore the platform. and as long as they do, gamers ignore the platform
i've seen it suggested that Apple could solve this standoff by funding the ports, maybe they have done this a few times. but Apple doesn't seem to care much about it
It even has "for this mac" preset which is good enough that you don't need to tinker with settings to have decent experience.
The game is paused, almost like becomes "frozen" if it's not visible on screen which helps with battery (it can be in the background without any noticeable impact on battery and temperature). Overall way better experience than I expected.
The communication bandwidth you can achieve by putting CPU, GPU, and memory together at the factory is much higher than having these components separate.
Sad for enthusiasts, but practically inevitable
And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed with any upcoming OS version. A significant effort just to address a tiny market.
IIRC developers literally got 15 years of warning about that one.
But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.
Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough, Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.
Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.
Not possible without forking the OS. No amount of money can make software development faster forever.
https://en.wikipedia.org/wiki/The_Mythical_Man-Month
Especially because Apple has a functional design which means there is nearly no redundancy; there's only one expert in any given field and that expert doesn't want to be stuck with old broken stuff. Nor does anyone want software updates to be twice as big as they otherwise would be, etc.
> Security was never a realistic excuse considering how much real zombie code still exists in macOS.
Code doesn't have security problems if nobody uses it. But nothing that's left behind is as bad as, say, QuickTime was.
nb some old parts were replaced over time as the people maintaining them retired. In my experience all of these people were named Jim.
Oh, my apologies to their expert. I had no idea that my workload was making their job harder, how inconsiderate of me. Anyone could make the mistake of assuming that the Mac supported these workloads when they use their Mac to run 32-bit plugins and games.
I still don't get this. Apple is a trillion dollar company. How much does it cost to pay a couple of engineers to maintain an up to date version on top of Metal? Their current implementation is 4.1, it wouldn't cost them much to provide one for 4.6. Even Microsoft collaborated with Mesa to build a translation on top of dx12, Apple could do the same.
Has anyone figured out what exactly the crux of their beef is? OpenGL 4.1 came out in 2010, so surely whatever happened is settled by now.
1: https://www.facebook.com/permalink.php?story_fbid=2146412825...
Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.
Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.
It was only the intervention of Microsoft that managed to save Apple from their own tantrum.
[1] https://ruoyusun.com/2023/10/12/one-game-six-platforms.html#...
Now...something like minecraft or SubNautica? The M4 is fine, especially if you're not pushing 4k 240hz.
Apple has been pushing the gaming experience for years (iPhone 4s?) but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.
If you identify as a "gamer" and are in those communities, then you'll see communities talking about things you can't natively play
but if you leave niches you already have everything
and with microtransactions, Apple ecosystem users are the whales. Again, not something that people who identify as "gamers" want to admit being okay with, but those people are not where game-production revenue comes from.
so I would say it is a missed opportunity for developers that are operating on antiquated calculations of MacOS deployment
It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/
macOS is supported by one title (DOTA 2). Windows supports all 10, Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue to them about missed revenue opportunities then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.
If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.
And don’t forget they made a VR headset without controllers.
Apple doesn’t care about games
Kind of? It does support higher refresh rates, but their emphasis on "Retina" resolutions imposes a soft limit because monitors that dense rarely support much more than 60hz, due to the sheer bandwidth requirements.
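To put rough numbers on that, here is a quick back-of-the-envelope sketch of my own. It ignores blanking intervals and DSC compression, so treat the figures as floors rather than exact link requirements:

    # Uncompressed video bandwidth for some "Retina"-class resolutions (10-bit RGB).
    def gbits_per_s(width, height, hz, bits_per_pixel=30):
        return width * height * hz * bits_per_pixel / 1e9

    for name, w, h in [("4K UHD", 3840, 2160), ("5K", 5120, 2880), ("6K", 6016, 3384)]:
        for hz in (60, 120):
            print(f"{name} @ {hz} Hz: ~{gbits_per_s(w, h, hz):.0f} Gbit/s")

    # For reference: DisplayPort 1.4 (HBR3 x4) carries ~25.9 Gbit/s of payload,
    # DP 2.1 UHBR20 about ~77 Gbit/s, so dense panels at 120 Hz get expensive fast.

5K at 60 Hz already roughly saturates DP 1.4, which is a big part of why high-refresh Retina-density monitors stayed rare until DSC and DP 2.x became common.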
Ports to macOS have not done well from what I've heard. Meanwhile, ports on PC do really well and have encouraged studios like Sony and Square Enix to invest more in PC ports, even much later, after the console versions sell well. There are just not a lot of reasons to add the tech debt and complexity of supporting the Mac as well.
Even big publishers like Blizzard, who had been Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides like Mac-specific issues; if those are not critical they get put in the pile with the rest of the bugs.
I know we are a few major scientific breakthroughs away from that even being remotely possible, but it sure would be nice.
Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.
So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?
I am deeply concerned all the performance benefits of the new chips will get eaten away.
This is 4-6x faster in AI for instance.
In GPU performance (probably measured on a specific set of tasks).
Spin up ollama and run some inference on your 5-year-old intel macbook. You won't see 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.
"Look how many times faster our car is![1]"
[1] Compared to a paraplegic octogenarian in a broken wheelchair!"
[1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.
I use both an M1 Max and an M3 Max, and frankly I do not notice much difference in most stuff if you control for the core count. And for running LLMs they are almost the same performance. I think from M1 to M3 there was not much of a performance increase in general.
First line on their website:
> M5 delivers over 4x the peak GPU compute performance for AI compared to M4
It's the GPU not the CPU (which you compare with your old Intel) and it's an AI workload, not your regular workload (which again is what you compare)
No. You exit the mail app -> Go to settings -> apps -> scroll through a massive list (that you usually just use for notification settings btw) to go to mail -> mail accounts -> add new account.
Just a simple six-step process after you’ve already hunted for it in the mail app.
You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.
Hardware has improved significantly, but it needs software to enable me to enjoy using it.
Apple is not the only major company that has completely abandoned the users.
The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.
At our company we used to buy everyone MacBook Pros by default.
After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order MacBook Airs for new employees.
I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.
“ARM architecture” in the sense it’s used by Apple is just an ISA. The ISA obviously has some effect on power consumption (e.g. avoiding complex CISC decode). But in reality, by far the most significant driver of CPU efficiency and power consumption is process node.
I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)
The only thing it can’t do is run Ableton in a low latency way without strongly changing the defaults
You press a key on the keyboard to play a note and half a second later you hear it
Other than that, zero regrets
something’s off with your setup.
Regular MBs are not really a thing anymore. You mean Airs?
>Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?
They wrote:
> Together, they deliver up to 15 percent faster multithreaded performance over M4
The problem is comprehension, not marketing.
“M5 delivers over 4x the peak GPU compute performance for AI”
In this situation, at least, it’s just referring to AI compute power.
Did they claim 4x peak GPU compute going from the M3 to M4? Or M2 to M3? Can you link to these claims? Are you sure they weren't boasting about other metrics being improved by some multiplier? Not every metric is the same, and different metrics don't necessarily stack with each other.
However the marketing claims did not state an across the board weighted performance increase over M4 and certainly by reading the claims one would not assume one that large. Instead the claims state performance gains in specific benchmarks, which is relevant to common modern workflows such as inference. The closest benchmark stated to general purpose computing is the multicore CPU performance increase, which the marketing puts at 15% over M4.
As for that large leap in GPU-driven AI performance, this is on account of the inclusion of a "Neural Accelerator" in each GPU core, which is an M5 specific addition and is similar to changes introduced in the A19 SoC.
The disconnect here is that you can't read. Sorry, no other way to say it.
Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.
But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.
For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.
Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.
Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on an Intel MacBook Air was a total pipe dream. Now a base spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.
A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!
> I can say with utmost certainty that it isn't 4000x faster
The numbers you provided do not come to 4000x faster (closer to 2400x)
> Why is actual user experience not able to keep up with benchmarks and marketing?
Benchmarks and marketing are very different things, but you seem to be holding them up as similar here.
The 5x 6x 4x numbers you describe across marketing across many years don't even refer to the same thing. You're giving numbers with no context, which implies you're mixing them and the marketing worked because the only thing you're recalling is the big number.
Also, every M-series chip has been a HUGE advancement over the past in GPU terms. Most of the "5x" performance jumps you describe are in graphics processing, and the "Intel" they're comparing it to is often an Intel iGPU like the Iris Xe or UHD series. These were low end trash iGPUs even when Apple launched those Intel devices, so being impressed by 5x performance when the M1 came out was in part because the Intel Macs had such terrible integrated graphics.
The M1 was a giant jump in overall system responsiveness, and the M-series seems to be averaging about a 20% year over year meaningful speed increase. If you use AI/ML/GPU, the M-series yearly upgrade is even better. Otherwise, for most things it's a nice and noticeable bump but not a Intel-to-M1 jump even from M1-to-M4.
Unless you're looking at your MacBook running LM Studio you won't be seeing much improvement in this regard.
They say "M5 offers unified memory bandwidth of 153GB/s, providing a nearly 30 percent increase over M4" but my old Macbook M2 Max have 400GB/s
Now if only Apple would sell these for use outside of their walled garden.
From: https://www.theregister.com/2012/05/03/unsung_heroes_of_tech...
"> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.
> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.
> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."
> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt."
Not for Mac mini?
Seriously, can’t you tell me about the CPU cores and their performance?
Whether you're playing games, or editing videos, or doing 3D work, or trying to digest the latest bloated react mess on some website.. ;)
- M1 | 5 nm | 8 (4P+4E) | GPU 7–8 | 16-core Neural | Memory Bandwidth: 68.25 GB/s | Unified Memory: 16 GB | Geekbench6 ~2346 / 8346
- M2 | 5 nm (G2) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2586 / 9672
- M3 | 3 nm (first-gen) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2965 / 11565
- M4 | 3 nm (second-gen) | 10 (4P+6E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 120 GB/s | Unified Memory: 32 GB | Geekbench6 ~3822 / 15031
- M5 | 3 nm (third-gen) | 10 (4P+6E) | GPU 10 | 16-core Neural | Memory Bandwidth: 153 GB/s | Unified Memory: up to 32 GB | Geekbench6 ~4133 / 15,437 (9-core sample)
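Purely from the Geekbench numbers in that list (noting that the M5 multi score is from a 9-core binned sample, so it understates the full chip), the generation-over-generation uplift works out to something like this:

    # Generation-over-generation uplift computed from the Geekbench 6 scores above.
    single = {"M1": 2346, "M2": 2586, "M3": 2965, "M4": 3822, "M5": 4133}
    multi  = {"M1": 8346, "M2": 9672, "M3": 11565, "M4": 15031, "M5": 15437}  # M5: 9-core sample

    chips = list(single)
    for prev, cur in zip(chips, chips[1:]):
        s = single[cur] / single[prev] - 1
        m = multi[cur] / multi[prev] - 1
        print(f"{prev} -> {cur}: single {s:+.0%}, multi {m:+.0%}")

That lands in the roughly 8-30% per generation range for single core, with M3 to M4 being the big outlier.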
It just never sweats AT ALL - it feels like a decade from obsolescence based on what I'm doing now.
It would have to be an order of magnitude faster for me to even notice at this point.
That doesn’t make it obsolete, at all.
And then there's 3rd party software that will stop supporting that old OS version, in part because Apple's dev tools make that difficult.
Eventually, Apple's own services will stop supporting that OS - no convenient iCloud support.
Finally, the root CA certs bundled with the OS will become too out of date to use.
I'm planning on putting Linux on my Intel Mac Mini soon. But when a M3+ Mini goes out of support, will we have that option?
With a debloated Windows 10 (which we're not going to connect to the internet anyway) they can live on for older games.
Your points are valid but it’s not 2 years, it’s more than that for big vulnerabilities.
Has it had one since macOS 26 came out? They usually do 2 versions behind - in the summer, that was macOS 13, but now it's macOS 14.
Edit: I know Mac OS X is a Unix and Linux is technically a clone, however, of the two, Linux & GNU is a much better environment to learn in.
15 years old is just old and has too little ram
A 15 year old device can be still as capable as a raspberry pi and those work fine now for modern computing
Just copy the LiveCD image onto a USB stick, insert, boot holding down the Option key, and you can try it without actually installing it (i.e. leaving your MacOS untouched).
Hopefully there will be some insight from what Asahi Linux is doing or has in store and what might be transferrable.
The additional cooling in them seems quite helpful to their performance compared to the same chip in a laptop.
“Per his findings, Chrome used 290MB of RAM per open tab, while Safari only used 12MB of RAM per open tab.”
https://www.macrumors.com/2021/02/20/chrome-safari-ram-test/
The notch is bigger than it should be for sure, I would've loved for it to be narrower. But I don't really mind the trade-off it represents.
You could add half an inch of screen bezel and make the machine bigger, just to fit the web cam. Or you could remove half an inch of screen, essentially making the "notch" stretch across the whole top of the laptop. Or you could find some compromise place to put the camera, like those Dell laptops which put the camera near the hinge. Or you can let the screen fill the whole lid of the laptop, with a cut-out for the camera, and design the GUI such that the menu bar fills the part of the screen that's interrupted by the notch.
I personally don't mind that last option. For my needs, it might very well be the best alternative. If I needed a bigger below-the-notch area, I could get the 16" option instead of the 14" option.
Maybe it's a patent thing.
I use my webcam enough these days to take part in video meetings that it'd be a pretty big problem for me.
Snapdragon X Elite 2 processor will be out next year for the refreshed model
The weight is the big one for me - only 2.5 lbs vs 3.4 lbs
Remember the Dell has an 18 month old processor, X Elite 2 coming out next year.
Source for all these stats: https://nanoreview.net/en/laptop-compare/dell-xps-13-9345-20...
Companies tried that. You get very strange-looking up-your-nose pictures.
OLED screens are inherently transparent, there is just a light-emitting layer in them. You put your camera behind the screen, and either make the few pixels on top of the lens go black when it's on, or you use a lot of software to remove the light that comes from the screen and clean up the picture.
Feels like for a laptop it would be durable enough and would also fulfill the "webcam is physically blocked when off" requirement.
In the end they reverted because they were not willing to make it optional. They also never released a touch bar keyboard for desktop, which would have made it more useful perhaps
As for the dongle issue, that went away when I upgraded to a USB-C monitor at home and USB-C equipment at work. I can dock to a monitor or plug into a projector to give a presentation and charge with the same cable. At this point I don't want an HDMI port, and I'm kind of sad that the next laptop will probably have a dedicated charging cable.
The most common ports I need are roughly: 1. USB-C; 2. HDMI; 3. USB-A; 4. second USB-C; 5. third USB-C; 6. second USB-A; 7. DisplayPort; 8. fourth USB-C.
I also used to use the Touch Bar for a status display for things like tests, it was honestly great. Do not miss the battery life and performance compared to my subsequent Apple Silicon laptops, but definitely miss the keyboard.
OLED is much better than other display technology, and they’ve done other OLED screen devices. It would be quite surprising to see them screw this up—not impossible, sure. They could screw up some other design element for example. But, it would be somewhat surprising, right? And OLED is a big change so maybe they won’t also feel the need to mess with other stuff.
Font rendering, hard to say, I think it’s just preference.
Terminals look very nice with actual-black backgrounds.
My past monitors have lasted me 5-7 years in the past, and I only upgraded for size (once) and gsync (also once).
I don't want to be forced to buy another one just because of burn-in.
Garbage design
I'm guessing the m5 pro may support 64GB but...
* I want it.
* I have met all my other financial obligations.
* I do not have to go into debt for it.
* QED
Local LLMs.
Update: I am thinking the 24GB for M5 is a typo. I see on Apple's site the 14 inch MBP can be configured optionally with 32GB of RAM.
If anyone has any real clues that they can share pseudonymously, that would be great. Not sure which department drove that change.
There may be a technical explanation for it, but incentives are incentives.
> E.g they found that most people buying 64GB ram do also buy the upgraded processor.
It seems like the way they've divided them, there's at least one more SKU than there would otherwise be, because of that base M4 Max with only 36GB of RAM (you can't get it with 24, 48, 64, or 96), so if you want the extra few cores, you now have to go to the maxed-out Max to get any more RAM.
It took me a while to commit to the purchase, because I felt like an idiot implicitly telling them I'm okay with that BS pricing ladder, but at least I didn't overextend and go for the Max. They already charge comically too much for RAM and storage.
Pulled shenanigans wrt TPM requirements for Windows 10 and 11. Actively trying to make sure people login to a Microsoft Account and making it hard to use Local Accounts.
> Mac developers are all too used of having to constantly keep up with whatever crap Apple has changed and moved around this time.
Mmm...
Win16 API
Win32 API (including variants like GoodLuckSystemCallExExEx2W(...))
MFC
ATL
.NET WinForms
.NET Avalon/WPF
Silverlight
MAUI
...The thing with all the mentioned APIs is that, excluding 16 bit stuff (that got yeeted in Win7 x64, but if you did need it you could run W7 x32), you can still run software using them without too much of a hassle and you most probably can compile it if you need to fix a bug.
Good luck trying to get a Mac game from the 90s running on any Mac natively without an emulator/VM in contrast.
.NET technologies... Yeah, MS dropped the ball there.
I'm sure it's possible to do that, but the backwards compatibility on Windows is definitely not as good as you say.
That said, I'm also currently, as a fun personal project, converting a game originally intended to work on 68k Macs and which still has parts explicitly labelled as for resource forks, and I've lived through (and done work on) 68k, PPC, Intel, and M-series hardware, plus all the software changes, so I agree with you about Apple.
No piece of Mac software anyone bought in the late PPC era can even run (!) natively on a modern Mac at all, and even early Intel Mac software will not run on the last Intel generation, ever since macOS dropped 32-bit support in userspace entirely. You need to pay the developers for a new version; that's obsolescence by definition. I'm particularly still pissed about the 32-bit removal, as it also killed off WINE running 32-bit apps which, you can probably guess, include many games that never got a 64-bit Windows binary because they were developed long before Windows x64 became mainstream (or came into existence).
I do love Apple for high quality hardware, but I'll stick the finger to them till the day I die for killing off WINE during the Intel era for no good reason at all.
> You need to pay the developers for a new version, that's obsolescence by definition
Sure, but you don't have to pay Apple.
The entire point of the idea of planned obsolescence is companies intentionally making their products last less time than they should, so you have to pay that company more money.
This is a company making it so you might have to pay other companies more money, because backwards compatibility isn't a priority for them. You can be annoyed by that, sure, but it is not the same thing, and is not obviously corrupt like planned obsolescence is.
I'm just asking the question.. ;-)
Our family iPad Pro is older than my 8-year old son, and still gets security patches. My wife’s phone is an XS Max, launched in 2018; iOS 26 is the first release that doesn’t support it - it will continue to receive security patches for the foreseeable future. My son’s school laptop is my old 8gb 2020 M1 Air, which continues to have stellar performance and battery life and could run Tahoe if I was crazy enough to want to upgrade it. My work machine is a 2021 M1 Pro that runs just as great as the day I bought it (thanks, Al Dente!). My 3 Apple TV 4Ks are I-have-no-idea-how-old but they are still being updated and just get out of the way like a TV box should.
I have no particular love for Apple (or any other company), but they’ve always treated me well as a customer. I can’t really think of another tech co that seems to make people as irrationally angry. Is it their marketing? I hate their marketing too. But their products and support are great.
Apple is designing and manufacturing a chip/chipset/system with 32GB with integrated memory. During QA, parts that have one non-conformant 8GB internal module out of the four are reused in a cheaper (but still functional) 24GB product line rather than thrown away.
Market segmentation also has its hand in how the final products are priced and sold, but my strong guess is that, if Apple could produce 32GB systems with perfect yield, they would, and the 24GB system would not exist.
Ah, the memory is integrated in the same package (the "chip" that gets soldered onto the motherboard) as the integrated CPU/GPU, and I had understood that correctly. However, I had incorrectly surmised that it was built into the same silicon die.
Thanks for the correction!
Lesson: TIL about the difference between System-In-a-Package (SIP) and System-On-a-Chip, and how I had misunderstood the Apple Silicon M series processors to be SoCs when they're SiPs.
[1] https://www.apple.com/newsroom/2023/06/apple-introduces-m2-u...
The fastest multicore CPUs are the ones with a lot of cores, e.g. 64+ core Threadrippers. These have approximately the same single-core performance as everything else from the same generation because single-core performance isn't affected much by number of cores or TDP, and they use the same cores.
Everyone also uses Geekbench to compare things to Apple CPUs but the latest Geekbench multi-core is trash: https://dev.to/dkechag/how-geekbench-6-multicore-is-broken-b...
For the first 8 threads or so, it's fine. Once you hit 20 or so it's questionable, or at least that's my impression.
As it is, it's just clearly misleading to people that haven't somehow figured out that it's not really a great test of multithreaded throughput.
The only “evidence” they give that GB6 is “trash” is that it doesn’t show increasing performance with more and more cores with certain tests. The obvious rejoinder is that GB6 is working perfectly well in testing that use case and those high core processors do not provide any benefit in that scenario.
If you’re going to use synthetic benchmarks it’s important to use the one that reflects your actual use case. Sounds like GB6 is a good general purpose benchmark for most people. It doesn’t make any sense for server use, maybe it also isn’t useful for other use cases but GB6 isn’t trash.
The problem with this rejoinder is, of course, that you are then testing applications that don't use more cores while calling it a "multi-core" test. That's the purpose of the single core test.
Meanwhile "most consumer programs" do use multiple cores, especially the ones you'd actually be waiting on. 7zip, encryption, Blender, video and photo editing, code compiles, etc. all use many cores. Even the demon scourge JavaScript has had thread pools for a while now and on top of that browsers give each tab its own process.
It also ignores how people actually use computers. You're listening to music with 30 browser tabs open while playing a video game and the OS is doing updates in the background. Even if the game would only use 6 cores by itself, that's not what's happening.
There are examples of programs that aren't totally parallel or serial; they'll scale to maybe 6 cores on a 32-core machine. But there's so much variation in that, idk how you'd pick the right amount of sharing, so the only reasonable thing to test is something embarrassingly parallel or close. Geekbench 6's scaling curve is way too flat.
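For what it's worth, that "scales to maybe 6 cores" behaviour falls straight out of Amdahl's law. A toy sketch (not Geekbench's actual methodology, just illustrative parallel fractions):

    # Toy Amdahl's-law curves: speedup vs. core count for different parallel fractions.
    def speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for p in (0.50, 0.85, 0.99):   # fraction of the work that actually parallelizes
        row = ", ".join(f"{c} cores: {speedup(p, c):4.1f}x" for c in (2, 8, 32, 64))
        print(f"parallel {p:.0%} -> {row}")

    # An 85%-parallel workload plateaus around ~6x no matter how many cores you add,
    # while an embarrassingly parallel (99%+) job keeps scaling with core count.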
The purpose of a multi-core benchmark is that if you throw a lot of threads at something, it can move where the bottleneck is. With one thread neither a desktop nor HEDT processor is limited by memory bandwidth, with max threads maybe the first one is and the second one isn't. With one thread everything is running at the boost clock, with max threads everything may be running at the base clock. So the point of distinguishing them is that you want to see to what extent a particular chip stumbles when it's fully maxed out.
But tanking the performance with shared state will load up the chip without getting anything in return, which isn't even representative of the real workloads that use an in-between number of threads. The 6-thread consumer app isn't burning max threads on useless lock contention, it just only has 6 active threads. If you have something with 32 cores and 64 threads and it has a 5GHz boost clock and a 2GHz base clock, it's going to be running near the boost clock if you only put 6 threads on it.
It's basically measuring the performance you'd get from a small number of active threads at the level of resource contention you'd have when using all the threads, which is the thing that almost never happens in real-world cases because they're typically alternatives to each other rather than things that happen at the same time.
It's the only M5 device that leaked to the public early.
Though something ended up freezing on boot after random updates: either Fedora itself, how it's built for Asahi, or just running it with little disk space. Twice, once without even rpmfusion enabled. Either some weird btrfs issue or I don't know what.
I've been a Linux guy for two decades and don't do anything fancy, so this is weird. Switched to Asahi Ubuntu on ext4 and it's working great so far.
Much less active than it used to be when it was run by Hector Martin. The core development is a lot slower. Although the graphics stack, for instance, has reached a very mature state recently.
> Is it ready as a daily driver?
It depends. Only M1 and M2 devices are reasonably well-supported. There is no support for power-efficient sleep, Display Port, Thunderbolt, video decoding or encoding, touch ID. The speakers overheat and turn off momentarily when playing loud for a longer period of time. The audio stack in general had to be built from ground up and it seems to me like there are bits and pieces still missing or configured sub-optimally.
> Is it getting support from Apple?
Not that I am aware of.
> are they (Apple) hostile to it?
Not to my knowledge.
> Are there missing features?
Plenty, as described above. There has been some work done recently on Thunderbolt / Display Port. Quite a few other features are listed as WIP on their feature support page.
> Can I run KDE on it?
Of course. KDE Plasma on Fedora is Asahi Linux's "flagship" desktop environment.
The argument was originally about merging some Rust code into some parts of the Linux kernel if I remember correctly. It did not involve Linus Torvalds directly. Rather, the respective maintainers of those specific parts were unwilling to merge some Rust code, mostly because they did not know Rust well and they did not want to acquire the responsibility to maintain such code.
> There is no support for power-efficient sleep
"power-efficient sleep" refers to discharging 1-2% battery over night rather than 10-20%. I.e. there's room for improvement, but the device can still be used without worrying much about battery life regardless (especially given how far a full charge gets you even without sleep).
> Display Port, Thunderbolt
Big item indeed, but it's actively worked on and getting there (as you mentioned).
> video decoding or encoding
Hurts battery performance, but otherwise I never noticed any other effect. YMMV for 4K content.
> touch ID
Annoying indeed, and no one has worked on this AFAIK.
> The speakers overheat and turn off momentarily when playing loud for a longer period of time. The audio stack in general had to be built from ground up and it seems to me like there are bits and pieces still missing or configured sub-optimally.
Sad to hear since I thought the audio heat model was robust enough to handle all supported devices. On my M1 Air I've never seen anything like this, but perhaps devices with more powerful speakers are more prone to it?
My experience is also based on a M1 Macbook Air. I have repeatedly experienced sudden muting of the speakers for a second or two while playing conversations on a high volume.
I only assume it is caused by thermal management of the speakers but I did not actually verify it.
+------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
| Chip | Process | CPU Cores | GPU | Neural Engine | Memory Bandwidth | Unified Memory | Geekbench6 (Single/Multi) |
+------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
| M1 | 5 nm | 8 (4P+4E) | 7–8 | 16-core Neural | 68.25 GB/s | 16 GB | ~2346 / 8346 |
| M2 | 5 nm (G2) | 8 (4P+4E) | 8–10 | 16-core Neural | 100 GB/s | 24 GB | ~2586 / 9672 |
| M3 | 3 nm (first-gen) | 8 (4P+4E) | 8–10 | 16-core Neural | 100 GB/s | 24 GB | ~2965 / 11565 |
| M4 | 3 nm (second-gen)| 10 (4P+6E) | 8–10 | 16-core Neural | 120 GB/s | 32 GB | ~3822 / 15031 |
| M5 | 3 nm (third-gen) | 10 (4P+6E) | 10 | 16-core Neural | 153 GB/s | up to 32 GB | ~4133 / 15437 (9-core) |
+------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+

Chip | Process | CPU Cores | GPU  | Neural  | Memory Bandwidth | Unified Memory | Geekbench6 Single / Multi
-----|---------|-----------|------|---------|------------------|----------------|--------------------------
M1   | 5 nm G1 | 8: 4P+4E  | 7–8  | 16-core | 68.25 GB/s       | 16 GB          | 2346 / 8346
M2   | 5 nm G2 | 8: 4P+4E  | 8–10 | 16-core | 100 GB/s         | 24 GB          | 2586 / 9672
M3   | 3 nm G1 | 8: 4P+4E  | 8–10 | 16-core | 100 GB/s         | 24 GB          | 2965 / 11565
M4   | 3 nm G2 | 10: 4P+6E | 8–10 | 16-core | 120 GB/s         | 32 GB          | 3822 / 15031
M5   | 3 nm G3 | 10: 4P+6E | 10   | 16-core | 153 GB/s         | ≤32 GB         | 4133 / 15437 (9 core)

Maybe they'll finally turn it on for Markdown's 25th anniversary in a few years? A man can dream...
Chip | Process | CPU | GPU | Neural | Memory | Unified | Geekbench6
| | Cores | | Engine | Bandwidth | Memory | Single / Multi
---------|----------|-------------|---------|---------|------------|---------|----------------------
M1 | 5 nm G1 | 8: 4P+4E | 7–8 | 16-core | 68.25 GB/s | 16 GB | 2346 / 8346
M2 | 5 nm G2 | 8: 4P+4E | 8–10 | 16-core | 100 GB/s | 24 GB | 2586 / 9672
M3 | 3 nm G1 | 8: 4P+4E | 8–10 | 16-core | 100 GB/s | 24 GB | 2965 / 11565
M4 | 3 nm G2 | 10: 4P+6E | 8–10 | 16-core | 120 GB/s | 32 GB | 3822 / 15031
M5 | 3 nm G3 | 10: 4P+6E | 10 | 16-core | 153 GB/s | ≤32 GB | 4133 / 15437 (9 core)
M4 Pro | 3 nm G2 | 14: 10P+4E | 16–20 | 16-core | 273 GB/s | 64 GB | 3925 / 22669
M4 Max   | 3 nm G2  | 16: 12P+4E  | 32–40   | 16-core | 546 GB/s   | 128 GB  | 4060 / 26675

For Context:
M1 Ultra (Mac Studio): 18,405
M3 Pro (14-inch MacBook Pro): 15,257
the boost seems mainly due to higher memory bandwidth and slightly different architecture.
an economist could probably tell me why apportioning some of that money to a game port budget isn't valuable. Game Pass seems ripe to be undercut too
If you could yank the screen out, it probably evens out :)
I have seen quite a few such announcements from competitors that tend to be so close that I wonder if they have some competitor analysis to precede the Goliath by a few days (like Google vs rest, Apple vs rest etc).
Then there is the whole ARM vs x86 issue. Even if a compatible Linux distro were made, I expect to run all kinds of software on my desktop rig including games, and ARM is still a dead end for that. For laptops, it's probably a sensible choice now, but we're still far from truly free and usable ARM desktop.
They run Linux actually very well; have you ever tried Parallels or VMware Fusion? Parallels especially ships with good software drivers for 2D/3D/video acceleration, suspend, and integration into the host OS. If that is not your thing, the new native container solution in Tahoe can run containers from Docker Hub and co.
> I simply don't want to live in Apple's walled garden.
And what walled garden would that be on macOS? You can install what you want, and there is homebrew at your fingertips with all the open and non-open software you can ask for.
Windows - because I needed it for a single application.
Linux - has been extremely useful as a complement to the small ARM SBCs that I run. eg: Compiling a kernel is much faster there than on (say) a Raspberry Pi. Also, USB device sharing makes working with vfat/ext4 filesystems on small memory cards a breeze.
What is hard about this?
It's pretty simple to keep these two things separate, like everywhere else in the present and history of the industry.
Also, what if I want to run eBPF on my laptop on bare metal, to escape the hypercall overhead from VMs or whatever? Ultimately, a VM is not the same as a native experience. I might want to take advantage of acceleration for peripherals that aren't available unless I'm bare metal.
As in: "I can't run iOS on my macOS installation, so I am going to use a different OS where I can't run iOS either".
I switched from pixel to iPhone in large part because pixel removed the rear fingerprint reader, headphone jack, and a UI shortcut I used multiple times a day. It’s not like the iPhone had those things but now neither did the pixel.
I mean, as long as Wirth's law does not bite too hard.
I just want a Linux-like system that is not painful to use, and Apple's is the closest thing that worked for me without resorting to last-ditch efforts like sacrificing virgin maidens or newborn kittens on top of my Dell machine... and Apple provides one that just works... reliably.
Does anyone know if we're still on pace with Moore's law?
M1 16 billion transistors
M5 28 billion transistors
so that would be more like a doubling every six years (16B to 28B over roughly five years) rather than every two.
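Working that out from the two data points (transistor counts as quoted above; the dates are my assumption of roughly late 2020 for M1 and late 2025 for M5):

    # Implied transistor-count doubling time between M1 and M5.
    import math

    t1, n1 = 2020.9, 16e9   # M1, ~16 billion transistors
    t2, n2 = 2025.8, 28e9   # M5, ~28 billion transistors

    doubling_years = (t2 - t1) * math.log(2) / math.log(n2 / n1)
    print(f"~{doubling_years:.1f} years per doubling")   # about 6 years, vs. Moore's original 2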
That said there's a chart in Wikipedia showing it still going on https://upload.wikimedia.org/wikipedia/commons/c/cc/The_Moor...
but that's calculations per second per dollar rather than transistors per chip like Moore.
Moore came up with the law in 1965 and thought it would run 10 years till 1975, so it's had a good run if it's petering out now.
The compute per sec per dollar is a longer trend ~1900 that will likely keep on.
Gemini thinks: "The machine that began the long-term trend often cited as "128 years of Moore's Law" was Herman Hollerith's tabulating machine, created for the 1890 U.S. Census"
Still super interesting architecture with accelerators in each GPU core _and_ a dedicated neural engine. Any links to software documentation for how to leverage both together, or when to leverage one vs the other?
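I haven't found a single document that covers both either. The closest thing I know of on the software side is Core ML's compute-unit selection, which at least lets you steer a model toward the Neural Engine or the GPU and compare. A minimal coremltools sketch, where "model.mlpackage" is just a placeholder for a converted model:

    # Minimal coremltools sketch: choose which compute units a Core ML model may use.
    import coremltools as ct

    # Prefer the Neural Engine (Core ML falls back to CPU for unsupported ops):
    ne_model = ct.models.MLModel("model.mlpackage",
                                 compute_units=ct.ComputeUnit.CPU_AND_NE)

    # Prefer the GPU, presumably where the new per-core neural accelerators come into play:
    gpu_model = ct.models.MLModel("model.mlpackage",
                                  compute_units=ct.ComputeUnit.CPU_AND_GPU)

    # Or let Core ML schedule across CPU, GPU, and Neural Engine as it sees fit:
    any_model = ct.models.MLModel("model.mlpackage",
                                  compute_units=ct.ComputeUnit.ALL)

Metal's tensor APIs and MLX are the other obvious places to look for the GPU-side accelerators, but I haven't seen clear guidance on when to pick one over the other.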
And it ruins battery life.
For coding it's on par with GPT3 at best which is amateur tier these days.
It's good for text to speech and speech to text but PCs can do that too.
Which I know is almost a lie, since it's quite efficient, but if you really hit the SoC hard you are still getting around 3 hrs of battery life at most. Of course, that's better than the 1.5 hrs you would get at best from an efficient x86 SoC, but it makes the advantage not as dramatic as they make it out to be. You are going to need a power source either way, just later, so it's really just problem displacement.
But even if your numbers weren't pulled out of your ass, a 3hr vs 1.5hr difference is a *100%* improvement. In what multiverse is that not absolutely phenomenal?
And no battery powered device is going to last long running large AI models. How is that an ok thing to bash Apple about? Because they don't break the laws of physics?
Until then, I take a mini PC with me along with my M1 when I travel and use game streaming for gaming and offload dev and AI work via ssh + ssh remote tools.
To me, M5 has amazing hardware, but they put square wheels on a Ferrari
The MacBook Zero

Personally, I'm looking forward to M1 MacBook Pros dropping in price so I could nab one for cheap for running Asahi Linux.
I'm imagining the engineers responsible for running the tests finely tuning the test suite for days and days so they could get that number into the press release, lol. There's no way that's a coincidence and someone definitely advocated for that line being the way it is.
https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...
https://www.apple.com/macbook-pro/#footnote-4
So yes, that is compared to a very old 14 nm design, presumably the i7-8557U per Wikipedia.
It's not that the comparison is incorrect, just that it's a silly and unenlightening statement, bordering on completely devoid of meaning if it weren't for the x86 pun.
They could sell you a downgrade and still stay at 2x M1 Pro performance (last year it was 4x).
Apple is a marketing company made to sell stuff.
That's like... every company? Are you saying they don't have good tech?
Ya sure, you can say that every company must do that, but Apple is exceptional at it. Once you start noticing the unlabeled performance charts, the missing baselines, the comparisons with ages-old models, the disingenuous "86x" metrics, the whole show becomes cringeworthy.
I will, yes. If macOS supported Vulkan, then those Intel Macs would have GPU acceleration too, and thus it would be a fair fight comparing it to MPS. Apple's tech stack is so miserly and poor that they never supported the common GPGPU libraries that literally every single OEM is and was shipping.
Apple's tech is appalling. Are you saying they exercise good judgement on behalf of their users?
So saying their tech is "appealing" is a matter of opinion and I'd argue something a small minority of their users care about. But I don't know.
The MacBook Pro with the M5 is the low-end model? An M2 Ultra is better than the M5?
I understand what they're doing from a roadmap standpoint, but as a pure consumer it's a bit confusing.
> Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
But if your workload belongs on 8 H100 GPUs then there isn't much point in trying to run it on a macbook. You'd be better served by renting them by the hour, or if you have a quarter million dollars you can always just purchase them outright.
The H100 is just an example, this is true for any workload that doesn't fit on a laptop.
Unless you close the lid on a small grain of sand or some similarly small, hard particle, at which point the screen goes black and costs nearly as much to replace as the 1 year old computer is worth. Ask me how I know. :’(
When will M5 Pro and Max be released? If it doesn't happen later this week, how long would the wait be? A few months? More?
The MBP 14 M5 release came a bit unexpectedly; many analysts had pointed to the beginning of 2026. Are they trying to milk the market in small increments, especially before Christmas?
What are your thoughts on comparing the M4 Pro against the base M5?
"M5 is Apple’s next-generation system on a chip built for AI, resulting in a faster, more efficient, and more capable chip for the 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro."
I can easily imagine companies running Mac Studios in prod. Apple should release another Xserve.
I highly recommend Andrej Karpathy's videos if you want to learn details.
So if you have a 7B-parameter model with 16-bit weights, that's about 14 GB that has to be read from memory for every token generated. If you only have 153 GB/s of memory bandwidth, that caps you at roughly 11 tokens/sec, regardless of how much processing power you have.
You can of course quantize to 8-bit or even 4-bit, or use a smaller model, but doing so makes your model dumber. There's a trade-off between performance and capability.
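A rough sketch of that bandwidth math in Python, using the same numbers as above; real-world throughput will be lower once KV-cache reads and other overhead are counted:

    def max_tokens_per_sec(params_billion, bits_per_weight, bandwidth_gb_s):
        """Upper bound on decode speed when every token must stream all weights."""
        weights_gb = params_billion * bits_per_weight / 8
        return bandwidth_gb_s / weights_gb

    print(max_tokens_per_sec(7, 16, 153))   # 7B @ 16-bit -> ~10.9 tok/s
    print(max_tokens_per_sec(7, 4, 153))    # 7B @ 4-bit  -> ~43.7 tok/s, dumber model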
Models like Qwen 3 30B-A3B and GPT-OSS 20B, both quite decent, should be able to run at 30+ tokens/sec at typical (4-bit) quantizations.
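Those MoE numbers line up with the same back-of-the-envelope bandwidth math, since only the active experts' weights get read per token (this assumes ~3B active parameters for Qwen 3 30B-A3B and the 153 GB/s figure mentioned earlier):

    # MoE models only stream the active experts' weights each token,
    # so the bandwidth ceiling uses active params, not total params.
    active_weights_gb = 3e9 * 4 / 8 / 1e9   # ~3B active params at 4-bit = 1.5 GB
    print(153 / active_weights_gb)          # ~102 tok/s ceiling, so 30+ is plausible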
Neither product actually qualifies for the task IMO, and that doesn't change just because two companies advertised them as such instead of just one. The absolute highest end Apple Silicon variants tend to be a bit more reasonable, but the price advantage goes out the window too.
It would take about 48 channels of LPDDR5X-9600 to match a 3090's memory bandwidth, so the situation is unlikely to change for a couple of years, until DDR6 arrives, I guess.
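A quick sanity check on that channel count, assuming 16-bit LPDDR5X channels and the 3090's 384-bit GDDR6X bus at 19.5 Gbps per pin:

    gpu_bw = 384 / 8 * 19.5          # RTX 3090: ~936 GB/s
    channel_bw = 9600e6 * 2 / 1e9    # one 16-bit LPDDR5X-9600 channel: 19.2 GB/s
    print(gpu_bw / channel_bw)       # ~48.75 channels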