Also just noticed this:
"And now with M5, the new 14-inch MacBook Pro and iPad Pro benefit from dramatically accelerated processing for AI-driven workflows, such as running diffusion models in apps like Draw Things, or running large language models locally using platforms like webAI."
First time I've ever heard of webAI - I wonder how they got themselves that mention?
I wondered the same. Went to Crunchbase and found out Crunchbase is now fully paywalled (!), well, saw that coming... Anyway, hit the webAI blog; apparently they were showcased at the M4 MacBook Air event in 2024 [1] [2]:
> During a demonstration, a 15-inch Air ran a webAI’s 22 billion parameter Companion large language model, rendered a 4K image using the Blender app, opened several productivity apps, and ran the game Wuthering Waves without any kind of slowdown.
My guess is this was the best LLM use case Apple could dig up for their local-first AI strategy. And Apple Silicon is the best hardware use case webAI could dig up for their local-first AI strategy. As for Apple, other examples would look too hacky, purely dev-oriented, and dependent on LLM behemoths from the US or China. I.e. "try your brand-new performant M5 chip with LM Studio loaded with China's DeepSeek or Meta's Llama" is an Apple exec no-go.
1. https://www.webai.com/blog/why-apples-m4-macbook-air-is-a-mi...
2. https://finance.yahoo.com/news/apple-updates-bestselling-mac...
Now that they own the SoC design pipeline, they’re really able to flex these muscles.
Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.
They could never provide decent cooling for the chips, given their desire to make paper-thin devices.
Curiously, they managed to figure this out exactly when it became their own silicon (the M1 MacBook Pros were notably thicker, with more cooling capacity, than the outgoing Intel ones).
And this would eventually evolve into MacOS.
That is probably the least of reasons why people buy Apple - to many it's just a status symbol, and the OS is a secondary consideration.
EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:
https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...
Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.
Their services revenue is $80B with $60B gross margin.
https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...
Look, I totally understand making an off-hand comment like you did based on a gut feeling. Nobody can fact-check everything they write, and everyone is wrong sometimes. But it is pretty lazy to demand a source when you were just making things up. When challenged with specific and verifiable numbers, you should have checked the single obvious source for the financials of any public company: their quarterly statements.
Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.
Watch this and maybe you might change your mind:
Modern Apple is also quite a bit more integrated. A company designing their own highly competitive CPUs is more hardware-oriented than one that gets their CPUs off the shelf from Intel.
Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.
We should be comparing profit on those departments not revenue. Do you have those figures?
It is well known that companies often sell the physical devices at a loss in order to make the real money from the services on top.
Apple is and always has been a HW company first.
Steve Jobs consistently made the point that Apple's hardware is the same as everyone else's; what makes them different is that they make the best software, which enables the best user experience.
Here see this quote from Steve Jobs which shows that his attitude is the complete opposite of what you wrote.
The above link is a video where he mentions that.
It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he said in a different interview I can’t find right now so I will paraphrase) is that he felt Apple made the best hardware and software in the world so he wanted Apple’s customers to experience the best software on the best hardware possible which he felt only Apple could provide. (I wish I could find the exact quote.)
Anyway according to Steve Jobs Apple is a software first company.
It was coherent, (relatively) bug free, and lacked the idiot level iOSification and nagging that is creeping in all over MacOS today.
I haven't had to restart Finder until recently, but now even that has trouble with things like network drives.
I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.
It shouldn't surprise you I think that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.
Programs absolutely could have had much more controllable auto-save before, for when it made sense.
Speaking of security it didn't have app sandboxing either.
This is what I mean about iOSification - it's trending towards being a non serious OS. Linux gets more attractive by the day, and it really is the absence of proper support of hardware in the class of the M series that prevents a critical mass of devs jumping ship.
Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going through to boycott/battle the enshittification.
At least it's open source and free, I guess.
Adding extra features that aren't necessarily needed is enshittification, and very not-unix.
That would be the end of open source, hobbyists and startup companies because you'd have to pay up just to have a basic C library (or hope some companies would have reasonable licensing and support fees).
Remember one of the first GNU projects was GCC because a compiler was an expensive, optional piece of software on the UNIX systems in those days.
It's not even about open source or closed source at this point. It's about feature creep.
Why parse whatever is in the logs, at all?
Imagine the same stuff in your SSH client: it would parse the content before sending it over because some functionality requires it to talk to a server somewhere. It's insanity.
> Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.
https://daringfireball.net/2009/11/the_os_opportunity
At the time I'd only been a Mac user for a few years and I would have strongly agreed. But definitely things have shifted— I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than I am the software, and it's not even really close.
It's a server or developer box first and a non-technical user second.
On Linux there is variety and choice, which some folks dislike.
But on the Mac I get whatever Apple gives me, and that is often subject to the limitations of corporate attention spans and development budgets.
And arbitrary turf wars like their war against web apis/apps causing more friction for devs and end users.
Should Emacs and Vim both be called "Editor" then?
To me, this is actually a great example of the problems with Linux as a community, that GUI applications seem to just be treated as placeholders (e.g., all word processors are the same?), but then it's inconsistent by celebrating the unique differences between editors like Vim and Emacs. Photoshop, Excel, Logic Pro, Final Cut Pro are, in my opinion, crown jewels of what we've accomplished in computing, and by extension some of the greatest creations of the human race, democratizing tasks that in some cases would have cost millions of dollars before (e.g., a recording studio in your home). Relegating these to generic names like "spreadsheet" makes them sound interchangeable, when in my opinion they're each individual creations of great beauty that should wear their names with pride. They've helped improve the trajectory of the human race by facilitating many individuals to perform actions they never would have had the resources to do otherwise.
If I close my laptop for a few days, I don't want significant battery drain. If I don't use it for two weeks, I want it to still have life left. And I don't want to write tens of gigabytes to disk every time I close the lid, either!
If you're talking about hardware interaction from the command line, that's very different and I don't think there's a fix.
I want good window management. Linux gives me a huge number of options. MacOS - not as much.
One can just hand wave "Apple must support Linux and all" but that is not going to get anything done.
Edit: Hard to call intentionally preventing support for web apis a power user thing. This creates more friction for basic users trying to use any web app.
Edit2: lol Apple PR must be all over this, went from +5 to -1 in a single refresh. Flagged for even criticizing what they intentionally break.
I understand that this post is about MacOS, but yes, we are forced to support Safari for iOS. Many of these corporate decisions to prevent web apps from functioning properly spill over from MacOS Safari to iOS Safari.
On iOS you cannot even keep a web app running in the background. The second the user multitasks, even with audio/microphone active, Apple kills it. Are they truly adding battery life, or are they cheating by creating restrictions that prevent apps from working?
Being able to conduct a voice call through the browser seems like a pretty basic use case to me.
For a simple example, no app remembers the last directory you were working in. The keys each app uses are completely inconsistent from app to app. And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor. Then there’s the Windows 95-style dialog boxes mixed in with the Windows 11-style dialog boxes; what a UI mess. I spoke with one vendor the other day who was actually proud they’d adopted a ribbon interface in their UI “just like Office” and I verbally laughed.
From a hardware perspective, I still don’t understand why Windows and laptop manufacturers can’t get sleep working right. My Intel MacBook Pro with an old battery still sleeps and wakes and lasts for several hours, while my new Windows laptop lasts about an hour and won’t wake from hibernate half the time without a hard reboot.
I think Windows is the “good enough” for most people.
While overall I may say MacOS is better, I would not say it's better in every way.
Believe it or not, I had a better experience with 3rd party window managers in Windows than on MacOS.
I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).
And for corporate work, the integration with Windows is much better than anything I've seen on MacOS.
Mac HW is great. The OS is in that uncanny valley where it's UNIX, but not as good as Linux.
Did you try Keyboard Maestro https://www.keyboardmaestro.com/main/ (I've never used AutoHotKey and I'd be super curious if there are deficiencies in KM relative to it, but Keyboard Maestro is, from my perspective, a masterpiece, it's hard to imagine it being any better.)
Also I think this statement needs a stronger defense given macOS includes Shortcuts, Automator, and AppleScript, I don't know much about Windows automation but I've never heard of them having something like AppleScript (that can say, migrate data between applications without using GUI scripting [e.g., iterate through open browser tabs and create todos from each of them operating directly on the application data rather than scripting the UI]).
So, Windows' saving grace is being able to run a different operating system inside it? Damning with faint praise if I ever heard it...
You can't, really. Almost everyone resorts to buying an HDMI dongle to fake a display. Apple solved the problem at such a low level, the flexibility to run something in clamshell mode is broken, even when using caffeine/amphetamine/etc etc etc.
So, tradeoffs. They made their laptops go to sleep very well, but broke functionality in the process. You can argue it's a good tradeoff, just acknowledge that there WAS a tradeoff made.
Oh god, I'm going to have to bite the bullet and switch to 11, huh?
The one thing that has been saving me from throwing my PC out the window in rage has been the monitor I have that supports a "keep alive" mode where switching inputs is transparent to the computers connected to it. So when switching inputs between my PC and laptop neither one thinks the monitor is being disconnected/reconnected. If it wasn't for that, I'd be screaming "WHY ARE YOU MOVING ALL MY WINDOWS?" on a regular basis. (Seriously, why are you moving all my windows? Sure, if they're on the display that was just disconnected, I get you. But when I connect a new display, Windows 10 seems to throw a dart at the display space for every window and shuffle them to new locations. Windows that live in a specific place on a specific display 100% of the time just fly around for no reason. Please god just stop.)
A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail). A number of others were affected by the same issue. There have been show-stopper bugs in the core functionality of Photos as well. I don't get the impression that the basics are Apple's focus with respect to software.
But I’ve certainly never struggled with getting WiFi to work on a Mac, or struggled with getting it to sleep/wake, or a host of other problems you routinely have on both Windows and Linux.
It’s not even close.
To compare Apples to apples, you'd have to look at a Framework computer and agree that wifi is going to work out of the box... but here I'm meeting you on a much weaker argument: "Apple's software basics are /not/ rock solid, but other platforms have issues too"
I don't find your original anecdote convincing:
> A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail).
E.g., what does this mean? They lost mail messages? How did they verify they had those messages before and after? E.g., file-system operations? GUI search? How much do they know about how the Mail app stores messages (e.g., I used to try to understand this decades ago, but I expect today messages aren't even necessarily always stored locally)? How are you syncing mail messages, e.g., using native IMAP, or whatever Gmail uses, or Exchange? What's the email backend?
E.g., without deeper evidence this sounds more like a mail-message indexing issue rather than a mail-messages-stored-on-disk issue (in 2025, I'd personally have zero expectations about how Mail manages messages on disk, e.g., I'd expect local storage of messages to be dynamically managed, like most applications that aren't document-based, using a combination of cloud functionality and local caching, e.g., found this in a quick search https://apple.stackexchange.com/questions/471801/ensure-maco...), but if you have stronger evidence I'd love to hear it. As presented, though, you're extrapolating much stronger conclusions than are warranted by the anecdote, in my opinion.
I want to be able to set different networking options (manual DNS, etc) for different wifi networks, but as far as I can tell, I can only set them per network interface.
There's something like "locations" but last time I tried using that, the entire System Settings.app slowed to a crawl / beachballed until I managed to turn it back off.
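For what it's worth, the command line has the same granularity: `networksetup` takes a network service (i.e. the interface), not an SSID, so the usual workaround really is a Location per network. A minimal sketch, where the service name "Wi-Fi", the DNS addresses and the location name are just example values:

    # DNS is set per network service, not per SSID
    networksetup -setdnsservers "Wi-Fi" 1.1.1.1 9.9.9.9
    networksetup -getdnsservers "Wi-Fi"

    # The per-SSID workaround is a Location per network, switched by hand
    # (or by a script that watches the current SSID)
    networksetup -listlocations
    networksetup -createlocation "HomeDNS" populate
    networksetup -switchtolocation "HomeDNS"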
> or struggled with getting it to sleep/wake
My m1 MBP uses something like 3-5% of its battery per hour while sleeping, because something keeps waking it up. I tried some app that is designed to help you diagnose the issue but came up empty-handed.
... but yes on both counts, it's light years better than my last experience with Linux, even on hardware that's supposed to have fantastic support (thinkpads).
In my case it works roughly ~50% of the time. Probably because of the Thunderbolt monitor connected to power it, idk.
> the basics are still rock solid
The basics like the OS flat out refusing to provide you any debugging information on anything going wrong? It's rock solid, all right. I had an issue where occasionally I would get an error: "a USB device is using too much power, try unplugging it and replugging it." Which device? Why the hell would Apple tell you that, where is the fun in that?
Key remapping requires installing a keylogger, and you can't have different scroll directions for the mouse and the touchpad. There still isn't proper window management, which for the sizes of modern monitors is quite constraining.
> still has UNIX underneath
A very constrained UNIX. A couple of weeks ago I wanted to test something (pkcs11-tool signing with a software HSM), and turns out that Apple has decided that libraries can only be loaded from a number of authorised locations which can only be accessed while installing an application. You can't just use a dynamic library you're linking to, it has to be part of a wider install.
You can remap with config files: https://hidutil-generator.netlify.app
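If anyone is curious what that generator actually produces: as far as I can tell it just wraps Apple's `hidutil`, which takes raw HID usage IDs. A minimal sketch, mapping Caps Lock to Left Control (the usage IDs are from Apple's TN2450 remapping note; the mapping doesn't survive a reboot on its own, which is why the generator wraps it in a LaunchAgent):

    # Caps Lock (0x700000039) -> Left Control (0x7000000E0)
    hidutil property --set '{"UserKeyMapping":[
      {"HIDKeyboardModifierMappingSrc": 0x700000039,
       "HIDKeyboardModifierMappingDst": 0x7000000E0}
    ]}'

    # Clear the mapping again
    hidutil property --set '{"UserKeyMapping":[]}'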
Long story short, I was very happy with the "it just works" of ChromeOS, and only let down by the lack of support for some installed apps I truly needed in my personal life. I tried a Mac back in 2015 but couldn't get used to how different it was, and it felt very bulky compared to ChromeOS and much slower than the Linux machine I'd had, so I switched to a Pixelbook as was pretty content.
Fast forward to 2023 when I needed to purchase a new personal laptop. I'd bought my daughter a Pixelbook Go in 2021 and my son a Lenovo x1 Carbon at the same time. Windows was such a dumpster fire I absolutely ruled it out, and since I could run all the apps I needed on ChromeOS it was between Linux & Mac. I decided to try a Mac again, for both work & personal, and I've been a very happy convert ever since.
My M2 Pro has been rock solid, and although I regret choosing to upgrade to Sequoia recently, it still makes me feel better than using Windows. M4 Pro for work is amazingly performant and I still can't get over the battery efficiency. The nicest thing, imho, is that the platform has been around long enough for a mature & vibrant ecosystem of quality-of-life utilities to exist at this point, so even little niggles (like why do I need the Scroll Reverser app at all?) are easy to deal with, and all my media editing apps are natively available.
They got away with pushing ads, online and enterprise services, Copilot, etc. to every desktop user.
Not once in 10 years have I had to troubleshoot while she uses her personal macOS, but a Dell Latitude laptop in 2025 still can't just "open lid, work, close lid".
And it’s slower. And eats more battery.
Quoth the Tao of Programming:
8.4
Hardware met Software on the road to Changtse. Software said: "You are Yin and I am Yang. If we travel together, we will become famous and earn vast sums of money." And so they set forth together, thinking to conquer the world.
Presently, they met Firmware, who was dressed in tattered rags and hobbled along propped on a thorny stick. Firmware said to them: "The Tao lies beyond Yin and Yang. It is silent and still as a pool of water. It does not seek fame; therefore, nobody knows its presence. It does not seek fortune, for it is complete within itself. It exists beyond space and time."
Software and Hardware, ashamed, returned to their homes.
AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, which so many people were not in favor of just 5-7 years ago.
https://arstechnica.com/gadgets/2023/08/report-apple-is-savi...
Apple's chip engineering is top tier, but money also buys them a lot of advance.
Hardware is naturally limited in scope due to manufacturing costs, and doesn't "grow" in the same way. You replace features and components rather than constantly add to them.
Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.
macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?
(I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)
Furthermore, they do also engage in the traffic and sale of digital programmes wrought by the hands of other, independent artisans.
But this is the exception.
There aren't a lot of tangible gains left to be made by the software teams. The OS is fine, the office suite is fine, the entertainment apps are fine.
If "performance" is shoving AI crap into software that was already doing what I wanted it to do, I'd rather the devs take a vacation.
Who knows, maybe the era of "exciting computing" is over, and iteration will be a more pleasant and subtle gradient curve of improvements, over the earth-shattering announcements of yore (such as the advent of popular cellular phones).
Maybe Steve was right. We don't know what we want until someone shows it to us.
The UI itself is supposed to be intense to render to some degree. That's crazy because most of the time it looks like an Android skin from 2012.
And on top of all this -- absolutely nobody asked for this. No one asked for some silly new UI that is transparent or whatever.
Apple (post Apple II) has always been a systems company, which is much different. Dell is a hardware company.
Hopefully that will bring whatever they’re doing right to other teams.
Biggest grief with MacOS software:
- Finder is very mediocre compared to even File Explorer in Windows
- Scrollbar and other UI issues
Unfortunately I don't think Asahi is going to catch up, and MacBooks are so expensive, so I'll probably keep buying second-hand Dell/Lenovo laptops and dumping Linux on top of them.
I still agree that second hand Thinkpads are ridiculously better in terms of price/quality ratio, and also more environmentally sustainable.
But I could be wrong. Maybe the earlier Macs didn't have great software either -- but at least the UI is better.
I do miss window shading from MacOS 8 or 9, though. I think a whimsical skin for MacOS would be nice, too. The system error bomb icon was classic, the sad-Mac boot-failure icon was at least consolation. Now everything is cold and professional, but at least it stays out of my way and looks decent.
It really is awful. Why the hell is there no key to delete a file? Where's the "cut" option for moving a file? Why is there no option for showing ALL folders (ie, /bin, /etc) without having to memorize some esoteric key combination?
For fuck's sake, even my home directory is hidden by default.
> - Scrollbar and other UI issues
Disappearing scrollbars make sense on mobile where screen real estate is at a premium and people don't typically interact with them. It does not make sense on any screen that you'd use a mouse to navigate.
For years, you couldn't even disable mouse acceleration without either an esoteric command line or using 3rd party software. Even now, you can't disable scroll wheel acceleration. I hate that I can't just make a consistent "one click = ~2 lines of text" behavior.
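For the record, the esoteric command line is the old `defaults` trick, where a negative scaling value turns acceleration off (it generally needs a logout/login to take effect, and I can't vouch for it on the very latest releases):

    # Disable mouse acceleration (negative value = off), then log out and back in
    defaults write -g com.apple.mouse.scaling -1

    # Inspect the current value, or revert to the system-managed curve
    defaults read -g com.apple.mouse.scaling
    defaults delete -g com.apple.mouse.scaling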
I could go on and on about the just outright dumb decisions regarding UX in MacOS. So many things just don't make sense, and I feel like they were done for the sole purpose of being different from everyone else, rather than because of a sense of being better.
Command + backspace.
Cmd+delete? I don't really want it to be a single key as it's too easy to accidentally trigger (say I try to delete some text in a filename but accidentally bump my mouse and lose focus on the name)
MacOS doesn't have enough 'openness' to it. There's no debug information, a lack of tools, etc. To this day I could still daily drive an XP or 98/2000 machine (if they supported the modern web) because all the essentials are still intact. You can look around system files, you can customize them, edit them. I could modify game files to change their behaviour. I could modify the Windows registry in tons of ways to customize my experience and experiment with lots of things.
As a 'Pro' user my first expectation is options, options in everything I do , which MacOS lacks severely.
All the random hardware that we see launching from time to time have drivers for windows but not for Mac. Even linux has tons of terminal tools and customisation.
MacOS is like a glorified phone OS. It's weirdly locked down at certain places that drive you crazy. Tons of things do not have context menus(windows is filled with it).
Window management sucks, there's no device manager! Not even CLI tools! (Or maybe I'm not aware?) Why can't I simply cut and paste?
There's no API/way to control system elements via scripting; Windows and Linux are filled to the brim with these! Even though the UI is good looking, I just cannot switch to an Apple device (both Mac and iPhone) for these reasons. I bought an iPad Pro and I'm regretting it. There's no Termux equivalent on iPadOS/iOS; there are some terminal tools, but they can't use the full processing power, they can't multi-thread, they can't run in the background. It's just ridiculous. The iPad Pro is just a glorious iPhone. Hardware doesn't make a device 'Pro', software does. Video editing isn't a 'Pro' workflow in the sense that it can be done on any machine that has sufficient oomph. An iPad Pro from 5 years ago will be slower than an iPad Air of today; does that make the Air a 'Pro' device? No!
It's a bad idea to add an option entirely for the purpose of making the product not work anymore.
> Window management sucks
I'm always mystified reading these kinds of posts on HN because it literally always turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.
> there's no device manager! Not even cli tools!
`ioreg -l` or `system_profiler`. Why does this matter?
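It can answer things like the USB-power complaint elsewhere in the thread, too. A few example invocations (the data type names are the ones `system_profiler -listDataTypes` reports on a stock install):

    # List every available report type
    system_profiler -listDataTypes

    # USB tree, including per-device "Current Required (mA)" / "Current Available (mA)"
    system_profiler SPUSBDataType

    # PCIe, Thunderbolt and power/battery reports
    system_profiler SPPCIDataType SPThunderboltDataType SPPowerDataType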
> There's no API/way to control system elements via scripting
https://developer.apple.com/library/archive/documentation/Ac...
https://developer.apple.com/documentation/XCUIAutomation
Command+Backspace.
Apple's Hardware Chief, John Ternus, seems to be next in line for succession to Tim Cook's position.
I remember using iTunes when fixing the name of an album was a modal blocking function that had to write to each and every MP3, one by one, in the slowest write I have ever experienced in updating file metadata. Give me a magnetised needle and a steady hand and I could have done it faster.
A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.
Curious if I'm missing something though, is there another entity with a stronger suite than that? Or some other angle to look at this? (E.g., it seems silly to me to use an MP3 metadata example when you're talking about the same company that makes Logic Pro.)
It's not exactly clear to me what niche Apple occupies in this market. It doesn't feel like "native Mac UI" is a must-have feature for DAWs or IDEs alike, but maybe that's just my perspective.
I also think you're confusing what I wrote. It's not a competition.
I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).
[1] and now from a sibling comment I hear that perhaps people regard that tool as bad, so there you go, the jury is clearly out
Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clockwork, MBA-driven trickle of slightly-better, over-hyped future e-waste.
To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "Yeah, but the RAM connections are fiddly"? Great, now that sounds like a challenge worth solving.
But you are right about the software. Installing Asahi makes me feel like I own my computer again.
"Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."
Why the "®" after Linux? I think this is the first time I've seen this.
Aside from that, I think of Apple as a hardware company that must write software to sell their devices, maybe this isn't true anymore but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task either.
The PowerBook from the mid 1990’s were hugely successful, especially the first ones, which were notable for what we now take for granted: pushing the keyboard back allowing space for palm rests. Wikipedia says at one time Apple had captured 40% of the laptop market. All the while the ’90s roared on, Apple was languishing, looking for a modern OS.
Software (iOS26), services (Music/Tv/Cloud/Apple Intelligence) and marketing (just keep screaming Apple Intelligence for 3 months and then scream Liquid Glass) ---- on the other hand seem like they are losing steam or very reactive.
No wonder John Ternus is widely anticipated to replace Tim Cook (and not Craig).
Edit: gigabits indeed. Confusingly, my old M2 Max has 400 GB/s (3,200 gigabits per second) of memory bandwidth. I guess it's some sort of baseline figure for the lowest-end configuration?
Edit 2: 1,224 Gbps equals 153 GB/s. Perhaps M5 Max will have 153 GB/s * 4 = 612 GB/s memory bandwidth. Ultra double that. If anyone knows better, please share.
Edit: Apparently 100GB/s, so a 1.5x improvement over the M3 and a 1.25x improvement over the M4. That seems impressive if it scales to Pro, Max and Ultra.
The advantage of the unified architecture is that you can use all of the memory on the GPU. The unified memory architecture wins where your dataset exceeds the size of what you can fit in a GPU, but a high end gaming GPU is far faster if the data fits in VRAM.
That’s true for the on-GPU memory but I think there is some subtlety here. MoE models have slimmed the difference considerably in my opinion, because not all experts might fit into the GPU memory, but with a fast enough bus you can stream them into place when necessary.
But the key difference is the type of memory. While NVIDIA (gaming) GPUs have shipped with HBM memory for a while now, the DGX Spark and the M4 use LPDDR5X, which is the main source of their memory bottleneck. And unified memory chips with HBM are definitely possible (GH200, GB200); they are just less power efficient at low/idle load.
NVIDIA Grace sidestep: They actually use both HBM3e (GPU) and LPDDR5X (CPU) for that reason (load characteristics).
The moat of the memory makers is just so underrated…
Guessing that's their base tier and it'll increase on the higher spec/more mem models.
I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.
This has been by far the best setup until Apple can take gaming seriously, which may never happen.
No one who was forced to write a statement like this one (https://help.steampowered.com/en/faqs/view/5E0D-522A-4E62-B6...) is going to be enthusiastic about continuing to work with Apple.
1. When is the next transition on bits? Is Apple going to suddenly move to 128-bit? No.
2. When is the next transition on architecture? Is Apple going to suddenly move back to x86? No.
3. When is the next API transition? Is Apple suddenly going to add Vulkan or reinvigorate OpenGL? No. They've been clear it's Metal since 2014, 11 years ago. That's plenty of time for the industry to follow if they cared, and mobile gaming has adopted it without issue.
We might as well complain that the PlayStation 4 was completely incompatible with the PlayStation 3.
> What would it even be? 128 bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?
Sure, I can think of lots of things. Every macOS update when I worked in this space broke something that we had to go fix. Code signature requirements change a bit in almost every release, not hard to imagine a 10-year-old game finally running afoul of some new requirement. I can easily see them removing old, unmaintained APIs. OpenGL is actively unmaintained and I would guess a massive attack vector, not hard to see that going away. Have you ever seen their controller force feedback APIs? Lol, they're so bad, it's a miracle they haven't removed those already.
You see, the existence of that "almost" is already less confidence than developers have on every game console as well as Linux and Windows.
The attitude in the apple developer ecosystem is that apple tells you to jump, and you ask how high.
You could complain that Playstation 4 software is incompatible with Playstation 3. This is the PC gaming industry, there are higher standards for the compatibility of software that only a couple companies can ignore.
"This is the PC gaming industry"
Who said Apple needed to present themselves as a PC gaming alternative over a console alternative?
Macs are personal computers, whether or not they come from some official IBM Personal Computer compatibility bloodline.
Sega Saturn - 9 million
Wii U - 13 million
PlayStation 5 - 80 million
Nintendo Switch - 150 million
Nintendo Switch 2 opening weekend - 4 million in 3 days
Sure.
https://en.wikipedia.org/wiki/List_of_best-selling_mobile_ph...
https://store.steampowered.com/stats/stats/
If you consider time zones (not every PC gamer is online at the same time), the fact that it's not the weekend, and other factors, I'd estimate the PC gaming audience is at least 100M.
Unfortunately, there's no possible way to get an exact number. There are multiple gaming PC manufacturers, not to mention how many gaming PCs are going to be built by hand. I'm part of a PC gaming community, and nearly 90% of us have a PC built by either themselves or a friend/family. https://pdxlan.net/lan-stats/
I mean, it's at least partially true. I used to play BioShock Infinite on my MacBook in high school, there was a full port. Unfortunately it's 32 bit and doesn't run anymore and there hasn't been a remaster yet.
Anyway, the whole situation was quite bad. Many games were still 32-bit, even though macOS itself had been mainly 64-bit for almost 10 years or more. And Valve didn't help either; the Steam store is full of 64-bit games mislabeled as 32-bit. They could have written a simple script to check whether a game is actually 64-bit or not; instead they decided to do nothing and keep the chaos.
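For reference, the check really is about one command; a minimal sketch of the kind of script they could have run, assuming a standard .app bundle layout (the path and app name here are made up):

    #!/bin/sh
    # Print the architectures of a game's main executable.
    # "i386" with no "x86_64" means 32-bit only, i.e. dead on Catalina and later.
    app="/Applications/SomeGame.app"   # hypothetical example
    exe=$(/usr/libexec/PlistBuddy -c "Print CFBundleExecutable" "$app/Contents/Info.plist")
    lipo -archs "$app/Contents/MacOS/$exe"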
The best solution would have been a lightweight VM to run old 32-bit games; computers nowadays are powerful enough to do so.
You don't buy Apple to use your computer they way you want to use it. You buy it to use it the way they tell you to. E.g. "you're holding it wrong" fiasco.
In some ways this is good for general consumers (and even developers; with limited config comes less unpredictability). However, this is generally bad for power users or "niche" users like Mac gamers.
Not to mention many subscription services on iOS that don’t allow you to subscribe through the App Store.
That is true, but now they are in a position where their hardware is actually more affordable and powerful than its Windows/x86 counterparts - and Win 11 is a shitload of adware and an annoyance in itself, layered on top of an OS. They could massively expand their hardware sales into the gaming sector.
I'm eyeing a Framework Desktop with an AMD AI 395 APU for gaming (I am happy with just 1080p@60) and am looking at 2000€ to spend, because I want a small form factor. Don't quote me on the benchmarks, but a Mac Mini with an M4 Pro is probably cheaper and more powerful for gaming - IF it had proper software support.
Why would I do anything bespoke at all for such a tiny market? Much less an entirely unique GPU API?
Apple refusing to support OpenGL and Vulkan absolutely hurt their gaming market. It increased the porting costs for a market that was already tiny.
Because there is a huge potential here to increase market share.
Note that games with anticheat don't work on Linux with Proton either. Everything else does, though.
Of course some anticheats aren't supported at all, like EA Javelin.
https://forums.ea.com/blog/apex-legends-game-info-hub-en/dev...
I just redid my windows machine to get at TPM2.0 and secure boot for Battlefield 6. I did use massgrave this time because I've definitely paid enough Microsoft taxes over the last decade. I thought I would hate this new stuff but it runs much better than the old CSM bios mode.
Anything not protected by kernel level anti cheats I play on my steam deck now. Proton is incredible. I am shocked that games like Elden Ring run this well on a linux handheld.
In my case, for software development, I'd be happy with an entry-level MacBook Air (now with a minimum of 16GB) for $999.
1. Implementing PR_SET_SYSCALL_USER_DISPATCH
2. Implementing ntsync
3. Implementing OpenGL 4.6 support (currently only OpenGL 4.1 is supported)
4. Implementing Vulkan 1.4 with various extensions used by DXVK and vkd3d-proton.
That said, there are alternatives to those things. 1. Not implementing this would just break games like Jurassic World where DRM hard codes Windows syscalls. I do not believe that there are many of these, although I could be wrong.
2. There is https://github.com/marzent/wine-msync, although implementing ntsync in the XNU kernel would be better.
3. The latest OpenGL isn't that important these days now that Vulkan has been widely adopted, although having the latest version would be nice to have for parity. Not many things would suffer if it were omitted.
4. They could add the things needed for MoltenVK to support Vulkan 1.4 with those extensions on top of Metal:
https://github.com/KhronosGroup/MoltenVK/issues/203
It is a shame that they do not work with Valve on these things. If they did, Proton likely would be supported for MacOS from within Steam and the GPTK would benefit.
Since I am playing mostly MSFS 2024 these days I currently use GeForce Now which is fine, but cloud gaming isn’t still quite there yet…
Death Stranding is a great looking game to be sure, but it's also kinda hard to get excited about a 5 year old game achieving rtx 2060 performance on a $2000+ system. And that was apparently worthy of a keynote feature...
Codeweavers?
I've been trying to get Unreal Engine to work on my Macbook but Unity is an order of magnitude easier to run. So I'm also stuck doing game development on my PC. The Metal APIs exist and apparently they're quite good... it's a shame that more engines don't support it.
edit: for now I'll get that win 10 ESU
Inference speed and fast feedback matter a lot more than perfect generation to me.
I personally wish they would learn from the failure of Metal.
Also unleashes? Really? The marketing madness has to stop at some point.
I think Metal's ergonomics advantage is a much slimmer lead when you consider the other high-level APIs it competes with.
M4: May 2024
M4 pro/max: Oct 2024
https://www.apple.com/newsroom/2024/05/apple-introduces-m4-c...
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
My hope is that they are taking longer because of a memory system upgrade that will make running significantly more powerful LLMs locally more feasible.
All in all, Apple is doing some incredible things with hardware.
The software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most things most people do on their computers. Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional on Apple's part to make me upgrade. That would be a big letdown.
An infinitely small percentage of people can take advantage of 320 MHz. It's fine.
https://support.apple.com/en-gb/guide/deployment/dep268652e6...
I was at a Wi-Fi vendor presentation a while back and they said that 160 MHz is pretty improbable unless you're living alone with no wireless networks around you. And 320 MHz even less so.
In real life, probably the best you can get is 80 MHz in a really good wireless environment.
WiFi doesn't currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.
For 6ghz? Yeah, not uncommon.
https://support.apple.com/guide/deployment/wi-fi-ethernet-sp...
No devices support 320 MHz channel widths, and 160 MHz is only supported on the 6 GHz band on MacBooks and iPads. Some iPhones support 160 MHz on 5 GHz as well.
Reducing Broadcom's influence over the WiFi ecosystem alone would be a large benefit.
I have a work provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs at the house. It is the only one that I can see the mouse teleporting sometimes when I move it fast. This is after disabling transparency in Accessibility settings mind you, it was even worse before.
I just got one example while passing the mouse quickly through my dock (I still use the magnify animation) and I can clearly see it dropping a few frames. This never happened in macOS 15.
Electron used to override a private function, which makes macOS sluggish on Tahoe, and apparently no one uses Electron apps while doing testing at Apple.
What I can say is that while the situation is much better than at Day 1, the whole Tahoe experience is not as fluid as Sequoia.
Also, it doesn't really matter to me if this was a private function or not, if this was Windows or Gnome/KDE people would blame the developers of the desktop instead.
This happens in pretty much every Electron app as far as I know, and lots of Electron apps, like Spotify, VS Code or Slack, are very likely to be in the Top 10 or at least Top 100 most used apps. And yes, I would expect Apple to test at least the most popular apps before releasing a new version of their OS.
> Maybe the thought is that an email from Apple to the dev saying fix your code would more compelling???
Of course not. Apple controls the SDK; they could work around this in many different ways. For example, instead of changing how this function was implemented, they could introduce a new method (they're both private so it doesn't matter) and effectively ignore the old method (maybe they could also add a message for developers building their application that this method was removed). It would draw ugly borders in the affected apps, but at least it wouldn't cause this issue.
Why do we think this would be a fix, given that the devs clearly ignored the previous message about not using a private method?
If anything the fact that devs can actually access private symbols is an issue with how Apple designed their APIs, because they could make this so annoying to do that nobody would try (for example, stripping symbols).
Also, the fact that devs need to access private symbols to do what they need to do also shows that the public API is lacking at least some features.
Another thing, if this only affected the app itself that would be fine, but this makes the whole system slow to a crawl.
So while devs share some of the blame here (and I am not saying they don't), I still think this whole situation is mostly Apple's fault.
I think the failures here are that Apple should have tested this themselves and the Electron devs should have tested and resolved this during the beta period.
I don't think it's that clear cut. It looks like it was a workaround for a MacOS rendering bug going back to at least 2017, landed in 2019 and had no apparent downsides for six years[1].
The PR removing the private API code also included someone verifying that Apple had fixed the original bug some time in the intervening years[2].
I probably wouldn't have taken this approach personally (at the very least file the original rendering issue with Apple and note it with the code, though everyone knows the likelihood of getting a even a response on an issue like that), but it wasn't some cargo culted fix.
[1] https://github.com/electron/electron/pull/20360
[2] https://github.com/electron/electron/pull/48376#issuecomment...
https://www.pcworld.com/article/2816273/how-microsofts-windo...
https://github.com/tkafka/detect-electron-apps-on-mac
About half of the apps I use regularly have been fixed. Some might never be fixed, though...
From: https://www.reddit.com/r/MacOS/comments/1nvoirl/i_made_a_scr...
they ship-of-Theseus the crap out of their OS, replacing it with parts that need these new hardware features and run slow on older chips due to software-only implementations.
I got the first generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV, it cannot even display the virtual keyboard without stuttering like crazy, it lags switching apps, there's a delay for everything, this thing was smooth as butter on release.
I was considering just replacing the battery and keeping it for several more years but now I feel forced to upgrade which has me considering whether I still want/need an iPad since I'd also have to buy a new magic keyboard since they redesigned it, and they bumped the price ($1299 now vs. $999 when I got the 4th gen) so I'd be looking at $1700. Trying to hold out for an iPad Air with ProMotion.
I may be in the minority here, but I think 5 years is too short a lifespan for these devices at this point. Early days, when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding).
The iPad Air 13 with a M3 is a really nice experience. Very fast device.
I also have an M2 Pro with 32GB of memory. When I A/B test with Electron apps running vs without, the lag disappears when all the unpatched Electron apps are closed out.
1. https://avarayr.github.io/shamelectron/
Here's a script I got from somewhere that shows unpatched Electron apps on your system:
Edit: HN nerfed the script. Found a direct link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
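In case that link rots too, the core of these scripts is just looking for the Electron framework inside each app bundle. A minimal sketch that only lists Electron apps and the bundled framework version (it makes no attempt to work out whether a given version actually contains the Tahoe fix, which is presumably what the linked script adds):

    #!/bin/sh
    # List apps in /Applications that bundle Electron, plus the framework version
    # (the framework's Info.plist version usually tracks the Electron release).
    for fw in /Applications/*.app/Contents/Frameworks/"Electron Framework.framework"; do
      [ -d "$fw" ] || continue
      app=${fw%%/Contents/*}
      ver=$(/usr/libexec/PlistBuddy -c "Print CFBundleVersion" \
            "$fw/Resources/Info.plist" 2>/dev/null)
      printf '%s: Electron %s\n' "$(basename "$app")" "${ver:-unknown}"
    done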
I’ve been debating making a Tumblr-style blog, something like “dumbapple.com,” to catalogue all the dumb crap I notice.
But, like, man - why can't I just use the arrow keys to select my WiFi network anymore? I was able to for a decade.
And the answer, of course, is the same for so much of macOS' present rough edges. Apple took some iPadOS interface elements, rammed them into the macOS UI, and still have yet to sand the welds. For how much we complain on HN about Electron, we really need to be pissed about Catalyst/Marzipan.
Why does the iCloud sign in field have me type on the right side of an input? Why does that field have an iPadOS cursor? Why can't I use Esc to close its help sheet? Why aren't that sheet's buttons focusable?
Why does the Stocks app have a Done button appear when I focus its search field? Why does its focus ring lag behind the search field's animated size?
Where in the HIG does it sign off on unfocusable text-only bolded buttons, like Maps uses? https://imgur.com/a/e7PB5jm
...Anyway.
1. I won't focus on a bunch of Siri items, but one example that always bugs me: I cannot ask Siri to give me directions to my next meeting. The latest OS introduces an answer for the first time, though. It tells me to open the calendar app on my Apple watch, and tap on the meeting, and tap the address. (I don't have an Apple watch.)
2. Mail.app on iOS does not have a "share sheet." This makes it impossible to "do" anything with an email message, like send it to a todo app. (The same problem exists with messages in Messages.app)
3. It is impossible to share a contact card from Messages.app (both iOS and MacOS). You have to leave messages, go to contacts and select the contact to share. Contacts should be one of the apps that shows up in the "+" list like photos, camera, cash, and plenty third party apps.
4. You still have to set the default system mail app in MacOS as a setting in the Mail.app, instead of in system settings. Last I checked, I'm pretty sure you couldn't do this, without first setting up an account in the Mail.app. Infuriating.
I'd love to agree that it's comically amateurish, but apparently there's something about settings dialogs that makes them incredibly difficult to search. It takes Android several seconds to search its settings, and the Microsoft Start menu is also comically slow if you try to access control panels through it, although it's just comically slow at search in general. Even Brave here visibly chokes for like 200ms if I search in its preferences dialog... which compared to Android or Windows is instant, but still strikes me as a bit on the slow side considering the small space of things being searched. Although it looks like it may be more related to layout than actual searching.
Still. I dunno why but a lot of settings searches are mind-bogglingly slow.
(The only thing I can guess at is that the search is done by essentially fully instantiating the widgets for all screens and doing a full layout pass and extracting the text from them and frankly that's still not really accounting for enough time for these things. Maybe the Android search is blocked until the Storage tab is done crawling over the storage to generate the graphs that are not even going to be rendered? That's about what it would take to match the slowdown I see... but then the Storage tab happily renders almost instantly before that crawl is done and updates later... I dunno.)
Funny I'm defending them, but I think this is not even a papercut in my opinion, while they have far bigger issues.
But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.
It feels very much like how I imagine someone living in the late 1800's might have felt. The advent of electricity, the advent of cars, but can't predict airplanes, even though they're right around the corner and they'll have likely seen them in their lifetime.
Maybe, but for lots of scenarios even M5 could still benefit from being an order of magnitude faster.
AI, dev, some content scenarios, etc…
* My iPhone as a remote for my Apple TV has randomly stopped deciding it can control the volume - despite the "Now Playing" UI offering an audio control that works.
Their auth screens drive me crazy:
* Why can't I punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work; stop making me wait.
* Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a Pin Code. Why? Just show me the Pin Code screen. If I can approve from my device, I will.
* Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen. I'm literally using my phone. Just auto-approve.
Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use fingerprint. On boot, I am not allowed to enter a password until I fail with fingerprint. If I use fingerprint to log in on boot, I have to enter my password anyways once logged in to unlock my keychain.
I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.
Of course the thin Apple remote has a way of getting lost, but it has a Find My feature which locates it pretty well.
It seems to have been degrading for a long time, but for me it’s been in this past year where it’s crossed into that threshold android used to live in where using the phone causes a physiological response from how aggravating it can be sometimes
I let my guard down and got too deep into the Apple ecosystem. I know better and always avoided getting myself into these situations in the past, but here I am.
The phone sucks right now - super buggy - and they continue to remove/impose features that should be left as an option to the user. Yes, this has always been the knock on Apple, and I typically haven't had an issue with their decisions - it's just so bad now.
Lesson (re)learned and I will stay away from ecosystems - luckily the damage here is only for media
The minute I can get blue bubbles reliably on an android, I’ll give the pixel a shot again - if that sucks too then maybe I’ll go back to my teenage years and start rooting devices again
I am fully bought into the Apple ecosystem. Not sure yet if I regret it. It is annoying to be so tied down to one company that isn’t going the way I want it to.
There are current workarounds, like using your home Mac as a relay, but nothing super elegant that I know of.
I’ve also been unable to get the remote app on my watch to work at all. It’s hard to imagine people working at Apple don’t also run into these issues all the time.
Apple has a higher duty to their shareholders than to their customers.
Not hating on Apple, just stating the hard economic truth.
Sad if true. My M1 Max feels sluggish lately too, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades, even for professional use: form factor, screen quality, battery, etc.
I think they bet a lot of hardware money on AI capabilities, but failed to deliver the software, so there was no real reason to upgrade because of AI features in the chip (which is literally what they boast on the first line of the announcement - yet nobody cares about making more cute faces)
It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing", so even if they didn't believe it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.
A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computers, if not for software bloat...
Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)
I would love to see a ThinkPad with an M5 running Linux.
Is that good? Their cellular modems have been terrible. I'll reserve judgement until trying one out.
>The M1 itself is so powerful
I think this is a bit of a fallacy. Apple Silicon is great for the performance-to-power-consumption ratio, but something like a Ryzen 9 7945HX can do 3x more work than an M1 Max. And a non-laptop chip, like an Intel Core Ultra 7 265K, can do 3.5x.
https://browser.geekbench.com/processors/amd-ryzen-9-7945hx
https://browser.geekbench.com/processors/intel-core-ultra-7-...
https://browser.geekbench.com/macs/macbook-pro-16-inch-2021-...
https://browser.geekbench.com/macs/macbook-pro-16-inch-2024-...
Performance claims:
https://www.ookla.com/articles/iphone-c1-modem-performance-q...
Energy claims:
https://appleinsider.com/articles/25/02/27/apples-c1-modem-b...
I've been hearing this since the Intel 486DX days, and
> Nobody will ever need more than 640K of RAM!
Amusingly enough, adding more ports could do it.
"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."
And by making Apple AI (which is something I do not use for many reasons, but mainly because of Climate Change) their focus, I am afraid they are losing and making their operating Systems worse.
For instance, Liquid Glass, the mess I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess: an alpha release, in my opinion, which I feel was a distraction from their lack of a robust AI release.
So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?
I am currently attempting to sell my iPhone 16E and my M1 Macbook Air to move back to Linux because of all of this.
Are they really doing that? Because if it's the case they have shockingly little to show for it.
Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.
At that point to me, it seems that good SoC and a captive audience in the US are pretty much all they have remaining and competition on the SoC part is becoming fierce.
But I think $500 billion is a lot of money for AI:
Apple accelerates AI investment with $500B for skills, infrastructure
https://www.ciodive.com/news/Apple-AI-infrastructure-investm...
Imagine using that $500 billion for the operating system and squashing bugs, or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?
I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.
If I can't search my Apple Mail without AI, why would I trust AI?
Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade, they are not owed the benefit of the doubt
Also, I like researching things old school how I learned in college because I think it leads to unintended discoveries.
I do not trust the source you linked to. It is an organization buried under other organizations, and I cannot seem to find their funding source after looking for a good 15 minutes this morning. It led me back to https://ev.org/ where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:
https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...
Besides "Effective Altruism" makes no sense to me. Altruism is Altruism IMO.
Altruism: unselfish regard for or devotion to the welfare of others
There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.
But the organization stinks as some kind of tech propaganda arm to me.
I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57 tonnes while the figure for the USA is 14.3, so reading this sentence in that article: "The average American uses ~50,000 times as much water every day..." surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.
[0]: https://ourworldindata.org/grapher/co-emissions-per-capita
Huh. This one baffles me.
Maybe they are in the USA; every little thing counts there.
So, to keep this on point, Apple making a faster chip is not on my climate change agenda and anything but negative.
A typical passenger car driving 12,000 miles puts out about 5 metric tons of CO2.
The person driving that passenger car likely has a 1,000 sq ft or larger home or apartment, which can vary widely but could reasonably be estimated at another 5 metric tons of CO2 (Miami vs. Minnesota makes a huge difference).
So we're at 10 metric tons for someone who doesn't live in a van but still drives like a suburbanite
Care to be a little kinder next time you feel whatever compelled you to write your response to the other user? Jeesh.
> Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less
Still, even if we say your numbers are correct (and I feel they are not), does that mean I should just add to the problem and use something I do not need?
Driving my van for my yearly average creates about 4.4 metric tons of CO2.
"A more recent study reported that training GPT-3 with 175 billion parameters consumed 1287 MWh of electricity, and resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 gasoline powered cars for a year."
https://news.climate.columbia.edu/2023/06/09/ais-growing-car...
Just to get an idea of how I conserve, another example is that I only watch videos in 480p because it uses less power. This has a double benefit for me since it saves my solar battery as well.
I am not bragging, just showing what is possible. Right now, still being out in the desert this week, my carbon footprint is extremely low.
Second, I cannot really trust most numbers that are coming out regarding AI. Sorry, just too much confusion and green-washing. For example, Meta is building an AI site that is about the size of Manhattan. Is all the carbon used to build that counted in the equations?
But this article from May 2025:
https://www.technologyreview.com/2025/05/20/1116327/ai-energ...
says "by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households."
And
"Tallies of AI’s energy use often short-circuit the conversation—either by scolding individual behavior, or by triggering comparisons to bigger climate offenders. Both reactions dodge the point: AI is unavoidable, and even if a single query is low-impact, governments and companies are now shaping a much larger energy future around AI’s needs."
And
"The Lawrence Berkeley researchers offered a blunt critique of where things stand, saying that the information disclosed by tech companies, data center operators, utility companies, and hardware manufacturers is simply not enough to make reasonable projections about the unprecedented energy demands of this future or estimate the emissions it will create. "
So the confusion and obfuscation is enough for me to avoid it. I think AI should be restricted to research, not used for most of the silliness and AI slop that is being produced. Because, you know, we are not even counting the views of all that AI slop, which also take up data space and energy as people look at it.
But part of why I do not use it is my little boycott. I do not like AI, at least how it is being misused to create porn and AI slop instead of doing the great things it might do. They are misusing AI to make a profit. And that is also what I protest.
"The neural engine features a graphic accelerator" probably M6
Gaming on mac is indeed lacking, but that's really not the reason.
If they're studios, you can have stacks of M5 Max Macs.
>Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.
The asterisks are really icing on the cake here.
---
[1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...
Yesterday’s hype is today’s humility.
https://web.archive.org/web/20251010205008/https://www.apple...
[1] https://www.apple.com/us-edu/shop/buy-mac/macbook-pro/14-inc...
I'm sure it's a perfectly fine daily driver, but you have to appreciate the irony of a massive chip loaded to the gills with matrix multiplication units, marketed as an amazing AI machine, and yet so hobbled by memory capacity and bandwidth.
But never, ever, through not shipping incremental hardware bumps every year regardless of whether there's anything really worth shipping.
And it's things like not including a charger, cable, headphones anymore to reduce package size, which sure, will save a little on emissions but it's moot because people will still need those things.
Hardware longevity and quality are probably the least valid criticisms of the current Macbook lineup. Most of the industry produces future landfill at an alarming rate.
Logos is King
https://security.apple.com/blog/memory-integrity-enforcement...
1. CPU, via SIMD/NEON instructions (just dot products)
2. CPU, via AMX coprocessor (entire matrix multiplies, M1-M3)
3. CPU, via SME (M4)
4. GPU, via Metal (compute shaders + simdgroup-matrix + mps matrix kernels)
5. Neural Engine via CoreML (advisory)
Apple also appears to be adding a “Neural Accelerator” to each core on the M5?
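As a concrete illustration of path 4 (GPU via Metal), here's a minimal sketch using MLX, Apple's array framework mentioned elsewhere in this thread; it assumes Apple Silicon and `pip install mlx`, and is just one way to hit that path, not the only one:

    # Minimal sketch of the GPU path via MLX (assumes Apple Silicon + `pip install mlx`).
    import mlx.core as mx

    a = mx.random.normal((2048, 2048))
    b = mx.random.normal((2048, 2048))

    c = a @ b      # matrix multiply, dispatched to the GPU through Metal
    mx.eval(c)     # MLX is lazy; this forces the computation to actually run

    print(c.shape)

The CPU paths (1-3) are normally reached indirectly through Accelerate/BLAS rather than written by hand, and the ANE is only reachable via CoreML's automatic placement.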
A Mac Quadra in 1994 probably had floating point compute all over the place, despite the 1984 Mac having none.
I know how the multitasking nature of the OS probably makes this situation happen across different programs, but it would nonetheless be pretty cool!
- https://machinelearning.apple.com/research/neural-engine-tra...
- https://machinelearning.apple.com/research/vision-transforme...
Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.
I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.
https://www.google.com/url?sa=t&source=web&rct=j&opi=8997844...
AMD is likely to back away from this IP relatively soon.
The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.
>tensor units on the GPU
The M5 / A19 Pro are the first Apple chips with so-called tensor units, i.e. matmul hardware on the GPU. The ANE used to be the only tensor-like thing in the system, albeit, as mentioned, designed to be super efficient and for very specific purposes. That doesn't mean Apple is going to abandon the ANE; instead they made it faster and more capable again.
What work product? Who is running models on Apple hardware in prod?
You don't have to believe this. I could not care less if you don't.
Have a great day.
If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.
You seem set on MLX and apparently on your narrow view of what models are. This discussion was about ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, but that from an inference perspective most deployments are CoreML, which will automatically use ANE if the model or some subset fits (which is actually fairly rare as it's a very limited -- albeit speedy and power efficient -- bit of hardware). These are basic facts.
>how iOS users actually use their phone.
What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.
correct and non-controversial
> An enormous number of people and products [use CoreML on Apple platforms]
non-sequitur
EDIT: i see people are not aware of
i love blustery nerds lol. what if i told you i'm a coreml contrib and i know for a fact you're wrong?
It would help if you would explain what I said that is wrong. You know, as the "haven't logged in in three years but pulled out the alt for this" CoreML contributor you are. This is an especially weird bit of trolling given that nothing I said is remotely contentious; it's all utterly banal facts.
That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.
As you said - it won’t help previous generations, though since last year (or two??) all macs start with 16GB of memory. Even entry level macbook airs.
The latter has up to 128GB of memory?
But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries, etc. into account. It can't do that as nicely as long as customers opt for only 256GB or 1TB devices due to cost.
If anything, these refreshes let them get rid of the last old crap on the line for M1 and M2, tie up loose ends with Walmart for the $599 M1 Air they still make for ‘em, and start shipping out the A18 Pro-based Macbooks in November.
M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?
Both the iPad and MBP M5 pages [2][3] say "delivering up to 3.5x the AI performance". But in all the AI examples given (in [3]), they are 1.2-2.3x faster than M4. So where is this 3.5x coming from? What tests did Apple run to show that?
---
1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...
2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...
3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...
I'll believe the benchmarks, not marketing claims, but here's an observation and a question.
1. The AMD EPYC 4585PX has ~89GB/s, with pretty good latency, as long as you use 2 DIMMs.
2. How does this compare to the memory bandwidth and latency of M1,M2,M3,M4 in reality with all of the caveats? It seems like M1 was a monumental leap forward, then everything else was a retraction.
Snow Leopard still remains the company's crown achievement. 0 bloatware, 0 "mobile features on desktop" (wtf is this even a thing?), tuned for absolute speed and stability.
Ironically I can still run old 32-bit Windows software in Wine on my M1 Mac. Windows software is more stable on a Mac than Mac software.
They are not the hardware provider like nvidia, they don’t do the software and services like OpenAI or even Microsoft/oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things but the only way to showcase them is through their phone which ironically enough feels like not the best path for apple.
Apple’s best option is to put llms locally on the phone and claim privacy (which is true) but they may end up in the same Siri vs others situation, where Siri always is the dumber one.
This is interesting to see how it plays out
This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple market on, as they can't monetize it. However, as time goes on and memory and compute power gradually decrease in price, and also maybe as local LLMs continue to increase in ability (?) it may become more and more relevant.
In theory this would be where qualcomm would come in and provide something but in practice they seem to be stuck in qualcomm land where only lawyers matter and actual users and developers can get stuffed.
The only well supported devices are either phones or servers with very little in between.
Even common consumer devices like wifi routers will have ARM SoCs pinned to the kernel version they shipped with, which gets supported for 1 to 2 years at most.
And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state).
However, I have been disappointed by Apple too many times (they wouldn't replace my keyboard despite their highly-flamed design-faux-pas, had to replace the battery twice by now, etc.)
Two years ago I finally stopped replacing their expensive external keyboards, which I used to buy once a year or every other (due to broken key-hinges) and have been so incredibly positively surprised by getting used to the MX Keys now. Much better built, incredible mileage for the price. Plus, I can easily switch and use them on my Windows PC, too.
So, about the MacBook: if I were to switch mobile computing over to Windows, what could I replace it with? My main machine is still a Mac Mini M2 Pro, which is perfect value for the price. I like the Surface as a concept (replaceable keyboards are a fantastic idea; the battery, however, is super iffy nonsense), and I've got a Surface Pro 6 around, but it's essentially the same gloss premium I don't need for my use.
Are there any much-cheaper but somewhat comparable laptops (12h+ battery, 1 TB disk, 16-32GB RAM, 2k+ Display) with reasonable build quality? Does bypassing the inherent premium of all the Apple gloss open up any useful options? Or is Apple actually providing the best value here?
Would love to hear from non-Surface, non-Thinkpad (I love it, but) folks who've got some recommendations for sub $1k laptops.
Not my main machine, but something I take along train rides, or when going to clients, or sometimes working offsite for a day.
But its really only capable of high performance in short bursts because of the extremely small thermal mass.
Storage | CPU
≤ 512GB | 3 P-cores (and 6 E-cores)
1TB+ | 4 P-cores (and 6 E-cores)
https://www.apple.com/ipad-pro/specs/
The existing neural engine's function is to maximize power efficiency, not flexible performance on models of any size.
It's an improvement, nomenclature-wise.
(Perhaps it would be safer to wait for The Next Generation?)
Open up the YouTube app and try to navigate the UI. It’s okay but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it like a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, and the phone won’t notice, and it gets completely desynchronized.
More importantly for games, though, is the awful storage architecture around the TV boxes. Games have to slice themselves up into 2GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on-demand.
It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.
Apple is actively hostile to how you would build for Linux or PC or console.
If you are building your engine/game from scratch, you absolutely do not need to use Xcode
Nonetheless that’s a small fraction of the time spent actually developing the game.
That makes it a continuous headache to keep your Mac builders up.
It means you need to double dev hardware costs or more as you need a gaming PC to target your core audience and Macs handle the mac bugs.
It means your mac build machines are special snowflakes because you can't just use VMs.
The list goes on and on of Mac being actively hostile to the process.
Just Rider running on a Mac is pleasant sure, but that's not the issue.
Having to use xcode "for the final build" is irrelevant to the game development experience.
Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:
/System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext
Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.
macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should be able to use Xcode 27, though the Intel platform removal may break the usual pattern where the previous year's OS stays supported).
I think OP means virtualizing on something that isn't Apple.
You can get an xcode building for arm Macs on PC hardware with this?
- have to pay Apple to have your executable signed
- poor Vulkan support
The hardware has never been an issue, it's Apple's walled garden ecosystem.
As a game developer, I have to literally purchase Apple hardware to test rather than being able to conveniently download a VM
Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows. Especially for consoles like XBOX One or newer, and PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.
Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?
I don’t think anybody does this. I haven’t heard about official emulators for any of the mainstream consoles. Emulation would be prohibitively slow.
Developers usually test on dedicated devkits which are a version of the target console (often with slightly better specs as dev builds need more memory and run more slowly). This is annoying, slow and difficult, but at least you can get these dev kits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.
For testing, I can do a large amount of it in a VM for my game. Maybe not 100%, and not full user testing, but nothing beats running on the native hardware and doing alpha/beta with real users.
Also, since I can pass through hardware to my VM I can get quite good performance by passing through a physical GPU for example. This is possible and quite straightforward to do on a Linux host. I'm not sure if it's possible using Parallels.
i am obviously misunderstanding something, i mean.
Sure, I'm not doing performance benchmarking and it's just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross platform support.
Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.
However, these days it's possible pass-through hardware to your VM so I would be able to pass through a 2nd GPU to MacOS...if it would let me run it as a guest.
so a mac port, even if simple, is additional cost. there you have the classic chicken and egg problem. the cost doesn't seem to be justified by the number of potential sales, so major studios ignore the platform. and as long as they do, gamers ignore the platform
i've seen it suggested that Apple could solve this standoff by funding the ports, maybe they have done this a few times. but Apple doesn't seem to care much about it
It even has "for this mac" preset which is good enough that you don't need to tinker with settings to have decent experience.
The game is paused, almost like becomes "frozen" if it's not visible on screen which helps with battery (it can be in the background without any noticeable impact on battery and temperature). Overall way better experience than I expected.
The communication bandwidth you can achieve by putting CPU, GPU, and memory together at the factory is much higher than having these components separate.
Sad for enthusiasts, but practically inevitable
And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed with any upcoming OS version. A significant effort just to address a tiny market.
IIRC developers literally got 15 years of warning about that one.
But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.
Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough, Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.
Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.
I still don't get this. Apple is a trillion dollar company. How much does it cost to pay a couple of engineers to maintain an up to date version on top of Metal? Their current implementation is 4.1, it wouldn't cost them much to provide one for 4.6. Even Microsoft collaborated with Mesa to build a translation on top of dx12, Apple could do the same.
1: https://www.facebook.com/permalink.php?story_fbid=2146412825...
Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.
Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.
It was only the intervention of Microsoft that managed to save Apple from their own tantrum.
[1] https://ruoyusun.com/2023/10/12/one-game-six-platforms.html#...
Now...something like minecraft or SubNautica? The M4 is fine, especially if you're not pushing 4k 240hz.
Apple has been pushing the gaming experience for years (since the iPhone 4S?), but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.
If you identify as a "gamer" and are in those communities, then you'll see communities talking about things you can't natively play
but if you leave niches you already have everything
and with microtransactions, Apple ecosystem users are the whales. again, not something that people who identify as "gamers" wants to admit being actually okay with, but those people are not the revenue of game production.
so I would say it is a missed opportunity for developers that are operating on antiquated calculations of MacOS deployment
It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/
macOS is supported by one title (DOTA 2). Windows supports all 10, Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue to them about missed revenue opportunities then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.
If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.
And don’t forget they made a VR headset without controllers.
Apple doesn’t care about games
Kind of? It does support higher refresh rates, but their emphasis on "Retina" resolutions imposes a soft limit because monitors that dense rarely support much more than 60hz, due to the sheer bandwidth requirements.
Ports to macos have not done well from what I've heard. However you can see ports on PC do really well and have encouraged studios like Sony and SquareEnix to invest more in PC ports. Even much later after the console versions sell well. Just not a lot of reasons to add the tech debt and complexity of supporting mac as well.
Even big publishers like Blizzard, who have been Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides like Mac-specific issues. If those are not critical, they get put in the pile with the rest of the bugs.
I know we are a few major scientific breakthroughs away from that even being remotely possible, but it sure would be nice.
Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.
So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?
I am deeply concerned all the performance benefits of the new chips will get eaten away.
This is 4-6x faster in AI for instance.
In GPU performance (probably measured on a specific set of tasks).
Spin up ollama and run some inference on your 5-year-old intel macbook. You won't see 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.
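For anyone who wants to try that, here's a minimal sketch against Ollama's local REST API (an assumption on my part that this is the simplest route; it presumes `ollama serve` is running on the default port and that the example model has already been pulled):

    # Quick-and-dirty local inference timing via Ollama's REST API.
    # Assumes the Ollama server is running locally and "llama3.2" (just an example) is pulled.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3.2",
        "prompt": "Explain memory bandwidth in one short paragraph.",
        "stream": False,
    }).encode()

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)

    print(result["response"])
    # Ollama reports the generated token count and generation time (nanoseconds) itself.
    print(f"~{result['eval_count'] / (result['eval_duration'] / 1e9):.1f} tokens/sec")

Run the same thing on an Intel MacBook and an M-series one and the gap is obvious, even if it's nowhere near the headline multiples.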
"Look how many times faster our car is![1]"
[1] Compared to a paraplegic octogenarian in a broken wheelchair!"
[1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.
I use both an M1 Max and an M3 Max, and frankly I do not notice much difference in most things if you control for the core count. And for running LLMs they are almost the same performance. I think from M1 to M3 there was not much performance increase in general.
First line on their website:
> M5 delivers over 4x the peak GPU compute performance for AI compared to M4
It's the GPU not the CPU (which you compare with your old Intel) and it's an AI workload, not your regular workload (which again is what you compare)
No. You exit the mail app -> Go to settings -> apps -> scroll through a massive list (that you usually just use for notification settings btw) to go to mail -> mail accounts -> add new account.
Just a simple six-step process after you’ve already hunted for it in the mail app.
You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.
Hardware has improved significantly, but it needs software to enable me to enjoy using it.
Apple is not the only major company that has completely abandoned the users.
The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.
At our company we used to buy everyone MacBook Pros by default.
After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order MacBook Airs for new employees.
I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.
I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)
The only thing it can’t do is run Ableton in a low latency way without strongly changing the defaults
You press a key on the keyboard to play a note and half a second later you hear it
Other than that, zero regrets
something’s off with your setup.
Regular MBs are not really a thing anymore. You mean Airs?
>Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?
They wrote:
> Together, they deliver up to 15 percent faster multithreaded performance over M4
The problem is comprehension, not marketing.
“M5 delivers over 4x the peak GPU compute performance for AI”
In this situation, at least, it’s just referring to AI compute power.
The disconnect here is that you can't read. Sorry, no other way to say it.
Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.
But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.
For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.
Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.
Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on a Intel MacBook Air was a total pipe dream. Now a base spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.
A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!
> I can say with utmost certainty that it isn't 4000x faster
The numbers you provided do not come to 4000x faster (closer to 2400x)
> Why is actual user experience not able to keep up with benchmarks and marketing?
Benchmarks and marketing are very different things, but you seem to be holding them up as similar here.
The 5x 6x 4x numbers you describe across marketing across many years don't even refer to the same thing. You're giving numbers with no context, which implies you're mixing them and the marketing worked because the only thing you're recalling is the big number.
Often, every M-series chip is a HUGE advancement over the past in GPU. Most of the "5x" performance jumps you describe are in graphics processing, and the "Intel" they're comparing it to is often an Intel iGPU like the Iris Xe or UHD series. These were low end trash iGPUs even when Apple launched those Intel devices, so being impressed by 5x performance when the M1 came out was in part because the Intel Macs had such terrible integrated graphics.
The M1 was a giant jump in overall system responsiveness, and the M-series seems to be averaging about a 20% year over year meaningful speed increase. If you use AI/ML/GPU, the M-series yearly upgrade is even better. Otherwise, for most things it's a nice and noticeable bump but not a Intel-to-M1 jump even from M1-to-M4.
Unless you're looking at your MacBook running LM Studio you won't be seeing much improvement in this regard.
They say "M5 offers unified memory bandwidth of 153GB/s, providing a nearly 30 percent increase over M4," but my old MacBook M2 Max has 400GB/s.
From: https://www.theregister.com/2012/05/03/unsung_heroes_of_tech...
 The power test tools">
> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.
> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.
> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."
> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt.
Not for Mac mini?
Seriously, can’t you tell me about the CPU cores and their performance?
Whether you're playing games, or editing videos, or doing 3D work, or trying to digest the latest bloated react mess on some website.. ;)
- M1 | 5 nm | 8 (4P+4E) | GPU 7–8 | 16-core Neural | Memory Bandwidth: 68.25 GB/s | Unified Memory: 16 GB | Geekbench6 ~2346 / 8346
- M2 | 5 nm (G2) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2586 / 9672
- M3 | 3 nm (first-gen) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2965 / 11565
- M4 | 3 nm (second-gen) | 10 (4P+6E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 120 GB/s | Unified Memory: 32 GB | Geekbench6 ~3822 / 15031
- M5 | 3 nm (third-gen) | 10 (4P+6E) | GPU 10 | 16-core Neural | Memory Bandwidth: 153 GB/s | Unified Memory: up to 32 GB | Geekbench6 ~4133 / 15,437 (9-core sample)
It just never sweats AT ALL - it feels like a decade from obsolescence based on what I'm doing now.
It would have to be an order of magnitude faster for me to even notice at this point.
That doesn’t make it obsolete, at all.
Update: I am thinking the 24GB for M5 is a typo. I see on Apple's site the 14 inch MBP can be configured optionally with 32GB of RAM.
If anyone has any real clues that they can share pseudonymously, that would be great. Not sure which department drove that change.
It's the only M5 device that leaked to the public early.
If you could yank the screen out, it probably evens out :)
I have seen quite a few such announcements from competitors that tend to be so close that I wonder if they have some competitor analysis to precede the Goliath by a few days (like Google vs rest, Apple vs rest etc).
Then there is the whole ARM vs x86 issue. Even if a compatible Linux distro were made, I expect to run all kinds of software on my desktop rig including games, and ARM is still a dead end for that. For laptops, it's probably a sensible choice now, but we're still far from truly free and usable ARM desktop.
Does anyone know if we're still on pace with Moore's law?
"M5 is Apple’s next-generation system on a chip built for AI, resulting in a faster, more efficient, and more capable chip for the 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro."
I can easily imagine companies running Mac Studios in prod. Apple should release another Xserve.
I highly recommend Andrej Karpathy's videos if you want to learn details.
So if you have a 7B parameter model at 16-bit precision, that means roughly 14 GB of weights must be read for every token generated. If you only have 153 GB/sec of memory bandwidth, that caps you at ~11 tokens/sec, regardless of how much processing power you have.
You can of course quantize to 8-bit or even 4-bit, or use a smaller model, but doing so makes your model dumber. There's a trade-off between performance and capability.
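To make that arithmetic explicit, here's a tiny back-of-the-envelope sketch; the 7B/16-bit/153 GB/s figures are from the comment above, and the only added assumption is that decoding is purely memory-bandwidth-bound:

    # Back-of-the-envelope decode ceiling, assuming token generation is
    # bandwidth-bound (every weight is read once per generated token).
    params = 7e9                # 7B parameter model
    bytes_per_param = 2         # 16-bit weights
    bandwidth = 153e9           # M5 base unified memory bandwidth, bytes/sec

    bytes_per_token = params * bytes_per_param        # ~14 GB read per token
    tokens_per_sec = bandwidth / bytes_per_token
    print(f"~{tokens_per_sec:.1f} tokens/sec ceiling")  # ~10.9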
Models like Qwen 3 30B-A3B and GPT-OSS 20B, both quite decent, should be able to run at 30+ tokens/sec at typical (4-bit) quantizations.
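As rough support for that figure, the same bandwidth-bound estimate applied to a MoE model, assuming only the ~3B active parameters have to be streamed per token (a simplification that ignores shared layers and KV cache overhead):

    # Bandwidth-bound ceiling for a MoE model at 4-bit on M5-class bandwidth.
    active_params = 3e9         # Qwen3 30B-A3B activates roughly 3B params per token
    bytes_per_param = 0.5       # 4-bit quantization
    bandwidth = 153e9           # M5 base, bytes/sec

    ceiling = bandwidth / (active_params * bytes_per_param)
    print(f"~{ceiling:.0f} tokens/sec theoretical ceiling")  # ~100; real-world is lower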
Neither product actually qualifies for the task IMO, and that doesn't change just because two companies advertised them as such instead of just one. The absolute highest end Apple Silicon variants tend to be a bit more reasonable, but the price advantage goes out the window too.