And you don’t want to buy premium brick if you don’t have to.
(and I bet this will stand up to the test of time better than the original quote)
In the end, it sure looks like the ARM detractors were right. Architectural licenses made it so nobody except Softbank partners could afford to design an ARM core. Stock ARM core designs are outright anemic and inefficient even compared to decade-old x86 cores, meaning Broadcom or Rockchip can't just release a board and compete with AMD and Intel.
Nowadays, international detente is down the toilet, Softbank is seen as a penny-pincher, and RISC-V is already being shipped in place of ARM microprocessors in mass-market computer hardware. Apple basically ruined the entire ISA in the process of making it marginally competitive with generations-old silicon.
They lost over $800 million last quarter.
I understand how Mac users would argue that sacrificing ARM to Moloch makes great PCs. But it also hamstrings the rest of the industry, prevents direct competition, slows down ARM proliferation and accelerates the development of foreign replacements like RISC-V. As a fan of ARM, I cannot say I'm happy with the stewardship Softbank exercises over the IP.
Apple has been making money hand over fist designing ARM-based hardware since 2010 with the A4.
Amazon is saving billions using its own custom silicon for both servers and specialized networking chips.
But the fact is that Intel has to figure out something. Right now x86 may not be dead, but it is smelling funny. They are selling fewer desktops and servers, and are nonexistent in mobile.
I could imagine it sucking a lot of air out of the competitive ecosystem. Anyone working on fiddly little Raspberry Pi-class products to help bootstrap the platform up to full desktop performance suddenly has to compete with a much heavier opponent.
Terrible for wagies, great for capitalist equity lords
I doubt Copilot+ PCs with Qualcomm chips are doing any better.
The AI part is useless without Windows, but AI laptops typically have beefy Intel/AMD APUs, which most definitely makes life nice while using Linux.
In this case, dedicated capacity for AI has been in systems for a while behind the scenes (e.g., Intel GNA), but without consumer demand; it's the OEMs and OS/software vendors who are most interested in pushing the "AI Ready" standard.
If Microsoft built a set of local tools and models that integrated with Office or other applications (like Krita + ComfyUI) that were "sold" out of the Microsoft Store and only available on AI Ready systems, it might ignite some consumer interest.
However, leading with "Look how we can record your screen in secret and use AI to harvest your private information!" instead of "Here's you as a cartoon character" more or less solidified everyone's distrust of their intentions.
Small-screen, full-OS laptops for $300, at a time before the iPad/iOS era of dumbing down computers. I wrote most of my undergrad thesis on a Dell Mini, with STATA installed for regression analysis. A 9-inch screen, 1 GB of RAM, and a 1 GHz Intel Atom were plenty powerful for handling office work and watching YouTube. Its battery life was actually better than most high-TDP Intel laptops, and games up to 2003-04 worked very well on it.
Even YouTube was pretty choppy on most of these things. And the screen resolution so bad that web browsing was terrible on them. A lot of dialog boxes wouldn't even fit on the screen, leaving you stuck and unable to change settings or even save files in a lot of situations. The drives were slow in most of them too.
Not to say that they couldn't be better, but they were rated poorly because most of them really were trash.
I remember netbooks like the Asus Eee PC and Dell Mini (which I think I had for a spell) being _very_ portable but being capable of _maybe_ three hours of battery life with the most conservative power settings, which meant these tiny laptops were dog slow most of the time.
Even looking at building my own machine today, “AI PC” is all over much of this.
Nvidia's DGX Spark is a bit of a joke with only 273 GB/s of memory bandwidth, which is less than the 512 GB/s of the decade-old Radeon Fury X, let alone the current RTX 5090's 1.79 TB/s.
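Those bandwidth numbers matter because local LLM token generation is largely memory-bound: every generated token has to stream the full set of weights through memory, so bandwidth roughly caps tokens per second. A back-of-envelope sketch (the model size is an assumption, a hypothetical ~70B-parameter model quantized to 4 bits, and it ignores VRAM capacity, e.g. the 5090's 32 GB couldn't actually hold it):

```python
# Rough decode-speed ceiling for a memory-bound LLM:
#   tokens/sec ~ memory bandwidth / bytes of weights read per token.
# Bandwidths are the figures quoted above; model size is illustrative.

MODEL_GB = 35  # hypothetical 70B model at 4-bit quantization (~0.5 bytes/param)

devices = {
    "DGX Spark": 273,       # GB/s
    "Radeon Fury X": 512,   # GB/s
    "RTX 5090": 1790,       # GB/s
}

for name, bw_gbs in devices.items():
    # Upper bound only: real throughput is lower due to KV-cache traffic,
    # compute limits, and less-than-peak achievable bandwidth.
    print(f"{name}: ~{bw_gbs / MODEL_GB:.0f} tokens/sec ceiling")
```

The point of the sketch is just that, for this workload, the Spark's ceiling sits well below a ten-year-old GPU's, whatever its other merits as an integrated box.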
E.g. https://frame.work/de/en/desktop: 128 GB of RAM, gigantic memory bandwidth, a great GPU, and a great CPU.
It's simply another sign that we're at the tail-end of a massive bubble that's about to burst spectacularly.
JohnFen•9mo ago
My employer just refreshed all of our dev machines, and they didn't go with "AI PCs" either.