What do you do on Wi-Fi that requires more than 10 Gb per second... on a laptop, you'd fill up the base model SSD in under a minute of downloading.
Even pre-Apple Silicon, it's been a decade since users could upgrade MacBook's RAM or internal storage.
Sounds like maybe they didn't want to try to fit their new N1 chip this go-around so they could reuse some components? The MacBook still has the same Broadcom chip. Or it's being saved as a Pro-differentiating feature for when the M5 Pro/Max comes out later. There's a rumored MBP redesign, so I'm guessing we'll see it then, along with the N1 for Wi-Fi 7.
I can see an overlap with people who want smaller computers who also want max power, but I just would not believe that is a significant group. (again, all personal observations)
I have to say if I had any choice I would delay my purchase until the 16” catches up rather than buying a generation behind. If I see specs saying M5 14” is more performant for my workloads than my more expensive 16” I’m even more motivated to delay. Most product managers would be aware of these things.
I also think the 15 inch MacBook Air filled the non-power-user-but-likes-big-screen niche.
- normal
- pro
- max
Pro and Max had way more cores and GPUs and supported way more RAM. Today's release is the basic version of the new CPU; if you want more RAM you can get the M4 Pro or M4 Max based MacBook Pros, or wait for the M5 Pro/Max to come out.
This has been their staggered release strategy for a while.
edit: the suggested retail price also dropped by EUR 100. Mind is less blown now. It seems like a good thing, in fact.
edit2: in Belgium, the combined price of the 70W adapter and the 2m USB-C to MagSafe cable is EUR 120.
[1] https://forums.macrumors.com/threads/new-macbook-pro-does-no...
I'll take the discount and use one of my 12 existing USB-C chargers.
Compared to the marginal environmental impact to source materials, build hardware and parts, assemble, ship, stock, and transport to customer each unit, the box could be 10x larger and it wouldn't make a dent.
This is not how shipping works.
A larger box, even by 1 inch in any direction, absolutely makes a huge difference when shipping in manufacturing quantities. Let's not pretend physical volume doesn't exist just to make an argument.
10 planes flying with MacBooks == much different than 1 plane (in other words, when you 10x the size of something, as you suggest, it does actually have a huge impact)
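A toy calculation makes the point concrete. The container volume and box sizes below are made-up round numbers, not Apple's actual packaging dimensions:

```python
# Toy model: how many boxed laptops fit in one shipping container.
# 67 m^3 is roughly a 40-ft container's internal volume; the box
# volumes are assumed round numbers for illustration only.
container_m3 = 67.0
for box_m3 in (0.01, 0.1):  # original box vs. a 10x-larger box
    units = round(container_m3 / box_m3)
    print(f"{box_m3} m^3 box -> {units} units per container")
```

Ten times the box volume means one tenth the units per container, so the same production run needs ten times the shipments.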
A smaller box allows more to be carried. But if we go that route, it's trivial to ship them without any box and box them domestically - and that's a 2-3x volume reduction right there.
Ah yeah I can't imagine any scenario where this could go wrong
Like man in the middle attacks
Replacement/fake products
... or you know, damage? Boxes provide... protection.
> it's trivial
Anytime you catch yourself thinking something is trivial, you're probably trivializing it (i.e., think about it more and you'll probably come up with a dozen more reasons why packaging products is the norm).
USB-C chargers are everywhere now. Monitors with USB-C or Thunderbolt inputs will charge your laptop, too. I bought a monitor that charges over the USB-C cable, and I haven't used the charger that came with the laptop in years because I have a smaller travel charger that I prefer for trips anyway.
You don’t have to buy the premium Apple charger and cable. There are many cheap options.
I already have a box of powerful USB-C chargers I don’t use. I don’t need yet another one to add to the pile.
On the go, I've bought a small GaN charger with multiple ports. At home, I already have all of my desks wired up with a USB-C charger.
Had a similar issue with my 2018 Intel MBP - as the battery aged, the 86/87W Apple charger was the only thing it would come to life with if the device got too low.
In my experience a low-power charger will revive it; you just have to wait for the battery to reach a high enough state of charge (SOC), since the machine effectively starts off the battery. This does take a while, but starting from dead on a supply that can't guarantee enough power would be dumb.
Even a Studio Display, which can provide more power than my M1 Pro can use, won't wake it from this state. Apple wants $300 for a replacement battery so I'll just buy a new MacBook at that price, but the charger situation doesn't bode well for M5 MacBook buyers who wonder why their Mac is dead one day (and they just need the exact charger the system wants, but Apple didn't provide it)
In 2018 I had a phone that entered a boot loop: battery depleted, plug it in, it automatically starts booting, it stops charging while booting, it dies due to depletion, it recognises it’s plugged in and starts charging, boot, stop, die, start, boot, stop, die… I tried every combination of the four or five cables that I had with a car USB power adapter and someone’s power bank, nothing worked. Diverted on my way home (an 8 hour journey) to buy another phone because I needed one the next day. When I got home, I tried two or three power adapters with all the cables and finally found one combination that worked. I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
USB-C 15W chargers may be everywhere, but the higher-power charger required for a MacBook Pro is not.
I would have agreed if the device used 10W or 20W, where you could just charge it slightly slower. Not for a 70W to 100W MacBook Pro though.
I actually have very few USB-C chargers. With everyone leaving them out of the box, I don’t happen to have a bunch of them by chance. They took them out of the box before giving time for people to acquire them organically. I never bought a single lightning cable, but almost all my USB-C cables had to be purchased. This is not great, considering how confusing the USB-C spec is.
Other than the one that came with my M1 MBP (which I will lose when I sell it), I have had to purchase every charger I have.
Not being able to charge a $1,500+ laptop without buying a separate accessory is crazy to me. I’ve also seen many reports over the years comparing Apple chargers to cheap 3rd party ones where there are significant quality differences, to the point of some of the 3rd party ones being dangerous or damaging. I don’t know why Apple would want to open the door to more of that.
I assume a lot of people will use a phone charger, then call support or leave bad reviews, because the laptop is losing battery while plugged in. Most people don’t know what kind of charger they need for their laptop. My sister just ordered a MacBook Air a couple weeks ago and called me to help order, and one of the questions was about the charger, because there were options for different chargers, which confused her and had her questioning if one even came with it or if she had to pick one of the options. This is a bad user experience. She’s not a totally clueless user either. She’s not a big techie, but in offices she used to work with, she was the most knowledgeable and was who they called when the server had issues. She also opened up and did pretty major surgery on her old MacBook Air after watching a couple YouTube videos. So I’d say at least 50% of people know less than her on this stuff.
Apple positions themselves as the premium product in the market and easy to use for the average user. Not including the basics to charge the internal battery is not premium or easy. I can see it leading to reputational damage.
This is especially true for someone moving up to an MBP from an MBA, which takes less juice.
Germany: 1758 USD (1512 EUR) without charger.
US: 1599 USD with 70W charger.
This feels like an insult.
People in other countries will get pissed but ultimately suck it up and buy a product. People in America will take it as a personal offense due to the current Maoist-style cult of personality, and you'll get death threats and videos of them shooting your products posted onto social media. Just look at what happened to that beer company. No such thing would happen in Germany.
Or a certain individual…
Apple could subsidize by absorbing part of the tariff in the U.S. and overcharging in the EU.
That said, in the EU we have a two-year warranty.
Sales tax in the U.S. (there is no VAT) is no more than about 12%.
As far as I know, the US has zero warranty laws. It can be zero days.
https://www.apple.com/legal/warranty/products/embedded-mac-w...
> Under EU rules, if the goods you buy turn out to be faulty or do not look or work as advertised, the seller must repair or replace them at no cost. If this is impossible or the seller cannot do it within a reasonable time and without significant inconvenience to you, you are entitled to a full or partial refund. You always have the right to a minimum 2-year guarantee from the moment you received the goods. However, national rules in your country may give you extra protection.
> The 2-year guarantee period starts as soon as you receive your goods.
> If a defect becomes apparent within 1 year of delivery, you don't have to prove it existed at the time of delivery. It is assumed that it did unless the seller can prove otherwise. In some EU countries, this period of “reversed burden of proof” is 2 years.
When it was announced, I expected it to be at least 4000 AUD (~2600 USD). When I heard it was starting at 1500 USD instead (~2300 AUD), I was astonished and very excited. And it still is that price… but only in the US. In Australia it is 4000 AUD (the 32GB/1TB model, which is 1700 USD, ~2600 AUD). So I sadly didn’t get one.
Is the rest of the world subsidising the US market, or are they just profiteering in the rest of the world?
Prices are about EUR 65 for the 70W adapter (checked in DE + CH).
The EU law states they must provide an SKU without an adapter - i.e. they're still allowed to offer one with a power adapter.
Same, for a laptop??? Really? Wild. You can charge these with USB-C chargers too.
Chargers don’t change quickly. If I lost my charger from 2019, the ideal replacement in 2025 would be literally exactly the same model—and mine still works like new and looks good. I have nothing to gain from buying a new charger.
We should be cheering the EU for ending an abuse that the US has long failed to.
Also, it still bundles a USB-C to MagSafe 3 cable.
If you sell your old laptop when you buy a new one, you generally sell it with the old charger. And different Apple laptops take chargers of different maximum wattages (they're compatible but not optimal), so they're not all the same anyway.
There's a reason they generally make sense to bundle. Especially with laptop chargers, which provide a whole lot more power than some random little USB-C charger you might have. Sometimes letting the free market decide actually gives customers what they want and find most useful.
Sounds like a symptom of incompatibility. I’ve only ever included the charger when it was specific to the laptop.
> And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
Chargers automatically provide whatever power level is needed, up to their max, and charging power isn’t the steady tick upward we’re used to elsewhere. The MacBook Pro did get a faster charger a few years ago, relegating old ones to that “compatible but not optimal” state, but meanwhile MacBook Air chargers got slower, and most releases didn’t change the charger. Certainly there are sometimes benefits to buying a new charger, but it happens much less often than new device purchases, and even when there are benefits purchases should still be the customer’s choice.
> Sometimes letting the free market decide actually gives customers what they want and find most useful.
I agree, but “free market” doesn’t mean lawlessness, it means an actual market that’s actually free. Actual market: companies compete on economics, not e.g. violence or leverage over consumers. Actually free: consumers freely choose between available options. Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
Only when there's no competition and you can use that to abuse market power.
But competition for laptops is strong. I don't know why you don't accept the obvious fact, which is that most consumers want their laptops to come with a charger, even if you personally don't.
Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store.
Apple M5 Chip
Everyone buying their high-end gear now is buying something that's waiting to be refreshed.
Buyers who walk into an Apple store for a base MacBook Pro will wait if they hear a new model is coming out. So if you have a buyer basing purchases on the generation number, it makes sense to launch that model as soon as possible.
Pro/Max buyers are generally checking the specs and getting what they need. Hence the M2 Ultra still being for sale, for some niches with specific requirements.
Looks like the Pro and Max will be on a three month delay.
Most of their buyers aren’t buying the highest end parts. Those are a niche market.
Focusing on the smaller parts first makes sense because they’re easier to validate and ship. The larger parts are more complicated and come next.
The standard practice is to start by producing the chips with the smallest die size.
People in the U.S. are starting to think about their Christmas shopping lists right about now.
Smaller chips means more of a wafer is usable when a defect exists
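A toy Poisson defect-yield model illustrates this; the defect density and die areas below are assumed numbers, not real process data:

```python
import math

# Classic Poisson yield model: fraction of defect-free dies = exp(-D * A),
# where D is defect density (defects per cm^2) and A is die area (cm^2).
defect_density = 0.1                  # assumed defects per cm^2
for area in (1.0, 4.0):               # small die vs. a 4x-larger die
    good = math.exp(-defect_density * area)
    print(f"{area} cm^2 die -> {good:.0%} expected yield")
```

With the same defect density, the larger die loses disproportionately more of the wafer, which is one reason vendors ship the small dies first.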
AMD is somewhat of an exception/unique case though, having chiplet and monolithic designs depending on the use case, plus console/semi-custom offerings, so that doesn't map fully.
Also, let's not forget that in Apple's case they actually go phone first, then the Air + iPad, then Pro, and finally Studio. Personally I feel the lower-end devices should get priority though; efficiency gains are more valuable in portable devices with limited space for batteries than in my 16-incher with its 100 Wh battery.
Of course, it would be nice if we just got the entire range updated at once, but I doubt even Apple could pull off such a supply-chain miracle, even if they bought all of TSMC and the entire island to boot...
Did they announce this or are you speaking for Apple?
This has been their release strategy for past generations.
- m4 -> m5: same core count and distribution, "neural accelerators", higher memory bandwidth
- max storage increased from 2 to 4TB (and probably an extra kidney in price)
Everything else is strictly identical.
The marketing blurb claims 2-3x the performance of the M4 for AI stuff (I assume that's the neural accelerators), 20% for CPU tasks (compiling), and 40-80% for GPU tasks (gaming and 3D rendering).
Not to mention the M4 pro and max released 6 months after the M4. If that holds for M5, it won’t be this year.
Edit: okay, that garnered more attention than I expected, I guess I owe a qualification.
1. Everything is just slightly different. I had to split all my dot files into common/Linux/Mac specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
2. Not everything is supported natively on arm64. I had an idea and wanted to spin up a project using DynamoRIO, but wasn't supported. Others have mentioned the docker quirks.
3. The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd-party window manager you need to disable some security setting, because apparently they work by injecting into the display manager and calling private APIs.
So my personal takeaway was that I took the openness of the Linux ecosystem for granted (I've always had a local checkout of the kernel so I can grep an error message if needed). Losing that felt like wearing a straitjacket. Ironically I have an MBP at work, but spend my day ssh'd into a Linux box. It's a great machine for running a web browser and terminal emulator.
What "permission headaches"?
I know it's possible to script that, since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, the experience is definitely worse than on Linux/Windows.
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
The workarounds on the internet are like "just build the image so that it uses the same uid you use on your host", which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev and eventually just gave up on the dev stack in favour of running Requestly to forward the frontend from prod to his local tooling.
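For what it's worth, the usual workaround on Linux hosts doesn't require rebuilding the image: you run the container as the host user's uid/gid so files created in bind mounts aren't root-owned. A docker-compose sketch (service and image names here are hypothetical, not from the thread):

```yaml
# Run the container as the host user so bind-mounted files stay writable.
# UID/GID are passed in from the host shell, e.g.:
#   UID=$(id -u) GID=$(id -g) docker compose up
services:
  app:
    image: node:20            # hypothetical image
    user: "${UID:-1000}:${GID:-1000}"
    volumes:
      - .:/work
    working_dir: /work
```

This keeps the image generic; the uid is a runtime parameter rather than baked in at build time.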
Things I prefer: Raycast + its plugins compared to the Linux app-search tooling, battery life, performance. Brew vs. the Linux package managers, I don't notice much of a difference.
Things that are basically the same: the dev experience (just a shell and my dotfiles make it essentially the same between OSes).
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO apple really messed up by putting the FN key in the bottom left corner of the keyboard instead of CTRL. Those keys get swapped on every Mac I buy.
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in on keyboard photos when considering laptops for anyone, just in case. I've also had a laptop keyboard with weird arrow keys on the right side, where in practice I'd hit the up arrow instead of the right Shift a lot... it really messed up text input.
It's the same thing when switching from a Nintendo to a Western game where the cancel/confirm buttons on the gamepads are swapped.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
Thanks for the heads-up.
MacOS is great for development. Tons of high profile devs, from Python and ML, to JS, Java, Go, Rust and more use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
Did I get a dud? I rarely get over 2.5
Have you checked your Battery Health?
If you have an Intel-based Mac, the expected battery life is the same as on Windows, and 2.5 hours on an Intel MacBook battery sounds decent for something 5+ years old.
Gaming is another story though, or any other use that puts a lot of stress on the GPU.
Highly recommend doing nix + nix-darwin + home-manager to make this declarative. Easier to futz around with.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
Really useful for debugging though
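For reference, a minimal sketch of what that setup looks like; the hostname, input pins, and example setting below are placeholders, not a tested config:

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
    nix-darwin.url = "github:nix-darwin/nix-darwin";
    home-manager.url = "github:nix-community/home-manager";
  };
  outputs = { self, nix-darwin, home-manager, ... }: {
    # Applied with: darwin-rebuild switch --flake .#my-mac
    darwinConfigurations."my-mac" = nix-darwin.lib.darwinSystem {
      modules = [
        { system.defaults.dock.autohide = true; }  # example declarative macOS setting
        home-manager.darwinModules.home-manager    # dotfiles managed declaratively too
      ];
    };
  };
}
```

The payoff is that macOS settings, packages, and dotfiles all live in one repo you can roll back or replay on a new machine.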
What are the differences, though? I have an MBP and a PC with Fedora on it, and I barely see any differences aside from sandboxing in my atomic Kinoite setup and a different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple doesn't EVER like to show error alerts if it's at all possible to suppress them, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
Sounds more like a you problem, probably due to unfamiliarity. There are endless options for local dev on a Mac, and a huge share of devs using one.
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX (always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since it's so difficult to evaluate objectively, e.g., compared to saying something like Docker doesn't run natively on macOS, which is just an objective fact).
It was a bit of a struggle to get used to it, coming from windows.
The only thing I really miss now is alt-tab working as expected. (It's a massive pain to move between two windows of the same program)
Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
Shame on me.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US that key remains `, but on other layouts it follows the keyboard, which is so much better than on macOS.
That's true, hence why I remap it to a "proper" key above Tab with:
$ cat ~/Scripts/keyboard_remapping.sh
#!/bin/bash
# Remap keys via hidutil, using USB HID usage IDs:
#   0x700000064 = Non-US backslash (the extra ISO key next to left Shift)
#   0x700000035 = grave accent / tilde (the key above Tab)
#   0x7000000E1 = left Shift
# Result: the ISO key acts as grave/tilde, and grave/tilde acts as left Shift.
hidutil property --set '{"UserKeyMapping":
[{"HIDKeyboardModifierMappingSrc":0x700000064,
  "HIDKeyboardModifierMappingDst":0x700000035},
 {"HIDKeyboardModifierMappingSrc":0x700000035,
  "HIDKeyboardModifierMappingDst":0x7000000E1}]
}'
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either, and then I just have to swipe and slide or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I don't have to deal with this, IMO, nonsense; it just feels like the right thing to do when I know I'm just looking for the other Safari window).
I'm not complaining; I knew what I was getting into when I asked $WORK for a Mac. I've had one before, and for me the tradeoff of having a laptop supported by IT and with good battery life is worth it, even if the UX is (again, IMO) somewhat crazy for a guy who comes from a C64 -> Win 3.1 -> Windows 95/98 -> Linux (all of them, and a number of weird desktops) background.
https://karabiner-elements.pqrs.org/
https://ke-complex-modifications.pqrs.org/?q=windows#windows...
But I absolutely hate macOS 26; my next laptop won't be a MacBook.
It's a shame what they did to this awesome hardware with a crappy update
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux, though.
I had been a Linux notebook user for many years and praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command-line usage, while providing zero exclusive apps worth using. It's really sad, since it's unforced and brought upon Linux users by overzealous developers alone.
macOS trounces Linux in absolutely every way on the desktop, it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook hardware is a laughable value compared to even an entry-level Apple MacBook Air. Anecdata, but I've had no fewer than five "pro" notebooks (Dell Latitude, XPS, and Lenovo ThinkPad) come and go in the last five years with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of other crap, so I'm pretty sure I know what I'm talking about.
The one thing the Mac isn't great for is games, and I think SteamOS/Proton/Wine is coming along nicely, and just in time, as Windows is finally turning to the dark side entirely.
Specifically for this, there's Aerospace (https://github.com/nikitabobko/AeroSpace) which does not require disabling SIP, intentionally by the dev.
For using the vanilla macOS workspaces, though: if you avoid full-screen apps (since those go to their own ephemeral workspace that you can't keybind, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind Apple Shortcuts that you've made.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
Also, note that Thunderbolt is not yet supported [2].
[0] https://web.archive.org/web/20241219125418/https://social.tr... [1] https://github.com/AsahiLinux/linux/issues/262 [2] https://asahilinux.org/docs/platform/feature-support/overvie...
Unfortunately I do a lot of C++… I hate the hoops you have to go through to not use the Apple Clang compiler.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But, I'm not doing any local work that is really GPU dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower bandwidth) system RAM versus rekindling my old workstation habit to host a full size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but have worked with it in the past (when I had a current gen Titan X) but also did OpenCL on several vendors.
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
https://asahilinux.org/docs/platform/feature-support/overvie...
Linux is too ugly for me to use as my main device. Same with what I’ve seen of Android.
Unless you're talking about the look of the physical machine. Well then that's an easier fix ;)
I don't really use local LLMs but think 32GB RAM would be good for me... but I am so ready to upgrade but trying to figure out how much longer we need to wait.
From a buyer's perspective, I don't like it at all.
As another example, the current Ultra part is the M3 Ultra, and it was released in early 2025, after even the M4 Pro/Max and a good 18 months after the M3 was unveiled. We might not see an M4 Ultra until 2027.
I got the cheapest M1 Pro (the weird one they sold that's binned due to defects) with 32GB RAM, and everything runs awesome.
Always get the most RAM you can in the Mac world. Running a largish local LLM model is slowish, but it does run.
A Mac out of memory is just a totally different machine than one with enough.
Probably because most of the devs building the software are on the highest RAM possible, and there is just so much testing and optimization they don't do.
We went from 10 hours to 24 hours in 5 years - impressive.
I wonder why they advertise gaming on the laptop - does anyone play anything meaningful on MacBooks?
The M5's GPU specs seem to put it near a high-end NVIDIA card from 2018. Impressive as all get out for a power-friendly chip, but not really what I think of when I hear "good for gaming"
I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles. Most games released in the last year or two don't run well on my 2080 test system at anything approaching decent graphics.
Whether or not the M5 GPU is actually capable of that level of performance or whether the drivers will let it reach its potential is of course a completely different story. GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
A 5060 outperforms a 2080 by roughly 20% on most titles, across the board, not cherry-picking for the best results. They are not about the same.
> you should be able to run at like 1080p High or better
This is disconnected from reality. 1080p low/medium, some games are playable but not enjoyable. Remember, I actually have a 2080, so I'm not just guessing.
> GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
Rich, coming from someone who claims a 7-year-old graphics card is "about the same" as a card that has 2.5x better ray tracing, 3x faster DLSS, faster VRAM, and much better AI capabilities. The 2080 can't even encode/decode AV1...
Sorry you made your first gen chip so good that I don't feel the need to upgrade lol.
I'm still doing fine with a 16gb M1 Air, I mostly VPN+SSH to my home desktop when I need more oomph anyway. It lasts a full day, all week when you just check email on vacation once a day.
Chip, memory and storage are really fast, but I’m fully convinced that the OS is crippling these machines.
M1 MacBooks are ~5 years old at this point, and if you've been working a laptop hard for 5 years it's often worth getting an upgrade for battery life as well as speed.
Then it started having issues waking from sleep. Only the OG Apple charger could wake it up; then it would see it actually had 40-60% battery, but something had gone wrong while sleeping and it thought it was empty.
Intel MacBooks had terrible SMC issues, so maybe this won't afflict the M-series. Just sharing because I could still use that MacBook for a few hours between charges; it just couldn't be trusted to wake up without a charger. That's really inconvenient, and combined with new features it got me to upgrade.
I'm still pretty happy with my 16GB M1 Air, but it would be nice to know some closer-to-real-world differences.
Macs barely got faster for ages with Intel - they just got hotter and shorter on battery life.
20% per year is a doubling every 4y. That is awesome.
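As a sanity check on that arithmetic, a trivial sketch of compound improvement:

```python
# Compound annual improvement: a fixed rate applied for n years.
def cumulative_speedup(rate: float, years: int) -> float:
    return (1 + rate) ** years

# 20% per year for 4 years: 1.2^4 ≈ 2.07, i.e. roughly a doubling.
print(round(cumulative_speedup(0.20, 4), 2))  # → 2.07
```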
I’m confused — they made a comparison that is directly relevant to your situation and you don’t like it?
Most people with an M4 won’t be looking to upgrade to an M5. But for people on an M1 (like you) or even an older Intel chip, this is good information!
I would be happy to sacrifice the EU keyboard and have the AI instead :-)
The SSD has double the speed. I assume they say this only for the M5 MacBook Pro: the base M4 always had slower SSD speeds than the M4 Pro, at 3.5 GB/s, so the M5 should now be around 7 GB/s.
I assume no update on SDXC UHS-III.
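If you want to sanity-check a claimed sequential-write figure yourself, here is a very rough sketch. This is not a real benchmark (tools like AmorphousDiskMark control caching and queue depth; this does not), just a ballpark check:

```python
import os
import time

def rough_write_speed(path: str, size_mb: int = 1024) -> float:
    """Very rough sequential-write estimate in GB/s.

    Writes incompressible data and fsyncs before stopping the clock,
    but makes no attempt to defeat filesystem-level caching tricks.
    """
    chunk = os.urandom(1024 * 1024)  # 1 MiB of incompressible data
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data to disk before timing ends
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / 1024 / elapsed  # MiB written -> GiB/s
```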
Airs don’t have to be just cheap. I want a thin and light premium laptop for walking around and a second Mac (of any type) for my desk.
I have bought cracked-screen iPhones since Personal Hotspot allowed wired connections back in the 2000s, velcro'd them to the back of my MacBook screen and have been living the "I have internet on my Mac everywhere" life since then. With 5G, I can't really tell when I'm on Wi-Fi vs. when my MacBook opts for the hotspot connection.
I'd love a cellular MacBook and would also insta-buy, but I've given up hope until the next network upgrade.
Yes, I mentioned that in the post you responded to.
> Not sure which apps, if any, respect it, but it's there
It reduces my data consumption by about a fifth. Not nothing, but the Mac can easily consume hundreds of GB of data a week doing "normal" activities. YouTube on a MacBook uses many times more data than the equivalent on a phone screen.
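The "hundreds of GB a week" claim is easy to ballpark from video bitrates. The bitrates below are rough assumptions (they vary by codec and content), but the arithmetic is straightforward:

```python
# Back-of-envelope: GB consumed per hour at a given video bitrate.
def gb_per_hour(mbps: float) -> float:
    # Mbit/s -> MB/s (÷8) -> MB/hour (×3600) -> GB (÷1000)
    return mbps / 8 * 3600 / 1000

# Approximate streaming bitrates; actual values vary by codec.
for label, mbps in [("1080p", 5), ("1440p", 10), ("4K", 20)]:
    print(f"{label}: ~{gb_per_hour(mbps):.2f} GB/hour")
```

A few hours of 4K-class streaming a day lands in the hundreds of GB per week, which matches the observation above.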
My craving has been answered by the GPD WIN MAX 2, a 10" mini laptop with lots of ports and bitchin' performance (the AI 9 chip sips battery). It's Windows, but after an upgrade to Pro to disable the annoying stuff via group policy, plus never signing into a Microsoft account, it's amazing how much faster it is than a machine that's always trying to authenticate to the cloud before it does anything. Wake from sleep is also excellent, which was the main thing that kept me using MacBooks. Anyway, it's the first computer I've bought in a decade that has some innovation to it.
Edit: there's a slot for a cellular modem but I haven't done enough research to find one that will play nice on US networks
The only possible issue I can think of is battery life, but if I'm carrying around my laptop I can throw a charge cable in the bag to keep my phone juiced.
We're discussing a MacBook that someday has built-in cellular; the closest I've found is an iOS device wired to my MacBook as a hotspot. It's like having fast Wi-Fi everywhere.
Using my personal phone (that I also use for other things like calls) wouldn't be like having wifi everywhere on my Mac, for example if I walk away from my laptop while on the phone the Mac would lose internet.
(I had just been looking at Macs a few weeks ago and noticed how close in price the MacBook Pro and MacBook Air were for the same specs; I was thinking there's really no reason not to get the Pro, even if all I really want it for is the built-in HDMI. They are now more price-differentiated, if I'm remembering right.)
However, it is not just because of the larger display.
M5 14" starts at:
- 10-core CPU
- 10-core GPU
- 16GB unified memory
- 512GB SSD storage

M5 16" starts at:
- 14-core CPU
- 20-core GPU
- 24GB unified memory
- 512GB SSD storage
So the price difference buys you 4 more CPU cores, 10 more GPU cores (double), and 8GB more memory.
Bad news for anyone who buys the M5 MacBook Pro as an "AI" machine and finds it can't fit any of the more interesting LLMs!
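A rough way to see why, with back-of-envelope numbers. Weights take roughly parameters × bytes per parameter; the ~25% headroom reserved for macOS below is an assumption, and real usage also needs room for the KV cache:

```python
# Rule of thumb: model weights ≈ parameters × bytes per parameter.
def model_gb(params_billions: float, bits_per_param: int) -> float:
    return params_billions * bits_per_param / 8  # billions × bytes = GB

TOTAL_RAM_GB = 16
USABLE = TOTAL_RAM_GB * 0.75  # assume ~25% left for macOS and apps

for params, bits in [(8, 4), (8, 16), (70, 4)]:
    need = model_gb(params, bits)
    verdict = "fits" if need < USABLE else "does not fit"
    print(f"{params}B @ {bits}-bit: ~{need:.0f} GB weights, {verdict} in 16 GB")
```

An 8B model at 4-bit quantization is comfortable; a 70B model is out of reach even at 4-bit, which is what "can't fit the more interesting LLMs" comes down to.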
At this point, I get the soldered-on RAM, for better or worse... I do wish at least storage were more approachable.
Exceptions apply to those running local LLMs.
I know people complain at every release. But I look at the three choices presented and they're all disappointing to me. It's a huge turnoff that the only initial differentiator is a choice between "measly" and "barely acceptable" amounts of RAM and storage.
To get even close to the specs on my Surface Pro I'd have to hit the configurator and spend at least $1000. Even more to hit the config of my work issued HP notebook.
Their sales copy for reference:
"M-series chips include unified memory, which is more efficient than traditional RAM. This single pool of high-performance memory allows apps to efficiently share data between the CPU, GPU, and Neural Engine.... This means you can do more with unified memory than you could with the same amount of traditional RAM."
Still not the fastest RAM, the kind used for dedicated GPUs, but faster than most x86 options.
I’m not trying to be a fanboy, and maybe it’s a little bit “cope”, but Apple has always put as much RAM as is necessary for the computer to work, and not a lot more, in their base models.
:)
You're not silly, you're just able to see reality.
Apple knows who is buying the bulk of their computers, and it isn't power users ... most people buying computers don't have a clue what RAM is even used for.
I'd occasionally hit beachballs, but macOS balances 8GB of RAM fine for regular users, even on Tahoe.