(Genuinely curious, perhaps there are third-party apps I can use to bridge the gap.)
It's also hard to find a better laptop for running an LLM locally.
ChatGPT being the number one app is a weird way for people to express they don't trust AI: https://apps.apple.com/us/charts/iphone
They’ve been putting AI in a lot of places over the years.
The expectation is that Apple will eventually launch a revolutionary new product, service or feature based around AI. This is the company that envisioned the Knowledge Navigator in the 80s after all. The story is simply that it hasn't happened yet. That doesn't make it a non-story, simply an obvious one.
I feel like he’d be obsessively working to combine AI, robotics, and battery technology into the classic sci-fi android.
Instead, modern Apple seems to be innovating essentially nothing unless you count the VR thing and the rumors of an Apple car, which sounds to me much like the Apple Newton.
* Apps are already logged in, so no extra friction to grant access.
* Apps mostly use Apple-developed UI frameworks, so Apple could turn them into AI-readable representations, instead of raw pixels. In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.
* iPhones already have specialized hardware for AI acceleration.
I want to be able to tell my phone to (a) summarize my finances across all the apps I have, (b) give me a list of new articles on a certain topic from my magazine/news apps, and (c) combine internet search with on-device files to generate personal reports. (A sketch of what (a) could hook into is below.)
All this is possible, but Apple doesn't care to do it. The path not taken is invisible, and no one will criticize them for squandering this opportunity. That's a more subtle drawback of having only two phone operating systems.
Edit: And add strong controls to limit what it can and cannot access, especially for the creepy stuff.
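For (a) at least, some of the plumbing already exists: App Intents lets an app expose actions and data that Siri/Shortcuts can invoke. A minimal sketch, assuming a hypothetical finance app (the intent name and the hard-coded summary are made up):

    import AppIntents

    // Hypothetical intent a finance app could expose so a system-level
    // assistant can ask for a balance summary instead of scraping the UI.
    struct SummarizeBalancesIntent: AppIntent {
        static var title: LocalizedStringResource = "Summarize Balances"

        func perform() async throws -> some IntentResult & ReturnsValue<String> {
            // A real app would read its own local data store here.
            let summary = "Checking: $1,240.12, Savings: $8,902.55"
            return .result(value: summary)
        }
    }

The missing piece isn't the per-app hooks, it's an assistant that can chain them across apps.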
Apps already have such an accessibility tree; it's used for VoiceOver and you can use it to write UI unit tests. (If you haven't tested your own app with VoiceOver, you should.)
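You can poke at that same tree from a UI test. A minimal sketch, assuming an XCTest UI-test target (the "Settings" and "Notifications" labels are placeholders for whatever your app actually exposes):

    import XCTest

    final class AccessibilityTreeTests: XCTestCase {
        func testKeyElementsAreExposed() {
            let app = XCUIApplication()
            app.launch()

            // These queries walk the same accessibility tree VoiceOver reads.
            XCTAssertTrue(app.buttons["Settings"].waitForExistence(timeout: 5))
            app.buttons["Settings"].tap()
            XCTAssertTrue(app.staticTexts["Notifications"].exists)
        }
    }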
This really is the problem. Why do I spend hundreds of dollars more for specialized hardware that’s better than last year’s specialized hardware if all the AI features are going to be an API call to ChatGPT? I am pretty sure I don’t need all of that hardware to watch YouTube videos or scroll Instagram/the web, which is what 95% of users do.
A big issue to solve is battery life. Right now there's already a lot that goes on at night while the user sleeps with their phone plugged in. This helps to preserve battery life because you can run intensive tasks while hooked up to a power source.
If apps are doing a lot of AI stuff in the course of regular interaction, that could drain the battery fairly quickly.
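For the deferred-to-night part, iOS already exposes exactly this knob: a BGProcessingTask can ask to run only while the device is on external power. A rough sketch with a made-up task identifier (it would also need to be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist):

    import BackgroundTasks

    let nightlyTaskID = "com.example.app.nightly-indexing"  // hypothetical

    // Call early in app launch, e.g. from the app delegate.
    func registerNightlyWork() {
        BGTaskScheduler.shared.register(forTaskWithIdentifier: nightlyTaskID, using: nil) { task in
            // Heavy on-device work (indexing, model runs, etc.) goes here.
            task.setTaskCompleted(success: true)
        }
    }

    func scheduleNightlyWork() {
        let request = BGProcessingTaskRequest(identifier: nightlyTaskID)
        request.requiresExternalPower = true      // only run while charging
        request.requiresNetworkConnectivity = false
        try? BGTaskScheduler.shared.submit(request)
    }

The open question is the foreground case: an assistant that runs during normal interaction doesn't get to wait for the charger.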
Amazingly, I think the memory footprint of the phones will also need to get quite a bit larger to really support the big use cases and workflows. (I do feel somewhat crazy that it is already possible to purchase an iPhone with 1TB of storage and 8GB of RAM.)
nathan_douglas•4h ago
...except for the motion-activated lighting in our foyer and laundry room. $15, 15 minutes to install, no additional charges, no external services, no security issues, and just works year after year with 100% reliability.
scyzoryk_xyz•3h ago
I want to reach for my tools when I want to use them.
bayindirh•3h ago
They have all been done locally on your device for the last decade, at least.
derefr•3h ago
Remember the term "smart" as applied to any device or software mode that made ~any assumptions beyond "stay on while trigger is held"? "AI" is the new "smart." Even expert systems, decision trees, and fulltext search are "AI" now.
mort96•3h ago
Not really, I'm taking the hint. If they call a feature "AI", there's a 99% chance it's empty hype. If they call a feature "machine learning", there may be something useful in there.
Notice how Apple, even in this event, uses the term "machine learning" for some features (like some of their image processing stuff) and "AI" for other features. Their usage of the terms more or less matches my dividing line between features I want and features I don't want.
derefr•1h ago
But that's not true of any other actor in the market. Everyone else, especially venture-backed companies trying to get or retain investor interest, is still trying to find a justification for calling every single thing they're selling "AI".
(And it wasn't even true of Apple themselves as recently as six months ago. They were approaching their marketing this way too, right up until their whole "AI" team crashed and burned.)
Apple-of-H2-2025 is literally the only company your heuristic will actually spit out any useful information for. For everything else, you'll just end up with 100% false positives.
choilive•3h ago
All machine learning is AI, not all AI is machine learning.
nerdjon•3h ago
Of course this is going to be spun and turned into a negative, but I basically want ML to be invisible again: the benefits clear, the underlying tech no longer mattering.
akomtu•54m ago