Perplexity doesn’t have their own foundation model they just wrap existing models so what good are they? They should buy Mistral instead.
Regardless i dont accept the false constraint the Apple simply must buy a foundation model or whatever that means. Perplexity's "wrapper" does a better job than chatgpt or geminiin its domain.
Imagine an updated Spotlight that would allow the user to enter any query, obtaining information from the internet, enriched with their local context/data.
LLM Siri is an entirely different concern than Apple potentially acquiring Perplexity. I view them as two wildly different initiatives.
I’m pretty sure I could knock up a decent Siri clone with got4o-mini, because I already did a bootleg Alexa to power our smart home stuff after Amazon removed the privacy controls. The only hard bit was the wake word.
Siri is currently so terrible that even something like Mistral 8b could do a decent job. Whey don’t they just run something like that on their own servers instead?
The more interesting question is how they're going to handle the 180 on all the talk about privacy.
But yeah you're not trusting Anthropic but Apple + Amazon
I dunno if thats even a win?
The same thing did happen to Apple Maps, but many people still default to google (though google maps is still significantly better at finding businesses). But Apple was humiliated by the Apple Maps rollout. Siri has just been a slow-burning joke that's only really useful for setting a timer or reminder.
Why? What does Gemini actually do, that users actually use, that requires deep integration into the OS?
Does someone need to send someone an email to realise you don't need a huge frontier model to do basic tool calling like this?
I bet Apple put 16GB on their notebooks as default while grinding their teeth and cursing at the whole $5 of extra cost per unit
Apple's AI strategy has seriously hurt their reputation. I'd love to be a fly on the wall where they discussed a strategy that amounted to "forget using the most basic LLM to understand combinations of commands like, stop all the timers and just keep the one that has about four minutes left... or turn on the lights in x, y, z room and turn off the fans around the house. let's just try to invent a completely new wheel that will get us bogged down in tech hell for years never making any progress"
They could've just improved the thing probably 99% of people use Siri for (Music, Home, Timers, Weather, Sports Scores) without developing any new tech or trying to reinvent any wheel. And in the background, continue to iterate in secret like they do best. Instead they have zero to show for two years since good LLMs have been out.
Even my son suggested things like "I wish your phone had ChatGPT and you could ask it to organize all your apps into folders" – we can all come up with really basic things they could've done so easily, with privacy built in.
Apple has a wonderful set of products without any form of generative AI, and those products continues to exist. Yes there is opportunity to add some fancy natural-language based search / control, but that seems like relatively low hanging fruit compared to the moat they have and defend.
Will adding gen ai natively to apple products have people non-trivially change the way they use iphones or macs? Probably not. So there is literally no rush here.
< do you want to open - the east door - the west door - all the doors
> Siri, open the east door < opening the east door
They kinda really super suck. Siri used to work better than it does today. It's often a bigger chore than opening the app and tapping the button.
These quirks hit me on a daily basis when all I want to do is control my lights and locks
Being able to say "turn on the lights in the living room and turn off the fans in the kids' rooms" – is not a crazy use case.
Instead, I literally have to say:
- Turn on Living Room light
- wait
- turn off <son's name> bedroom fan
- wait
- turn off <daughter's name> bedroom fan
Yes, I could actually say "turn off all the fans" (I believe Siri understands that) but that's usually not what I want.
Another example, you have 3-4 timers going: Let's say I'm cooking and have an oven timer, but also have a timer for my kids to stop their device time. I may have a few going. But being able to say "cancel all the timers except the longest one" is TRIVIAL for a first year programmer to implement. But instead, it's a slog with Siri.
At this point only a handful of apps that are irreplaceable are propping iOS up and that won’t last.
What irreplaceable apps are propping up iOS? What's the data showing that 50% of iPhone users are basically just begging to get off the platform?
They are not though. Absolute control over the platform means Apple has the responsibility to have more vision for the future than anyone else. They do not and will fail to satisfy their users. It will result in either a dramatic change of leadership and strategy and or drive the customers elsewhere.
They will implement something using GPT-4 or Claude and this whole mess will be forgotten.
A better Siri is an expense to keep up the premium brand, not something that they will monetize. For particular uses of AI people will just want particular apps.
I know plenty of folks in their 40s and 50s who have used Siri as their primary way to search the internet for years.
Even my 2.5 year old will ask Alexa and Siri to do things, sometimes far away from any device that could respond.
"Text my wife and say I'll be late." is still too much to ask: it responds with 20 questions about all the parameters.
"turn up the volume" does actually work for the first time, lately. (Bravo, Android).
"open app antenna pod and begin playing" is way out of the question. Suck.
"Turn off all the lights in the house" works, but "turn off all the lights" does not. What?!??
- It regularly displays an accurate transcription with exactly the same text that usually works and then sits there, apparently indefinitely, doing nothing.
- Sometimes it’s very slow to react. This seems to be separate from the above “takes literally forever” issue.
- Siri is apparently incapable of doing a lot of things that ought to work. For example, for years, trying to use Siri on a watch to place a call on Bluetooth (while the phone is right there) would have nonsensical effects.
These won’t be fixed with a better model. They will be fixed with a better architecture. OpenAI and Anthopic can’t provide that except insofar as they might inspire Apple to wire useful functionality up to something like MCP to allow the model to do useful things.
> Even my son suggested things like "I wish your phone had ChatGPT and you could ask it to organize all your apps into folders" – we can all come up with really basic things they could've done so easily, with privacy built in.
I’m not convinced the industry knows how to expose uncontrolled data like one’s folders to an LLM without gaping exploit opportunities. Apple won’t exploit you deliberately, as that’s not their MO, but they are not immune to letting things resembling instructions that are in one of your folders exploit you.
Felt like the most obvious "CEO says we need to do this, doesn't matter if it isn't ready" kinda thing. Straight up checking a box for parity with Samsung et al.
/s
Really wish this would be optional, but you know it won't be.
Apple is being far too conservative with a far too fast a developing piece of technology to possibly keep up unless they loosen up. But they're being run by a bunch of 50+ year old white guys trying to still be cool, but not understanding what's really going on. I'm not saying they need to publish a roadmap or anything, but they need to tell their marketing dept. to piss off and that not everything needs to be a "delightful surprise" on stage.
Apple has never been the company that does it first. They're the company that does it right. Arguably, their fuckup with LLMs was rushing a garbage product to market instead of waiting and watching and perfecting in the background.
> Apple is being far too conservative with a far too fast a developing piece of technology
Strongly disagree. OpenAI and Anthropic are blowing billions on speculative attempts at advancing the frontier. They're advancing, but at great cost and uncertainty in respect of future returns.
The smart move would be to recapitulate the deal with Google, possibly being paid by these cash-burning companies for the default AI slot on the iPhone, all the while watching what Apple's users do. Then, when the technology stabilises and the best models are known, Sherlocking the whole thing.
1. Siri - not the first assistant, absolute garbage.
2. Apple Maps (original) - utter garbage at launch, slightly better today in US.
3. Vision Pro - Not the first VR headset. Massive failure.
If anything, Apple has been tremendously successful few times when they were not first (phones, tablets, silicon ..) but they have also been tremendously faltered when they were not first.
The third bullet is soft because ALL vr headsets have been flops.
All told you are actually painting a pretty solid picture of apples track record. They’ve launched so many things in the past 20 years and expanded into new markets (wearables, headphones, streaming hardware and services) that it’s impressive there aren’t more flops
So I don't think this a likely explanation. Maybe they just wanted to have an in-house solution but realized they have no chance at delivering that value on their own. But it can't be about UX predictability because Siri has none unless you're setting a timer.
My wishlist:
Let me talk to AI about anything on my screen. Hey AI why did this guy email me? Hey AI what’s this webpage about? Etc
AI designs the UI on the fly depending on the task I’m doing. No more specific apps? just a fluid interface for whatever I need.
Leave AI in listening or video mode and ask about my environment or have a conversation.
Then we can interact with multiple apps all via Siri and have them work together. To me that's a huge win.
So for instance if I wanted information on public transit options in London I'd tap the search bar in Safari, tap the mic icon, and say "Public transit options in London" and that used to work pretty much all the time. It would even work if I had a loud TV on or loud music on, and it was great about realizing when I'd stopped speaking and automatically starting the search.
Lately it has tended to cut off early, so I only get "Public transit options" entered. I can get it to work if I try again and say each word very loud and with a distinct short gap between the words.
My understanding is that modern dictation systems make heavy use of deep learning so I'd expect it shares from underlying technology with Siri. I wonder if there is a problem with that underlying technology?
bigyabai•4h ago