Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...
Pardon the AI crap, but:
> ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound). Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks
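The detection pipeline the quote alludes to is essentially amplitude detection on a muscle signal. A minimal sketch of the idea, using synthetic data rather than real EMG (the function names, window size, and 3× threshold here are all illustrative assumptions, not anything from the research it cites):

```python
import math
import random

def rms_envelope(samples, window=50):
    """Moving root-mean-square envelope of an EMG-like signal."""
    env = []
    for i in range(len(samples) - window + 1):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(x * x for x in chunk) / window))
    return env

def detect_subvocal(samples, rest_rms, k=3.0, window=50):
    """Flag windows whose RMS exceeds k times the resting baseline.
    k is an arbitrary illustrative threshold."""
    return [rms > k * rest_rms for rms in rms_envelope(samples, window)]

random.seed(0)
# Synthetic stand-ins: low-amplitude noise at rest, a higher-amplitude
# burst standing in for subvocal muscle activity.
rest = [random.gauss(0, 1.0) for _ in range(500)]
signal = rest[:200] + [random.gauss(0, 8.0) for _ in range(100)] + rest[300:]

baseline = sum(rms_envelope(rest)) / len(rms_envelope(rest))
flags = detect_subvocal(signal, baseline)
print("burst detected:", any(flags))
```

Real systems would band-pass filter the signal and feed features to a classifier rather than a fixed threshold, but the "tiny electrical signals" part is just this: the envelope rises above baseline even though nothing is audible.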
So, how long until my Airpods can read my mind?
Twice, well done!
you mean something that improves the detection and transcription of voices when the person doesn't realize the mic is on, like when it's in our pocket?
Granted, they are slowly but surely killing it, but it’s still going quite strong.
This is an interesting acquisition given their rumored Echo Show / Nest Hub competitor (1). Maybe this is part of their (albeit flawed and delayed) attempt to revitalize the Siri brand under the Apple Intelligence marketing. When you have to say exactly the right words to Siri, or else she adds “Meeting at 10” as an all-day calendar event, people get frustrated, and for non-technical users the illusion of a “digital assistant” is lost. If that is Apple’s model of how customers perceive Siri, then maybe their thinking is that giving Siri more non-verbal, personable capability could be a differentiating factor in the smart hub market, along with the LLM rebuild. I could also see this tying into some sort of strategy for the Vision Pro.
Now, whether this hypothetical differentiating factor is worth $2 billion, I’m not so sure, but I guess time will tell.
https://www.macrumors.com/2025/11/05/apple-smart-home-hub-20...
tobmlt•1h ago
I wish the iPhone had word prediction and autocorrect that was from the previous century
thewebguyd•33m ago
Crazy that he had pretty much perfected the tech of typing out text on a smartphone, then decided to throw it all away by moving to all-screen devices. A virtual keyboard with no tactile feel will never compare until screens can recreate the tactile bumps of a physical keyboard.