Getting displays that are bright, colourful, reliable, artifact-free, and low-power enough to fit in glasses is not a solved problem.
Meta spent billions (literally) before deciding to buy up Magic Leap's waveguide capacity. Even then, the results are pretty shit. (It's more nuanced than that, but it's not in-house displays being used in the upcoming AR glasses. The "Orion" ones require silicon carbide lenses, which are ludicrously expensive and difficult to make.)
The state of the art is OK, but not good enough for Apple (you'd hope): it still exhibits a whole bunch of rainbow sparkle, and it's dim.
Next, what you want to do has a huge impact on how much battery you need.
The Meta Ray-Bans have something like 1.3 watt-hours of battery (https://moorinsightsstrategy.com/research-notes/ray-ban-meta... claims it's actually ~0.5 watt-hours), which means that if you want them to do something useful, your power budget is three tenths of bugger all.
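To make that concrete, here's a quick runtime sketch. The 1.3 Wh and ~0.5 Wh figures are from above; the 100 mW average draw and the helper name are just illustrative assumptions.

```python
# Rough runtime arithmetic for a glasses-sized battery.
# Battery capacities are from the comment; the 100 mW draw is an assumed
# "doing something useful" average, not a measured figure.

def runtime_hours(battery_wh: float, draw_mw: float) -> float:
    """Hours of runtime at a constant average power draw."""
    return battery_wh / (draw_mw / 1000.0)

print(runtime_hours(1.3, 100))  # ~13 hours at a modest 100 mW average
print(runtime_hours(0.5, 100))  # ~5 hours on the smaller capacity estimate
```

A phone SoC idles at hundreds of milliwatts and peaks at watts, which is why "three tenths of bugger all" is about right once you budget for displays, radios, and sensors on top.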
For AR glasses to be useful, they need to gather context about what you're asking them to do. Apple has the advantage of being able to offload to the iPhone. But if you want precise location, i.e. which room am I in, what object do I have in my hand, you need cameras.
But turning on a camera costs something like ~60 mW. Then you need to process those images. Don't think about uploading them to your phone over Wi-Fi; that'll eat >200 mW.
So you're stuck with Bluetooth Low Energy. Even then, streaming images to your phone is going to eat battery. That means you need to bake your algorithms into silicon.
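A back-of-envelope sketch of why BLE streaming doesn't scale, assuming roughly 1 Mbit/s of usable application throughput, ~15 mW of radio draw while transmitting, and ~50 KB per compressed VGA frame (all three numbers are my assumptions, not from the comment):

```python
# Why streaming camera frames over BLE hits a wall quickly.
# All constants below are assumed ballpark figures for illustration.

BLE_THROUGHPUT_BPS = 1_000_000  # usable app-level throughput, assumed
RADIO_DRAW_MW = 15.0            # radio power while actively transmitting, assumed
FRAME_BYTES = 50_000            # one compressed VGA frame, assumed

def airtime_s(frame_bytes: int) -> float:
    """Seconds the radio is busy sending one frame."""
    return frame_bytes * 8 / BLE_THROUGHPUT_BPS

def avg_radio_mw(fps: float) -> float:
    """Average radio power when streaming at `fps` frames per second."""
    duty_cycle = min(1.0, airtime_s(FRAME_BYTES) * fps)
    return duty_cycle * RADIO_DRAW_MW

print(airtime_s(FRAME_BYTES))  # ~0.4 s of airtime per frame
print(avg_radio_mw(1))         # average radio draw at 1 fps
```

At 0.4 s of airtime per frame, the link saturates around 2.5 fps; nowhere near enough for live vision, which is the argument for doing the processing in dedicated silicon on the glasses themselves.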
But.
Imagine you're trying to make immersive AR glasses. For that to work, you need to know what objects you're looking at and what direction your head is pointing, then draw your overlay to account for that head/object position, at 60 Hz or more, with a power budget of ~50–80 mW.
That shit is hard. Really, really hard.
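To see just how hard, divide the budget by the frame rate. The figures are the ones above; the function is just the arithmetic spelled out.

```python
# Energy available per frame: a power budget in mW divided by a frame
# rate in Hz gives millijoules per frame (mW / Hz = mJ).

def energy_per_frame_mj(budget_mw: float, hz: float = 60.0) -> float:
    """Millijoules available per frame at a given frame rate."""
    return budget_mw / hz

print(energy_per_frame_mj(50))  # under 1 mJ per frame at the low end
print(energy_per_frame_mj(80))  # ~1.3 mJ per frame at the high end
```

Roughly a millijoule per frame has to cover object recognition, head-pose tracking, and rendering the overlay; tasks a phone SoC spends whole watts on.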
duxup•6h ago
I was listening to a Steve Jobs interview where he told a story about how they were working on a tablet, but once they saw what it could do, they shelved it and went to work on the iPhone instead, because they thought the technology was more relevant there.
That's to say, who knows what happens with this research.
weberer•6h ago
https://archive.is/qLqaF
They're developing two different models of glasses with cameras, completely ignoring the creepiness issues that made everyone hate Google Glass. If you're hoping for simple glasses with a HUD for notifications, time, etc., then don't hold your breath for the Apple glasses.