> This means the pianist doesn't have to turn pages and, more importantly, lets them see the music and their hands at the same time, which is impossible with traditional sheet music.
I could definitely see this being beneficial for beginners. When I lived in a dormitory during uni I often played familiar pieces from memory pretty late on a digital piano (with headphones) in extremely dim lighting so as not to disturb my roommate.
At some point I just stopped having to look down at the keyboard. I play a lot of stride piano as well, and that probably conditioned me to have a sort of musical proprioception for the instrument. And of course, there are numerous examples of unbelievable blind pianists - Stevie Wonder, Ray Charles, Art Tatum, etc.
When I start to think too much about what my fingers are doing, I play worse. If I want to practise a particular part where I keep getting the fingering wrong, sure, but when you play it for real, looking is counterproductive.
Something like this could be great for beginners though. But similar to automatic guitar tuners, I'm not sure you should get into the habit of relying on this technology being around.
The "looking at your fingers" challenge then becomes that you start to play "by eye" instead of "by ear" (or "by feel") which I find is very hard to overcome. Especially when you are improvising.
Though in a sense "by sheet" is just as bad.
Uh, why? Lots of pros use sheet music, especially for complex pieces. I’ve never heard of an orchestra conductor insisting everyone be off-book.
It’s one thing to memorize pop songs or whatever, but nobody is out there shaming people for not memorizing Rachmaninov
Maybe the next step is an app for people who don't read sheet music: it would light up the keys on the keyboard that you need to press when you look at it...?
Same for guitar: highlight where to put your fingers on the frets for each chord.
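The guitar version is basically a lookup from chord name to fretboard positions. A minimal sketch (the `CHORD_SHAPES` table and `highlight_positions` helper are made up for illustration; a real app would render these over the camera view):

```python
# Fret per string, low E to high e; None = muted string, 0 = open.
CHORD_SHAPES = {
    "C": [None, 3, 2, 0, 1, 0],
    "G": [3, 2, 0, 0, 0, 3],
    "D": [None, None, 0, 2, 3, 2],
}

def highlight_positions(chord):
    """Return (string_index, fret) pairs where a finger goes.

    Open and muted strings need no highlight, so they're skipped.
    """
    shape = CHORD_SHAPES[chord]
    return [(s, f) for s, f in enumerate(shape) if f]
```

For a C major chord this yields three finger positions; the glasses would just need to project dots at those fretboard coordinates.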
That's why an app on glasses for this is better than a complete alternative instrument; the app is much cheaper and should work with any real instrument.
(Next step: evaluate afterwards and point out mistakes)
From the end of the video:
> I had the bars of music auto-sending at a preset interval. The pedals, instead of flipping bars, temporarily pause the flipping or speed it up, in case I'm desynced from the glasses.
That's how teleprompter apps work. Of course the difference is that when speaking you can pause a little if you get desynced, while with music you're like "on a train" and if you pause, it shows. But having an interval is not shocking.
Maybe the problem with this is that sheet music spacing typically isn't constant -- a bar with many short notes takes up more horizontal space on the page (a larger bitmap) than one with a few long notes.
So maybe an approach is to send a fixed number of bars, regardless of their actual size, so that the interval can have a constant relation with the tempo of the piece?
> My dream smart glasses would just listen to the performance and automatically flip bars
Couldn't the phone do that? The phone is already the part doing most of the work.
great video editing, OP. loved the playthru at the end with the text. you have real talent here, keep going
I am looking for hackable smart glasses with a camera that don't rely on any proprietary service to work. Mentra seems to have a camera version, but this video seems to suggest you need to use their service all the time?
Would be interesting to dive a bit deeper into where this 3s latency comes from. I assume the bitmaps have been pregenerated, so I guess it's just the turnaround time when accessing the AugmentOS servers?
A lot of digital pianos have MIDI out (there was a MIDI recording tool posted here months ago by another HN member). I wonder if you could use that MIDI signal to keep what you see in sync with your playing and drive page turns? You could even add a karaoke-like highlight to show the note being played.
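The core of that idea is a simple score follower: match incoming MIDI note-on pitches against the expected notes and flip the page when the cursor crosses a page boundary. A minimal sketch, assuming you already get pitches from the piano (e.g. via a library like mido; all names here are hypothetical):

```python
def advance_cursor(expected_pitches, cursor, played_pitch):
    """Advance the score cursor when the played MIDI pitch matches the next expected note.

    Wrong or extra notes simply leave the cursor in place.
    """
    if cursor < len(expected_pitches) and expected_pitches[cursor] == played_pitch:
        cursor += 1
    return cursor

def page_for_cursor(cursor, notes_per_page):
    """Which page of the score the cursor is currently on (0-based)."""
    return cursor // notes_per_page
```

A real version would need to handle chords, ornaments, and repeats, but even this naive matcher would keep the display roughly in sync without any preset interval.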
I was surprised about using dilation. I would have expected music21 to support rendering at a given resolution/dpi setting directly, avoiding rescaling the images, but from the music21 documentation it's not obvious how to do this. Rendering music nicely (pixel perfect) for a low-dpi screen could circumvent some of the hardware limitations in the mid term.
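For anyone unfamiliar, dilation here just fattens thin features (like staff lines) so they survive downscaling to a low-res display. A toy 3x3 binary dilation in pure Python, purely illustrative (a real pipeline would use OpenCV or Pillow on actual bitmaps):

```python
def dilate(bitmap):
    """3x3 binary dilation: a pixel turns on if it or any 8-neighbour is on."""
    h, w = len(bitmap), len(bitmap[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and bitmap[ny][nx]:
                        out[y][x] = 1
    return out
```

A one-pixel-wide staff line becomes three pixels wide after one pass, which is why it reads better after the bitmap is shrunk for the glasses' display.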
There was a lot of hype around VR, but for the last 10 years I've been following progress on AR glasses.
The thing about AR is that it has the ability to enhance everything in your daily life, versus VR which is meant to be a separate experience.
Both Meta and Samsung are due to put out consumer AR glasses later this year and I think this might be the first wave of useful, daily-wear glasses we'll see.
Is there anyone who works in the AR space that could comment more?
I've been building smart glasses for over 7 years. The first 6 years were in academia because the tech wasn't ready yet: the glasses were too heavy and the battery didn't last long enough.
But in the last year, smart glasses have become lightweight enough, with enough battery, to be worn all day (see Even Realities G1, Vuzix Z100, etc.).
I believe smart glasses are having their iPhone moment in 2025 + 2026.
We make the smart glasses OS that Kevin used in the video to make this smart glasses app: AugmentOS.org
Let me connect via Bluetooth directly to the glasses with anything, and just tx/rx over a serial port and some low-level protocol to get pixels/text on the screen.
This is also the only way I'd be able to buy a pair and feel safe that they can't be bricked in 2 years when some company shuts its servers down and ends support.
This is an inferior means of development, however.
By going through AugmentOS, you get a much easier development experience, compatibility with multiple pairs of glasses (through both iOS and Android), and the ability to run multiple glasses apps simultaneously.
I noticed the iron ring on your pinkie -- Canadian engineer?
Awesome job Kevin!