(So I guess you can have either a list of too many or a list of too few.)
p.s. If anyone notices really cool work that would be even better if the creator did a Show HN, please let us know at hn@ycombinator.com.
> This allows the pianist to not have to turn pages, and more importantly, allows them to see the music and their hands at the same time, which is an unavoidable problem with traditional sheet music.
I could definitely see this being beneficial for beginners. When I lived in a dormitory during uni, I often played familiar pieces from memory late at night on a digital piano (with headphones) in extremely dim lighting so as not to disturb my roommate.
At some point I just stopped having to look down at the keyboard. I play a lot of stride piano as well, and that probably conditioned me to have a sort of musical proprioception for the instrument. And of course, there are numerous examples of unbelievable blind pianists - Stevie Wonder, Ray Charles, Art Tatum, etc.
When I start to think too much about what my fingers are doing, I play worse. If I want to practise a particular part where I keep getting the fingering wrong, sure, but when you play it for real, looking is counterproductive.
Something like this could be great for beginners, though. But similar to automatic guitar tuners, I'm not sure you should get into the habit of having this technology around.
The "looking at your fingers" challenge then becomes that you start to play "by eye" instead of "by ear" (or "by feel"), which I find very hard to overcome. Especially when you are improvising.
Though in a sense "by sheet" is just as bad.
Uh, why? Lots of pros use sheet music, especially for complex pieces. I’ve never heard of an orchestra conductor insisting everyone be off-book.
It’s one thing to memorize pop songs or whatever, but nobody is out there shaming people for not memorizing Rachmaninov.
The orchestra setting has the extra requirement that the sheet is a tool for communication. "The figure in bar 83" is not something you have an intuitive understanding of, but that kind of reference is needed to communicate in an orchestra. The soloist, though, often plays by heart, at least during performances, so as not to get distracted or develop tunnel vision.
Look at the audience, out of the window, into nothingness — or even close your eyes. As long as you're there.
Maybe the next step is an app for people who don't read sheet music: it would light up the keys on the keyboard that you need to press when you look at it...?
Same for guitar: highlight where to put your fingers on the fretboard for each chord.
That's why an app on glasses for this is better than a complete alternative instrument; the app is much cheaper and should work with any real instrument.
(Next step: evaluate afterwards and point out mistakes)
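A first cut at that evaluation step could just be a sequence diff of played pitches against expected ones. A toy sketch (both note lists are hypothetical, and properly aligning a real performance against a score is much harder than this):

    # Diff the played MIDI pitches against the score's expected pitches
    # and report the mismatching runs.
    from difflib import SequenceMatcher

    expected = [60, 62, 64, 65, 67]   # from the score (hypothetical)
    played   = [60, 62, 63, 65, 67]   # from the performance (hypothetical)

    sm = SequenceMatcher(a=expected, b=played)
    for op, i1, i2, j1, j2 in sm.get_opcodes():
        if op != 'equal':
            print(f"{op}: expected {expected[i1:i2]}, got {played[j1:j2]}")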
From the end of the video:
> I had the bars of music auto-sending at a preset interval. The pedals, instead of flipping bars, temporarily pause the flipping or speed it up, in case I'm desynced from the glasses.
That's how teleprompter apps work. Of course the difference is that when speaking you can pause a little if you get desynced, while with music you're "on a train": if you pause, it shows. But having an interval is not shocking.
Maybe the problem with this is that sheet music resolution typically isn't constant: a run of short notes takes up more space on the page (a larger bitmap) than a few long notes.
So maybe an approach is to send a fixed number of bars, regardless of their rendered size, so that the interval has a constant relation to the tempo of the piece?
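A sketch of that fixed-bars-per-frame idea with music21 (assuming MuseScore is configured in music21's environment for PNG output; the filenames are placeholders):

    from music21 import converter

    BARS_PER_FRAME = 2

    score = converter.parse('piece.musicxml')
    total = len(score.parts[0].getElementsByClass('Measure'))

    # Each frame covers the same musical duration (given a constant meter
    # and tempo), so frame k can be shown at k * BARS_PER_FRAME * bar_seconds.
    for start in range(1, total + 1, BARS_PER_FRAME):
        chunk = score.measures(start, min(start + BARS_PER_FRAME - 1, total))
        chunk.write('musicxml.png', fp=f'frame_{start:03d}.png')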
> My dream smart glasses would just listen to the performance and automatically flip bars
Couldn't the phone do that? The phone is already the part doing most of the work.
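Real score following is a research topic of its own, but as a toy sketch of the phone-side plumbing (using the sounddevice package; the energy threshold and the 4/4 meter here are pure assumptions):

    # Count note onsets from the mic with a crude energy threshold and
    # advance the displayed bar when a bar's worth of beats has gone by.
    import numpy as np
    import sounddevice as sd

    SR = 22050
    BEATS_PER_BAR = 4
    THRESH = 0.02      # crude onset threshold (assumption, needs tuning)
    state = {'energy': 0.0, 'beats': 0, 'bar': 0}

    def callback(indata, frames, time_info, status):
        rms = float(np.sqrt(np.mean(indata ** 2)))
        # A rising edge past the threshold counts as one onset/beat.
        if rms > THRESH and state['energy'] <= THRESH:
            state['beats'] += 1
            if state['beats'] % BEATS_PER_BAR == 0:
                state['bar'] += 1
                print(f"flip to bar {state['bar']}")  # send to glasses here

        state['energy'] = rms

    with sd.InputStream(channels=1, samplerate=SR, callback=callback):
        sd.sleep(60_000)  # listen for a minute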
great video editing, OP. loved the playthru at the end with the text. you have real talent here, keep going
got any more of these effects you'd put in that category?
I am looking for hackable smart glasses with a camera that don't rely on any proprietary service to work. Mentra seems to have a camera version, but this video seems to suggest we need to use their service all the time?
Mentra Live runs AugmentOS, so you can control all the I/O (camera, speakers, mic) in your own app with the AugmentOS SDK.
Regarding using the backend service all the time: most apps/developers connect through the official AugmentOS.org servers and focus on building their apps, but you can self-host your own backend if you want.
Would be interesting to dig a bit deeper into where this 3s latency comes from. I assume the bitmaps are pregenerated, so I guess it's just the turnaround time when accessing the AugmentOS servers?
For text, everything is fast - the AugmentOS servers introduce <350ms round trip latency in most places/countries.
In next-generation glasses, this will be a lot faster because of better bitmap handling, BMP encoding schemes, and pre-loading of BMPs.
A lot of digital pianos have MIDI out (there was a MIDI recording tool posted here months ago by another HN member). I wonder if you could use that MIDI signal to keep what you see in sync with your playing and drive the page turns? You could even add a karaoke-like highlight to show the note being played.
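A sketch of that MIDI-driven flipping with mido (the cumulative note counts would be precomputed from the score; robustly matching against wrong or extra notes is the hard part this glosses over):

    # Count live note_on events from the piano's MIDI out and flip when
    # the count reaches the next bar's expected cumulative total.
    import mido

    # Hypothetical: cumulative note_on count at the end of each bar,
    # derived offline from the score's MIDI/MusicXML.
    notes_per_bar = [8, 15, 24, 30]

    played = 0
    bar = 0
    with mido.open_input() as port:   # first available MIDI input device
        for msg in port:
            if msg.type == 'note_on' and msg.velocity > 0:
                played += 1
                while bar < len(notes_per_bar) and played >= notes_per_bar[bar]:
                    bar += 1
                    print(f"flip to bar {bar}")  # replace with glasses call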
I was surprised about the use of dilation. I would have expected music21 to support rendering at a given resolution/DPI directly, avoiding rescaling the images, but from the music21 documentation it's not obvious how to do that. Rendering music nicely (pixel-perfect) for a low-DPI screen could circumvent some of the hardware limitations in the mid term.
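One possible workaround, assuming MuseScore 3+ is installed (the binary name varies by platform/version), is to export MusicXML from music21 and call MuseScore's CLI with an explicit raster DPI instead of rescaling afterwards:

    import subprocess
    from music21 import converter

    score = converter.parse('piece.musicxml')
    chunk = score.measures(1, 2)
    chunk.write('musicxml', fp='chunk.musicxml')

    # MuseScore's -r flag sets the PNG raster resolution in DPI; pick it
    # to match the glasses' display so no rescaling/dilation is needed.
    subprocess.run(['mscore', '-o', 'chunk.png', '-r', '72', 'chunk.musicxml'],
                   check=True)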
There was a lot of hype around VR, but for the last 10 years I've been following progress on AR glasses.
The thing about AR is that it can enhance everything in your daily life, versus VR, which is meant to be a separate experience.
Both Meta and Samsung are due to put out consumer AR glasses later this year and I think this might be the first wave of useful, daily-wear glasses we'll see.
Is there anyone who works in the AR space that could comment more?
I've been building smart glasses for over 7 years. The first 6 were in academia because the tech wasn't ready yet: the glasses were too heavy and the battery didn't last long enough.
But in the last year, smart glasses have become light enough, with enough battery, to be worn all day (see Even Realities G1, Vuzix Z100, etc.).
I believe smart glasses are having their iPhone moment in 2025 + 2026.
We make AugmentOS, the smart glasses OS that Kevin used in the video to build this smart glasses app: AugmentOS.org
If there’s context (eg I go to another department at work, or see my child’s friend with their parent) I can recall their name more easily. But out of context, that barrier can be difficult to surmount.
I’m curious if you also find yourself having trouble remembering other names in conversation (eg what was that politician called, what’s that city name, it begins with an F…)
Usually it’s proper nouns that I have trouble recalling. It’s almost like I need an Anki to refresh my mental DRAM and keep things recallable.
When I was a kid I did a round of testing for ADHD, and one of the tests was category fluency. Basically they gave me a category (e.g. breakfast foods) and told me to say as many words in that category as I could in 60 seconds. I failed miserably; the administrator told me it was essentially as bad as people with severe cognitive deficits. I just haven't worried about it since then! (Except with names!!)
Facial recognition, with name, where I know you from, and last time I saw you.
Basically I want the same notes my dental hygienist or optometrist uses to make light conversation with me during a checkup.
You could also perhaps tie in a car's knowledge of adjacent vehicles, which is something I've wanted for ages. Since some newer models have some awareness of the speed of, distance to, and relative location of the cars around you, you could eg overlay speed/acceleration info onto adjacent cars, so you'd know whether you need to pass or speed up. Seems at least possible, since the glasses know your head's orientation, something missing from any existing windscreen-style HUD system.
Let me connect directly to the glasses over Bluetooth with anything and just tx/rx over a serial port and some low-level protocol to get pixels/text on the screen.
That's also the only way I'd feel safe buying a pair, knowing it can't be bricked in 2 years when some company shuts their servers down and ends support.
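For illustration, such a direct link could look like a Nordic-UART-style BLE serial channel, sketched here with the bleak package (the address, UUID, and framing are all hypothetical; real glasses would define their own protocol):

    import asyncio
    from bleak import BleakClient

    ADDRESS = 'AA:BB:CC:DD:EE:FF'                     # glasses' BLE address
    RX_CHAR = '6e400002-b5a3-f393-e0a9-e50e24dcca9e'  # NUS-style RX (hypothetical)

    async def show_text(text: str) -> None:
        async with BleakClient(ADDRESS) as client:
            # Toy framing: 0x01 = "draw text" opcode, then UTF-8 payload.
            payload = bytes([0x01]) + text.encode('utf-8')
            await client.write_gatt_char(RX_CHAR, payload)

    asyncio.run(show_text('bar 83'))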
This is an inferior means of development, however.
By going through AugmentOS, you get a much easier development experience, compatibility with multiple pairs of glasses (through both iOS and Android), and the ability to run multiple glasses apps simultaneously.
I noticed the iron ring on your pinkie -- Canadian engineer?
Awesome job Kevin!