3-4 weeks ago I was trying to record a webinar for another side project. Screen recording, full screen presentation, talking through slides — should be simple.
Except I kept forgetting what I wanted to say. And every time I glanced at my notes, it was obvious on camera. Worse — when recording full screen, there's nowhere to put notes without them being captured.
Tried a few teleprompter apps. They either covered half my screen, scrolled at a fixed speed I couldn't match, or showed up in the recording.
So I thought: I've been a full-stack dev for 20+ years. Never touched Swift or SwiftUI. Can I actually build a native Mac app?
Turns out: yes. Took about 4 weeks of evenings and weekends. Claude helped me get past the Swift learning curve when I got stuck, but the architecture and problem-solving were mine.
What it does:
- Sits in the MacBook notch area (right below the camera)
- Scrolls based on voice: speak and it moves, pause and it waits
- Invisible during screen sharing/recording (uses a specific NSWindow level that screen capture APIs ignore; rough sketch below)
- Runs 100% locally, no cloud
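For illustration, here's a minimal AppKit sketch of that kind of overlay window. The `NotchOverlayWindow` name, the `.statusBar` level, and the sizing are my assumptions, not the app's actual code; the key pieces are the window level and `sharingType = .none`, which is AppKit's standard way of excluding a window from screen capture.

```swift
import AppKit

// Sketch of a borderless overlay window that floats near the notch
// and is excluded from screen sharing/recording.
final class NotchOverlayWindow: NSWindow {
    init(contentRect: NSRect) {
        super.init(contentRect: contentRect,
                   styleMask: [.borderless],
                   backing: .buffered,
                   defer: false)

        // Float above normal windows so the prompter stays visible to the presenter.
        level = .statusBar

        // Exclude the window from screen capture: recorders and screen sharing skip it.
        sharingType = .none

        // Keep it on every Space, including alongside full-screen presentations.
        collectionBehavior = [.canJoinAllSpaces, .fullScreenAuxiliary]

        isOpaque = false
        backgroundColor = .clear
        hasShadow = false
    }
}

// Usage sketch: position the window just below the camera/notch on the main screen.
func makeNotchWindow() -> NotchOverlayWindow? {
    guard let screen = NSScreen.main else { return nil }
    let width: CGFloat = 420, height: CGFloat = 80
    let frame = NSRect(x: screen.frame.midX - width / 2,
                       y: screen.frame.maxY - height,
                       width: width, height: height)
    let window = NotchOverlayWindow(contentRect: frame)
    window.orderFrontRegardless()
    return window
}
```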
The technically interesting bits:
- Voice detection works on raw mic input only: it measures audio levels, it doesn't transcribe anything (see the sketch after this list)
- The "invisible during screen share" trick is just the right window level plus excluding the window from capture (the sketch above)
- SwiftUI made the UI surprisingly fast to build coming from React/Node.js
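A rough sketch of how level-based voice detection can work with AVAudioEngine. The `VoiceActivityMonitor` name, the threshold, and the buffer size are illustrative assumptions; the point is that it only computes loudness (RMS) from the mic buffer, with no speech recognition involved.

```swift
import Foundation
import AVFoundation

// Sketch of level-based voice activity detection: no transcription,
// just an RMS power reading from the microphone.
// Requires microphone permission (NSMicrophoneUsageDescription).
final class VoiceActivityMonitor {
    private let engine = AVAudioEngine()
    private let speakingThreshold: Float = 0.02   // tune empirically

    /// Calls the handler with `true` while the mic level suggests speech.
    func start(onActivity: @escaping (Bool) -> Void) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            guard let self = self,
                  let channel = buffer.floatChannelData?[0] else { return }

            // Root-mean-square of the samples approximates loudness.
            let frames = Int(buffer.frameLength)
            var sum: Float = 0
            for i in 0..<frames { sum += channel[i] * channel[i] }
            let rms = (sum / Float(max(frames, 1))).squareRoot()

            DispatchQueue.main.async {
                onActivity(rms > self.speakingThreshold)
            }
        }
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
    }
}
```

In the app, that boolean would presumably drive the scroll timer: advance the script while speech is detected, hold position during silence.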
The biggest surprise: Launched on Product Hunt on Dec 31st. No prep, no audience, no hunter — just posted it myself. Hit #4 for the day, got into their newsletter (1M+ subs), 75 sales so far. Wild.
For a niche utility I built to scratch my own itch, this exceeded every expectation.
Curious if others here have jumped into unfamiliar stacks for side projects. How steep was the learning curve?
Site: notchie.app