But current dictation tools don’t capture any of that. I’ve tried using them with Cursor and Claude Code, and they miss all the rich interactions that make pair programming work.
Talking to an AI agent should feel the same as pairing with a teammate. That’s why we built a Voice Prompting tool: it captures your on-screen actions and stitches them into the prompt at the right moment.
So when you say: “Refactor this class [selecting the class in your IDE] to…”

The agent receives: “Refactor [MicManager @114 MicManager.swift]…”
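Conceptually, the stitching is just timestamp alignment: the transcript and your on-screen actions each carry timestamps, and the prompt is their chronological merge. Here’s a minimal Swift sketch of that idea; the `SpokenWord`, `SelectionEvent`, and `stitchPrompt` names are hypothetical, not our actual implementation:

```swift
import Foundation

// Illustrative types only: a dictated word and an IDE selection,
// each stamped with when it happened during the recording.
struct SpokenWord {
    let text: String
    let time: TimeInterval // seconds since recording started
}

struct SelectionEvent {
    let symbol: String // e.g. "MicManager"
    let file: String   // e.g. "MicManager.swift"
    let line: Int      // e.g. 114
    let time: TimeInterval
}

// Merge speech and selections into one prompt, ordered by when each happened.
func stitchPrompt(words: [SpokenWord], selections: [SelectionEvent]) -> String {
    // Tag every fragment with its timestamp and pre-render it as prompt text.
    var fragments: [(time: TimeInterval, text: String)] = []
    fragments += words.map { (time: $0.time, text: $0.text) }
    fragments += selections.map {
        (time: $0.time, text: "[\($0.symbol) @\($0.line) \($0.file)]")
    }

    // Chronological merge: each selection lands exactly where it
    // occurred in the stream of speech.
    return fragments
        .sorted { $0.time < $1.time }
        .map { $0.text }
        .joined(separator: " ")
}

// "Refactor" is spoken, the class is clicked mid-sentence, speech continues.
let prompt = stitchPrompt(
    words: [
        SpokenWord(text: "Refactor", time: 0.2),
        SpokenWord(text: "to…", time: 1.4),
    ],
    selections: [
        SelectionEvent(symbol: "MicManager", file: "MicManager.swift",
                       line: 114, time: 0.9),
    ]
)
print(prompt) // Refactor [MicManager @114 MicManager.swift] to…
```

Real dictation adds wrinkles the sketch ignores (word-level timestamps can arrive late, a selection can straddle a word), but the core idea is the same: your click becomes a precise code reference, placed at the moment you made it.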
It’s early days, but if you like the direction, give it a try and let us know what you think!