Hey HN,
I’ve been building something called Output, an AI agent that sees your screen, understands context, and controls your computer like a human.
To show what it can do, I recorded myself applying to YC using only my voice. No keyboard. No mouse. Just natural conversation.
The AI read the application form on screen, understood each question, and filled out the answers in real time. It even suggested reaching out to YC founders for referrals, based on what it remembered from earlier sessions.
It feels less like a tool and more like a teammate: it sees, listens, remembers, and takes action, just like a person would.
Output can:
See your screen in real time
Understand what is happening across applications
Control your computer directly (clicks, typing, navigation)
Remember past context and conversations
Help you get things done or do them for you
We are letting early users in at https://theoutput.co/
Would love your feedback, questions, or thoughts — especially if you are exploring agentic interfaces or AI that works across the real desktop environment.
Thanks!