I started with a question: what existing infrastructure was built for humans that now needs to work for AI agents? Forms seemed like an obvious answer.
The problem is that AI agents can fill out most of a form, but some fields just need a human: signatures, file uploads, anything subjective. And the form tools out there are either fully automated or fully manual; nothing handles that handoff well.
So FormBridge is basically form infrastructure for agent-to-human handoffs. An agent creates a submission via API, fills what it knows, and gets back a URL. The human opens the link, sees what's already filled (with little badges showing who filled what), finishes the rest, and submits. Then the submission goes through validation, optional approvals, webhooks, etc.
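To make the handoff concrete, the agent side looks roughly like this. It's an illustrative sketch: the endpoint path, payload shape, and response fields are simplified, not the real API.

```ts
// Simplified sketch of the agent-side flow; the endpoint path, payload
// fields, and response shape are illustrative, not the actual API.

type CreateSubmissionResponse = {
  submissionId: string;
  resumeUrl: string; // the link handed to a human to finish the rest
};

async function startHandoff(baseUrl: string, apiKey: string) {
  // 1. The agent creates a submission against a known form and pre-fills
  //    everything it can figure out on its own.
  const res = await fetch(`${baseUrl}/forms/vendor-onboarding/submissions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      values: {
        companyName: "Acme Ltd",
        contactEmail: "ops@acme.example",
        // signature and file-upload fields are left empty: those need the human
      },
    }),
  });
  const { submissionId, resumeUrl } = (await res.json()) as CreateSubmissionResponse;

  // 2. The agent hands the resume URL to a human (email, Slack, a ticket, ...).
  //    Once the human finishes and submits, validation, optional approvals,
  //    and webhooks run on the server side.
  return { submissionId, resumeUrl };
}
```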
A few decisions I'm happy with (quick sketches of each after the list):
Field-level attribution: every field knows which actor (agent, human, system) touched it and when
Rotating resume tokens: the URL token rotates on every state change so stale links just stop working
MCP server baked in: it auto-generates MCP tools from your form definitions, so agents can discover and use forms without extra setup
Schema flexibility: you can throw Zod, JSON Schema, or OpenAPI specs at it and it normalizes everything internally
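To give a feel for the attribution point: every value is stored together with who set it and when, which is also what drives the badges in the UI. A simplified sketch of the shape (not the actual types):

```ts
// Simplified shape for field-level attribution; names are illustrative.
type Actor = "agent" | "human" | "system";

interface FieldValue<T = unknown> {
  value: T;
  filledBy: Actor;  // which actor last touched this field
  filledAt: string; // ISO-8601 timestamp of that change
}

// A submission's data is then a map of field names to attributed values.
type SubmissionValues = Record<string, FieldValue>;

const example: SubmissionValues = {
  companyName: { value: "Acme Ltd", filledBy: "agent", filledAt: "2025-01-15T09:30:00Z" },
  signature: { value: "data:image/png;base64,...", filledBy: "human", filledAt: "2025-01-15T10:02:00Z" },
};
```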
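The rotating resume tokens are simple in principle: the stored token is replaced whenever the submission changes state, so any previously issued link just stops resolving. A minimal sketch, with the states and record shape simplified:

```ts
import { randomBytes } from "node:crypto";

// Minimal sketch of token rotation; states and record shape are simplified.
interface SubmissionRecord {
  id: string;
  state: "draft" | "pending_human" | "submitted" | "approved";
  resumeToken: string;
}

// Every state transition mints a fresh token, invalidating older links.
function transition(
  sub: SubmissionRecord,
  nextState: SubmissionRecord["state"],
): SubmissionRecord {
  return {
    ...sub,
    state: nextState,
    resumeToken: randomBytes(32).toString("base64url"),
  };
}

// A resume link only resolves if it carries the current token.
function resolve(sub: SubmissionRecord, presentedToken: string): SubmissionRecord | null {
  return presentedToken === sub.resumeToken ? sub : null;
}
```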
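For the MCP server, each form definition maps onto an MCP tool whose input schema mirrors the form's fields, so an agent that lists tools sees the forms directly. Roughly (simplified, not the real generator):

```ts
// Rough sketch of turning a form definition into an MCP tool definition;
// the naming scheme and form shape here are simplified.
interface McpToolDefinition {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the tool's arguments
}

interface FormDefinition {
  slug: string;                         // e.g. "vendor-onboarding"
  title: string;
  jsonSchema: Record<string, unknown>;  // the form's normalized field schema
}

function toMcpTool(form: FormDefinition): McpToolDefinition {
  return {
    name: `submit_${form.slug.replace(/-/g, "_")}`,
    description: `Create a submission for the "${form.title}" form and get a handoff URL back.`,
    inputSchema: form.jsonSchema,
  };
}
```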
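And on schema flexibility: the idea is to normalize whatever comes in to a single internal representation. The sketch below uses JSON Schema as that representation and zod-to-json-schema for the Zod case; it's illustrative rather than the actual code, and the OpenAPI path (extracting request-body schemas) is left out.

```ts
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

// Illustrative normalization to one internal format (JSON Schema here);
// the real thing also accepts OpenAPI specs, which this sketch skips.
type SchemaInput =
  | { kind: "zod"; schema: z.ZodTypeAny }
  | { kind: "jsonSchema"; schema: Record<string, unknown> };

function normalize(input: SchemaInput): Record<string, unknown> {
  switch (input.kind) {
    case "zod":
      return zodToJsonSchema(input.schema) as Record<string, unknown>;
    case "jsonSchema":
      return input.schema; // already in the internal format
  }
}

// Both of these end up as the same kind of internal schema object.
const fromZod = normalize({ kind: "zod", schema: z.object({ email: z.string().email() }) });
const fromJson = normalize({
  kind: "jsonSchema",
  schema: { type: "object", properties: { email: { type: "string" } } },
});
```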
Stack is TypeScript, Hono for HTTP, React for the form UI, with swappable storage (memory for dev, SQLite for prod, S3 for files). 1,339 tests, 85.9% coverage.
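The swappable storage sits behind a small adapter interface; a stripped-down version (method names simplified):

```ts
// Stripped-down storage adapter behind the swappable storage; a SQLite-backed
// class would implement the same interface. Names here are simplified.
interface StoredSubmission {
  id: string;
  data: Record<string, unknown>;
}

interface SubmissionStore {
  get(id: string): Promise<StoredSubmission | null>;
  put(record: StoredSubmission): Promise<void>;
  remove(id: string): Promise<void>;
}

// In-memory adapter, the kind of thing you'd use in dev.
class MemoryStore implements SubmissionStore {
  private records = new Map<string, StoredSubmission>();

  async get(id: string): Promise<StoredSubmission | null> {
    return this.records.get(id) ?? null;
  }
  async put(record: StoredSubmission): Promise<void> {
    this.records.set(record.id, record);
  }
  async remove(id: string): Promise<void> {
    this.records.delete(id);
  }
}
```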
Built this solo over about a week. Happy to talk through architecture or the handoff model if anyone's curious.