It’s built with Next.js, TypeScript, and Tailwind, and uses Tambo’s withInteractable() to let the AI control UI components directly: opening apps, updating data, or even generating new UI elements through chat.
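To make the idea concrete, here is a minimal plain-TypeScript sketch of the pattern behind withInteractable(). Note the real Tambo API is a React higher-order component with a different signature; the registry, the function signature, and the Notepad example below are all illustrative assumptions, not Tambo's actual implementation.

```typescript
// Conceptual sketch only: Tambo's real withInteractable() wraps a React
// component. This version just models the core idea — registering a
// component so the AI runtime can push prop updates to it by name.
type PropsUpdater = (props: Record<string, unknown>) => void;

const interactableRegistry = new Map<string, PropsUpdater>();

// Hypothetical signature; the real API takes a component plus config.
function withInteractable(name: string, onPropsUpdate: PropsUpdater): void {
  interactableRegistry.set(name, onPropsUpdate);
}

// The AI-driven runtime can then target any registered component:
function applyAiUpdate(name: string, props: Record<string, unknown>): boolean {
  const updater = interactableRegistry.get(name);
  if (!updater) return false;
  updater(props);
  return true;
}

// Example: a "Notepad" component whose text the AI can overwrite.
let notepadText = "";
withInteractable("Notepad", (props) => {
  if (typeof props.text === "string") notepadText = props.text;
});

applyAiUpdate("Notepad", { text: "Meeting notes" });
console.log(notepadText); // "Meeting notes"
```

The useful property is that the chat layer never touches component internals; it only knows names and prop shapes, which is what lets one AI loop drive every app in the OS.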
Each app is a registered component, and actions are exposed as tools in tambo.ts. For example, Tambo can:

- open the Notepad or Calendar
- add events or notes
- fetch map directions
- edit images via AI prompts
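The actions above can be sketched as a tool list like the one tambo.ts would hold. This is a hedged, self-contained approximation: the tool shape, names like addCalendarEvent and openApp, and the callTool dispatcher are assumptions for illustration, not Tambo's actual registration API.

```typescript
// Conceptual sketch of "actions exposed as tools": each tool pairs a
// description (what the model reads) with a handler (what actually runs).
type Tool = {
  name: string;
  description: string;
  run: (args: Record<string, unknown>) => string;
};

const events: string[] = []; // stand-in for Calendar app state

const tools: Tool[] = [
  {
    name: "addCalendarEvent",
    description: "Add an event to the Calendar app",
    run: (args) => {
      const title = String(args.title ?? "Untitled");
      events.push(title);
      return `Added event: ${title}`;
    },
  },
  {
    name: "openApp",
    description: "Open an app by name (e.g. Notepad, Calendar)",
    run: (args) => `Opened ${String(args.app)}`,
  },
];

// The model picks a tool by name and supplies arguments:
function callTool(name: string, args: Record<string, unknown>): string {
  const tool = tools.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.run(args);
}

console.log(callTool("addCalendarEvent", { title: "Demo day" })); // "Added event: Demo day"
```

Because each handler returns a string result, the model gets feedback it can narrate back to the user in the same chat turn.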
Each tool in the OS exposes specific actions (like editing an image or adding an event), while each component defines the visual interface the AI can manipulate. Together they create a conversational layer over the entire UI: Tambo doesn’t just respond, it operates the interface.
Everything runs client-side, and you can switch between apps using a floating dock - like a tiny OS inside your browser tab.
GitHub: https://github.com/akdeepankar/tambo-os
Try: https://tambo-os.vercel.app
Docs: https://docs.tambo.co/