The interesting part is the backend. It's a single JS file (~100 lines) that handles web search, LLM streaming, and per-user conversation history. No vector database, no Redis, no separate storage service.
It runs inside a cell — an isolated environment with a built-in database, search index, and filesystem. The cell handles persistence and streaming natively, so the agent code only has to deal with the actual logic.
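A minimal sketch of that loop, with hypothetical names throughout: `handleMessage`, `webSearch`, and `llmReply` are illustrative stand-ins (the real code would call Tavily and OpenRouter), and an in-memory `Map` approximates the cell's built-in database:

```javascript
// Stand-in for the cell's built-in per-user database.
const histories = new Map();

// Stub for a Tavily-style web search (real code hits the API).
async function webSearch(query) {
  return [{ title: "result", snippet: `stub results for: ${query}` }];
}

// Stub for an OpenRouter chat completion (real code streams tokens).
async function llmReply(messages, context) {
  const last = messages[messages.length - 1].content;
  return `answer to "${last}" using ${context.length} search result(s)`;
}

// One turn: load history, search, ask the LLM, persist both sides.
async function handleMessage(userId, text) {
  const history = histories.get(userId) ?? [];
  history.push({ role: "user", content: text });
  const results = await webSearch(text);
  const reply = await llmReply(history, results);
  history.push({ role: "assistant", content: reply });
  histories.set(userId, history); // the cell would persist this natively
  return reply;
}
```

The point of the design is that everything stateful (the `Map` here) is handled by the cell, so the real file stays close to this size.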
Tech: Next.js frontend, Tavily for search, OpenRouter for the LLM (Gemini 2.5 Flash by default).