Got back a super fast, thoughtful, and complete "nope" within hours. My hubris kicked in: "How hard can it be? I've done this before!"
Spoiler alert: Harder than I thought. But now I have LiteChat running at https://litechat.dev and honestly? I think it turned out pretty damn good!
*What makes it different:*

- *100% client-side* – your data never leaves your browser. Bring your own API keys and you're set (see the sketch after this list)
- *Multi-model everything* – OpenRouter for 300+ models, plus OpenAI, Claude, Gemini, local Ollama, whatever
- *Workflow automation* – chain AI interactions, because that's what started this whole thing
- *Power user stuff* – conversation folders, prompt library, search, keyboard shortcuts, the works
- *Code blocks that don't suck* – edit them in-browser, run JS right in chat, export individual files or ZIP everything
- *Git integration in the browser* – yes, really. Clone, commit, push from your chat app (sketch below too)
- *Virtual file system* – full CRUD file operations, all browser-based
- *MCP server support* – hook into external tools and services
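For anyone wondering what "100% client-side, bring your own API keys" means in practice: the browser talks to the provider directly, key and all, with no server of mine in the middle. A minimal sketch of the general shape (not LiteChat's actual code; it just uses OpenRouter's OpenAI-compatible endpoint as an example):

```typescript
// Browser-only chat completion: the user's API key stays in their browser
// (e.g. localStorage) and the request goes straight to the provider.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

async function chatCompletion(
  apiKey: string,       // user-supplied key, never sent to any app backend
  model: string,        // e.g. 'openai/gpt-4o-mini' on OpenRouter
  messages: ChatMessage[],
): Promise<string> {
  const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  });
  if (!res.ok) throw new Error(`Provider returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Same story for the git integration and the virtual file system – it's all doable in the browser these days. Here's a sketch of the standard approach, isomorphic-git on top of an IndexedDB-backed filesystem (an illustration with made-up store names, not necessarily how LiteChat wires it up):

```typescript
// Browser-side git clone: isomorphic-git + LightningFS (IndexedDB-backed fs).
// A CORS proxy is usually needed because most git hosts don't send CORS
// headers for the smart HTTP protocol.
import git from 'isomorphic-git';
import http from 'isomorphic-git/http/web';
import LightningFS from '@isomorphic-git/lightning-fs';

const fs = new LightningFS('litechat-vfs'); // hypothetical store name

export async function cloneIntoBrowser(url: string, dir = '/repo') {
  await git.clone({
    fs,
    http,
    dir,
    url,
    corsProxy: 'https://cors.isomorphic-git.org', // public demo proxy
    singleBranch: true,
    depth: 1,
  });
  // The working tree is now readable/writable through the same fs API,
  // which is basically what a browser "virtual file system" boils down to.
  return fs.promises.readdir(dir);
}
```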
Started out trying to "vibecode" this with https://t3.chat only (several arm joints thank me for eventually caving and using proper tools). Went full function-over-form, so the UI has some... let's call them "debatable choices." Blame Gemini for that.
The repo is at https://github.com/DimitriGilbert/LiteChat/tree/beyond (the beyond branch has all the latest stuff) and you can try it immediately at https://litechat.dev – no signup, no tracking, just bring your API keys.
If you're tired of vendor lock-in and want a chat app that actually does what you want, maybe give it a shot?
Would love feedback on what's missing or what could be better. Flying solo on this, so every star/issue/PR genuinely makes my day.