The local / "runs entirely on my machine" claim should probably come with an asterisk: the TUI part is local, but this still relies on an LLM API existing somewhere outside the machine (unless you're running an Ollama instance on the same host).
Nonetheless, this is neat!
Claude Code and other TUIs (except Codex) use a layer of abstraction over the raw terminal escape sequences.
I directly used `crossterm`, which gave me more control and lower latency.
For example, if nothing is happening, I don't render anything in the terminal at all; otherwise I only render on keypress.
leonardcser•1h ago
Claude Code is great, but I find it a bit laggy sometimes. I built a local alternative that's significantly more responsive with local models. Just wanted to share :)