What Cloi does:
- Contextual error capture: Grabs your stack trace, local files, and environment to understand the issue.
- Local LLM inference: Spins up Ollama on your box and generates targeted fixes—no external servers.
- Safe patch application: Presents you with diffs and only applies changes when you explicitly approve.
- Model-agnostic: Ships with Phi-4 out of the box (surprisingly capable for its size!), but you can swap in any Ollama model you’ve installed.
Why we built it:
- Maintain full control over your code and data—ideal for security-sensitive projects
- Avoid recurring subscription fees and cloud vendor lock-in
- Keep your development flow entirely offline when you need it
Highlights: We hit 202 stars in just 5 days, which tells us we're not the only ones who wanted this! Cloi is plug-and-play (just install and run), and we designed it to be completely unopinionated, meaning you can use whatever Ollama model you want.
Get it now: npm install -g @cloi-ai/cloi
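For reference, setup might look like the following. Only the `npm install` line comes from the post; the `cloi` binary name is an assumption based on the package name, and `ollama pull` is Ollama's standard command for fetching a model (the model name shown is just an example):

```shell
# Install the Cloi CLI globally (command from the post)
npm install -g @cloi-ai/cloi

# Optionally fetch an alternate Ollama model to swap in
# (llama3.1 is an example; any installed Ollama model should work)
ollama pull llama3.1

# Then run Cloi from the project where your error occurred
# (binary name assumed from the package name)
cloi
```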
If you find Cloi useful, we’d really appreciate a star on GitHub. Try it out, let us know what you think, and happy debugging!
— Gabriel Cha & Mingyou Kim
mingyk•15h ago