For the local models, the agent must be able to run completely offline. When using remote models, it must only send network traffic to my servers. No checking for "updates" or pulling any type of dependencies from elsewhere.
When I tried OpenCode, it was a failure.
See: https://github.com/Chetic/opencode-offline
"This is the offline fork of OpenCode. Upstream OpenCode requires internet access for several core functions: the web UI proxies all requests to app.opencode.ai, ripgrep and LSP servers (TypeScript, Python, C++, Rust) are downloaded on first use, and model metadata is fetched from models.dev. In air-gapped or restricted network environments, this means no web interface (HTTP 500 on every request), no file search, no code intelligence, and no model definitions."
The problem is that this fork isn't maintained to the same standard and still uses OpenCode's base.
Thanks for any suggestions.
josefcub•1h ago
https://github.com/charmbracelet/crush
Crush is pretty new, but getting better all the time. It's written in Go, so no node hijinks to get it working. It works fine with my ollama or llama-server localhost endpoints, and I've used it to build a couple of internal projects without any issues.
It does have internal telemetry and such (including updating its list of external models it can use) that can be turned off in the crush.json configuration file.
If you're on a Mac, you can install via Homebrew, or take the more traditional route via GitHub.
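For reference, a local-only setup along the lines described above might look something like this in crush.json. The exact field names (`providers`, `base_url`, `disable_metrics`, etc.) are assumptions based on Crush's OpenAI-compatible provider support, so check the project's docs before copying:

```json
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        { "id": "qwen2.5-coder:7b", "name": "Qwen 2.5 Coder 7B" }
      ]
    }
  },
  "options": {
    "disable_metrics": true
  }
}
```

The idea is that Ollama exposes an OpenAI-compatible endpoint on port 11434, so any client that speaks that API can point `base_url` at localhost and never leave the machine.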
Ms-J•22m ago
Regarding local models, can it use them? I found this discussion:
https://github.com/charmbracelet/crush/discussions/775
I didn't appreciate the maintainer's attitude in converting it into a discussion and ignoring the issue even to this day.
"It does have internal telemetry and such (including updating its list of external models it can use) that can be turned off in the crush.json configuration file."
Is there a page or guide which explains the telemetry and any internet connected settings?
Forgot to add, I use Linux.