Credentials are persisted via bind mounts, the workspace stays clean, and everything is easy to remove when you're done experimenting. It's aimed at developers who are curious about AI coding assistance but cautious about system-wide installations.
The container maintains all the functionality of Claude Code while keeping your host system pristine. Just docker run, authenticate once, and start coding with AI help. Remove the container later and there are no traces left behind.
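For illustration only, here is a minimal sketch of that workflow. The image name, mount paths, and credentials directory are placeholder assumptions, not the project's actual values; the README has the real commands.

    # Placeholders: "claude-code-container" image, ~/.claude-sandbox credentials
    # directory, /home/node/.claude inside the container. Credentials live in a
    # bind-mounted host directory, and only the current project is exposed.
    docker run -it \
        -v "$HOME/.claude-sandbox":/home/node/.claude \
        -v "$(pwd)":/workspace \
        -w /workspace \
        claude-code-container

    # Removing the container and the credentials directory leaves no trace:
    docker rm <container-id>
    rm -rf "$HOME/.claude-sandbox"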
The README on GitHub has examples for easy integration into existing projects. I welcome suggestions and feedback!
roscas•15h ago
But if it is "a completely isolated environment", why does it need to log in and get a token? That defeats the isolation.
This should work like any other model, the way we do it with Ollama: download a model and it runs strictly locally, with no network connections or tokens.
nezhar•14h ago
The isolation here refers to the workspace. Since you run the CLI in a container, the process can only access what you have mapped inside. This is helpful if you want to avoid issues like this: https://hackaday.com/2025/07/23/vibe-coding-goes-wrong-as-ai...
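A quick way to see that boundary, sketched with a generic alpine image rather than the project's own (purely illustrative):

    # Only the bind-mounted directory is visible inside the container; the rest
    # of the host filesystem simply is not there.
    docker run --rm -v "$(pwd)":/workspace -w /workspace alpine ls /workspace
    docker run --rm -v "$(pwd)":/workspace alpine ls /home   # empty: host home is not mounted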
roscas•14h ago
I prefer local models. Everything I use or have used with the local model could just as well go to an online one, since there are no secrets involved. The speed is more than acceptable on a low-end CPU+GPU.
I still use Perplexity sometimes for more complex questions.
rreinold927•6h ago