Cursor -> Claude Code -> OpenCode -> OpenCode + OpenCode-Manager -> Codex + Tmux + Tailscale -> OpenCode Serve + Tailscale.
Key takeaways from the journey:
The Provider Lock-in Trap: While Claude Code is powerful, the inability to switch easily between Azure OpenAI, Anthropic, and OpenAI became a dealbreaker for "multi-cloud" local development.
The Memory Wall: opencode consumes a lot of memory, and I haven't found a good way to optimize it (https://github.com/anomalyco/opencode/issues/12687). I didn't spend much time on this, since I was mostly focused on my own projects, but a quick attempt to profile the process and hunt for leaks got me nowhere (github.com/dzianisv/opencode).
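Before reaching for a full profiler, the cheapest leak check is just watching whether the heap keeps climbing while the agent is idle. A minimal sketch of what I mean (illustrative, not opencode's actual code; `process.memoryUsage()` works in both Node and Bun):

```typescript
// Minimal heap-growth logger: sample memory periodically and watch
// whether heapUsed climbs steadily across idle samples — the classic
// leak signature worth chasing with a real heap profiler.
const mb = (n: number): string => (n / 1024 / 1024).toFixed(1);

function logMemory(): string {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  const line = `rss=${mb(rss)}MB heapUsed=${mb(heapUsed)}MB heapTotal=${mb(heapTotal)}MB`;
  console.log(line);
  return line;
}

// Sample every 10 seconds.
const timer = setInterval(logMemory, 10_000);
timer.unref?.(); // don't keep the process alive just for the logger
```

Note the caveat: `heapUsed` only covers the JS heap; a large gap between `rss` and `heapTotal` points at native/external allocations instead, which a JS heap snapshot won't show.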
The "Open Workspace" Shift: Moving to a browser-based IDE via opencode serve solved my remote-access and Git worktree needs more effectively than my custom Tmux + Tailscale hacks. Still, codex plus my patches for GitHub Copilot support (github.com/dzianisv/codex) remains more reliable, because each codex agent stays under ~100 MB of RAM.
Question for HN: For those building with coding agents, are you sticking with high-level CLI tools like Claude Code, or moving toward open-source layers to avoid provider lock-in? If so, which tools do you use? And does it make sense for me to keep digging into opencode's memory leaks, or is there no future for that with TypeScript/Bun? I'm no expert in Bun/Node VM internals, but I suspect the heap could still be capped somehow, like with --max-old-space-size=512. Or is the issue more fundamental?
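For context on that last question: --max-old-space-size is a V8 flag, so it applies to Node but not to Bun, which runs on JavaScriptCore (Bun has its own --smol flag for reduced memory, if I'm not mistaken). A quick sketch, assuming a V8-based runtime, of checking what ceiling is actually enforced (node:v8 is V8-specific, so under Bun these numbers may be stubs):

```typescript
// Report the heap ceiling the runtime will actually enforce.
// Under `node --max-old-space-size=512 app.ts` this should read
// roughly 512 MB plus some overhead; without the flag it defaults
// to a few GB depending on available system memory.
import { getHeapStatistics } from "node:v8";

const limitMb = getHeapStatistics().heap_size_limit / 1024 / 1024;
console.log(`heap_size_limit ~ ${limitMb.toFixed(0)} MB`);
```

The catch is that capping the heap doesn't fix a leak: if the agent genuinely retains references, it just hits the limit and dies with an OOM instead of growing. So the cap is a containment tool, not an answer to whether the leak is fixable.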