According to the GLM documentation, at least, you can add an env block to your ~/.claude/settings.json using an API key from the GLM Settings page [0]:
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "your_zai_api_key",
    "ANTHROPIC_BASE_URL": "https://api.z.ai/api/anthropic",
    "API_TIMEOUT_MS": "3000000"
  }
}
While you do need to generate an API key, you are still using the 'GLM Coding Plan', as described in the 'Methods for Using the GLM Coding Plan in Claude Code' documentation.
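As an alternative to editing settings.json, the same variables can be exported in the shell before launching Claude Code. A minimal sketch; "your_zai_api_key" is a placeholder for the key from the GLM Settings page:

```shell
# Point Claude Code at GLM's Anthropic-compatible endpoint.
# Replace "your_zai_api_key" with the key from the GLM Settings page.
export ANTHROPIC_AUTH_TOKEN="your_zai_api_key"
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
export API_TIMEOUT_MS="3000000"   # generous timeout for long generations
```

This only affects the current shell session, which makes it easy to switch back to the default Anthropic endpoint by opening a new terminal.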
If you aren't planning a local LLM strategy, you're surely tying your lifeline to someone else's anchor.
I use Codex more and more, because token usage is a black box, and I think that over the next couple of months we'll see the usual three-tier model evolve: free, normal, luxury.
2027 will be the year of token regulation by administrations worldwide. Until then, take care not to be ripped off at the luxury tier.
xvector•1h ago
They aren't far off - they burn a tiny fraction of OAI's cash while achieving similar ARR - but as they tighten their belts, it's inevitable that companies like OAI will come in and offer more subsidized (unsustainable) inference to get people to switch. They will inevitably do the same "rug pull".
It'll be interesting to see how this plays out.