ollama launch claude --model gemma4:26b

Using Ollama's API doesn't have the same issue, so I've stuck with Ollama for local development work.
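For context, hitting the local API directly looks roughly like this. This is a minimal sketch against Ollama's `/api/generate` endpoint on its default port (11434); the model tag is just whatever you've pulled locally, and the helper names here are mine, not part of any library.

```python
# Minimal sketch: calling the local Ollama server's HTTP API directly,
# assuming the default endpoint http://localhost:11434/api/generate.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Since it's plain HTTP with JSON, there's no agent harness in between to hit context-window quirks; you control exactly what goes into the prompt.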
And even if you manage to free up a big enough VRAM playground, the open-weights models aren't as good at wrangling such large context windows (even Opus struggles); they tend to lose track of what they were doing before they finish parsing the context.
And is running a local model with Claude Code actually usable for practical work, compared to the hosted Anthropic models?
vbtechguy•3h ago
canyon289•1h ago