We're applying the open-source label a bit too broadly for what's actually being delivered, I find...
> Local backend server with full API
> Local model integration (vLLM, Ollama, LM Studio, etc.)
> Complete isolation from cloud services
> Zero external dependencies
Seems open source/open weight to me. They additionally offer a cloud-hosted version.
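For context, a minimal sketch (not from the thread) of what that local model integration usually amounts to: Ollama and vLLM both expose an OpenAI-compatible HTTP endpoint, so a backend only needs a local base URL to run with zero cloud dependency. The port, model name, and key below are illustrative assumptions.

```python
# Sketch: talking to a locally served model over an OpenAI-compatible endpoint.
# Ollama's default endpoint is http://localhost:11434/v1; vLLM serves a similar
# API on its own port. Model name and key here are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama endpoint (assumed default)
    api_key="not-needed-locally",          # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama3",  # whatever model is pulled/served locally
    messages=[{"role": "user", "content": "Hello from a fully local setup"}],
)
print(response.choices[0].message.content)
```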