Show HN: vLLM Studio – Web UI to manage vLLM/SGLang inference servers at home

https://github.com/0xsero/vllm-studio
1 point • week7820 • 2w ago

Comments

week7820 • 2w ago
Hi HN! I’m sharing vLLM Studio, a lightweight controller + web UI for managing local inference servers (vLLM / SGLang).

I built/found it because running local LLMs often becomes messy once you have multiple models and servers: you need an easy way to launch/evict models, track GPU status, and manage simple presets ("recipes").

vLLM Studio provides:

• model lifecycle (launch/evict)
• GPU/server status + health endpoints (see the sketch below)
• chat UI
• recipes/presets for configs
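For a sense of how the status/health endpoints might be consumed, here is a minimal Python sketch. The controller port (8080) and the /api/health path are illustrative assumptions, not taken from the repo:

```python
import requests  # pip install requests

# Hypothetical controller address: the port and the /api/health path
# are assumptions for illustration; check the repo for real endpoints.
STUDIO_URL = "http://localhost:8080"

def check_health(base_url: str = STUDIO_URL) -> dict:
    """Poll the controller's health endpoint and return its JSON body."""
    resp = requests.get(f"{base_url}/api/health", timeout=5)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(check_health())  # e.g. GPU/server status per managed model
```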

Quick start: docker compose up
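Once a model is launched, the underlying servers speak the OpenAI-compatible API that both vLLM and SGLang expose, so any plain client can talk to them. A sketch assuming vLLM's default port 8000 and a placeholder model name (whether vLLM Studio proxies this endpoint is an assumption; point base_url at wherever your launched server listens):

```python
from openai import OpenAI  # pip install openai

# vLLM and SGLang both serve an OpenAI-compatible API; port 8000 is
# vLLM's default. Adjust base_url if vLLM Studio proxies elsewhere.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="my-local-model",  # placeholder: use the model you launched
    messages=[{"role": "user", "content": "Hello from a local server!"}],
)
print(resp.choices[0].message.content)
```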

Would love feedback from anyone running local LLM setups: what features would you want in a tool like this?