I built this because existing K8s GUIs felt heavy (500 MB+ of RAM) for what they do. Kubeli stays under 150 MB idle.
Tech: Tauri 2.0 with Rust backend, Next.js frontend. Uses kube-rs for direct K8s API access.
Interesting bits: real-time resource watching via the K8s watch API, an optional AI debugging assistant that runs locally through Claude Code or the OpenAI Codex CLI, and an MCP server for querying clusters from your editor.
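
For the real-time part, a kube-rs watch loop looks roughly like the sketch below. This is a minimal illustration, not Kubeli's actual code: it assumes a recent kube-rs with the "runtime" feature, a local kubeconfig, and watches Pods in the current namespace.

    // Assumed Cargo deps: kube (features = ["runtime"]), k8s-openapi, tokio, futures, anyhow
    use futures::TryStreamExt;
    use k8s_openapi::api::core::v1::Pod;
    use kube::runtime::{watcher, WatchStreamExt};
    use kube::{Api, Client, ResourceExt};

    #[tokio::main]
    async fn main() -> anyhow::Result<()> {
        // Same connection path as kubectl: reads the local kubeconfig.
        let client = Client::try_default().await?;
        let pods: Api<Pod> = Api::default_namespaced(client);

        // watcher() wraps the K8s watch API and re-establishes the watch on
        // disconnects; applied_objects() flattens events into changed Pods.
        let stream = watcher(pods, watcher::Config::default()).applied_objects();
        futures::pin_mut!(stream);

        while let Some(pod) = stream.try_next().await? {
            println!("pod changed: {}", pod.name_any());
        }
        Ok(())
    }

The watch API pushes incremental changes instead of repeatedly listing resources, which is what keeps idle CPU and memory low.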
Currently macOS only. Looking for feedback, especially on the AI integration - useful or gimmick?