- installs as a single binary (useful on bare servers)
- talks to any LLM endpoint, especially self-hosted open-weight models
- can run headless, and stays composable and Unix-like
SHAI stands for SHell AI! Along the way I couldn’t find a `lite-llm` equivalent for Rust, so I built the crate `shai-llm`. I tried to make it as modular as possible, so new tools can be added, UIs swapped, or an API layered on top. It was also my first major project in Rust, so I hope you'll like it :)
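To give a feel for what that modularity can look like, here is a small illustrative sketch of a trait-based tool registry in Rust. The `Tool` trait, `Echo` tool, and `ToolRegistry` names are hypothetical examples for this post, not the actual `shai-llm` API.

```rust
// Illustrative sketch only: one common Rust pattern for a pluggable tool set,
// not the actual shai-llm API. Each tool lives behind a trait object, so new
// tools can be registered without touching the agent loop.
use std::collections::HashMap;

trait Tool {
    fn name(&self) -> &str;
    fn call(&self, input: &str) -> String;
}

// A trivial example tool.
struct Echo;
impl Tool for Echo {
    fn name(&self) -> &str { "echo" }
    fn call(&self, input: &str) -> String { input.to_string() }
}

// The agent only sees the registry, never concrete tool types.
struct ToolRegistry {
    tools: HashMap<String, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn new() -> Self {
        Self { tools: HashMap::new() }
    }
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name().to_string(), tool);
    }
    fn dispatch(&self, name: &str, input: &str) -> Option<String> {
        self.tools.get(name).map(|t| t.call(input))
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(Echo));
    println!("{:?}", registry.dispatch("echo", "hello"));
}
```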
Key features:
- Open-source and LLM-agnostic: connect to any OpenAI-compatible endpoint (see the sketch after this list)
- Works out of the box using OVHcloud AI Endpoints (unauthenticated, strongly rate-limited)
- Function calling and MCP support (with OAuth)
- Custom agent configuration (model, system prompt, tool set)
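To make the LLM-agnostic point concrete, here is a minimal sketch of the kind of request any OpenAI-compatible endpoint accepts, written in Rust with `reqwest` and `serde_json`. The base URL, model name, and API key below are placeholder assumptions (e.g. a local vLLM or llama.cpp server), not SHAI's defaults or configuration.

```rust
// Minimal sketch: a plain POST to /v1/chat/completions, which is all an
// "OpenAI-compatible endpoint" needs to support. Placeholder values only.
// Assumed Cargo deps: tokio ("full"), reqwest ("json"), serde_json.
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point this at any OpenAI-compatible server, hosted or self-hosted.
    let base_url = "http://localhost:8000/v1";

    let body = json!({
        "model": "my-open-weight-model",
        "messages": [{"role": "user", "content": "List the files modified today"}]
    });

    let resp = reqwest::Client::new()
        .post(format!("{base_url}/chat/completions"))
        .bearer_auth("sk-placeholder") // many self-hosted servers ignore the key
        .json(&body)
        .send()
        .await?
        .json::<serde_json::Value>()
        .await?;

    // Print the assistant's reply from the first choice.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}
```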
SHAI is still evolving, but it's part of Hacktoberfest, so contributions and feedback are most welcome.