Sekha gives your LLM a permanent memory:

- Unlimited conversation history with semantic search
- Works with any model (Claude, GPT, Llama, local)
- Self-hosted, your data stays local
- Built with Rust + SQLite + embeddings

AGPL-3.0.
GitHub: [github.com/sekha-ai/sekha-controller] | Docs: [docs.sekha.dev] | Site: [sekha.dev] | Proof: https://imgur.com/a/Dgti8cO
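The core pattern the post describes (conversation turns stored in SQLite alongside embedding vectors, retrieved later by semantic similarity) can be sketched in a few lines. This is an illustrative toy, not Sekha's actual schema or API: it uses a bag-of-words stand-in where a real deployment would call an embedding model (e.g. via Ollama), and every name here is made up for the example.

```python
import json
import math
import sqlite3

# Toy embedding: unit-normalized bag-of-words vector as a sparse dict.
# A real system would call an embedding model; this stand-in only
# illustrates the store/retrieve flow.
def embed(text: str) -> dict[str, float]:
    counts: dict[str, float] = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0.0) + 1.0
    norm = math.sqrt(sum(v * v for v in counts.values())) or 1.0
    return {w: v / norm for w, v in counts.items()}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    # Both vectors are unit-normalized, so the dot product is the cosine.
    return sum(v * b.get(w, 0.0) for w, v in a.items())

# Hypothetical schema for the example, not Sekha's actual tables.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE memory (id INTEGER PRIMARY KEY, role TEXT, "
    "content TEXT, embedding TEXT)"
)

def remember(role: str, content: str) -> None:
    conn.execute(
        "INSERT INTO memory (role, content, embedding) VALUES (?, ?, ?)",
        (role, content, json.dumps(embed(content))),
    )

def recall(query: str, k: int = 3) -> list[tuple[str, str]]:
    q = embed(query)
    rows = conn.execute("SELECT role, content, embedding FROM memory").fetchall()
    rows.sort(key=lambda r: cosine(q, json.loads(r[2])), reverse=True)
    return [(role, content) for role, content, _ in rows[:k]]

# Days-old facts survive because they live in the database,
# not the model's context window.
remember("user", "my favorite color is teal")
remember("user", "I deployed the service on Friday")
top = recall("what color does the user like", k=1)
```

A production system would swap `embed()` for a real embedding model and replace the full-table scan with an indexed nearest-neighbor lookup, but the retrieval contract stays the same: store turns with vectors, pull back the most relevant ones at prompt time.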
sekha-ai•1h ago
The screenshot shows a fresh conversation where the AI remembers my favorite color from days ago—something that shouldn't be possible with a standard 8K context window.
It's self-hosted by default (Ollama + SQLite) but works with any LLM via LiteLLM/OpenRouter. Rust core, Python bridge, MCP tools for Claude Code, SDKs for JS/Python.
Happy to answer questions. Docs have architecture diagrams and Docker setup.