curl -o docker-compose.yml https://raw.githubusercontent.com/torrix-ai/install/main/doc...
docker compose up
No external dependencies. All data stays in a local SQLite file on your machine.
It logs LLM calls through an HTTP proxy or a Python/Node SDK: tokens, cost, latency, full prompt and response traces, and reasoning-token capture. Works with OpenAI, Anthropic, Gemini, Groq, Mistral, Azure OpenAI, and any OpenAI-compatible endpoint.
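Since the proxy speaks the OpenAI wire format, most clients can point at it with no code changes by overriding the base URL. A sketch of what that looks like (the host and port here are assumptions on my part; check your docker-compose config for the actual values):

```shell
# Route OpenAI SDK traffic through the local logging proxy
# (localhost:8080 is a hypothetical port, not necessarily the default)
export OPENAI_BASE_URL="http://localhost:8080/v1"
# Your real provider key still goes through; the proxy forwards and logs
export OPENAI_API_KEY="sk-..."
```

The same base-URL trick is why any OpenAI-compatible endpoint works without a dedicated integration.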
Things I added as I actually used it on real agent pipelines: cost forecasting and hard budget caps, PII masking, model routing rules, evals with golden runs, an AI judge, a prompt library with version history, run tags for filtering by environment, an MCP server so AI assistants can query your own logs, and OTLP/HTTP ingestion for apps already using OpenTelemetry.
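To give a flavor of what PII masking means here, a minimal regex-based sketch. This is my own illustration of the idea, not Torrix's actual implementation, and the patterns are deliberately rough:

```python
import re

# Rough patterns for illustration; real masking needs far more
# (names, addresses, locale-aware phone/ID formats, etc.).
# Order matters: SSNs must be caught before the looser phone pattern.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def mask_pii(text: str) -> str:
    """Replace anything that looks like PII before the prompt is logged."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

print(mask_pii("mail jane.doe@example.com or call +1 (555) 123-4567"))
```

The key design point is that masking has to happen before the log write, so the raw value never touches the SQLite file at all.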
Community edition is free for one user with 7-day retention. Pro adds teams, RBAC, 30-day retention, API key management, full-text search, and audit logs.
SQLite doesn't scale to high write throughput. This is aimed at teams logging hundreds to low thousands of LLM calls per day, not millions. Happy to hear what people think and what is missing.
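For a sense of scale, a quick back-of-the-envelope check with the stdlib sqlite3 module (in-memory here, and the table shape and values are made up; a file on disk is slower but the same order of magnitude when inserts are batched in one transaction):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (ts REAL, model TEXT, tokens INT, cost REAL)")

# A whole day's worth of "low thousands" of call logs
rows = [(time.time(), "dummy-model", 1200, 0.004) for _ in range(2000)]

start = time.perf_counter()
with conn:  # single transaction: one commit, not 2000 separate ones
    conn.executemany("INSERT INTO calls VALUES (?, ?, ?, ?)", rows)
elapsed = time.perf_counter() - start

print(f"2000 inserts in {elapsed:.4f}s")
```

The per-transaction commit is the usual SQLite write bottleneck, which is why batched logging at this volume is comfortable while millions of independent writes per day would not be.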
GitHub / install: https://github.com/torrix-ai/install Website: https://www.torrix.ai