Anyone running multi-agent setups (CrewAI, AutoGen, LangGraph) knows the pain: agents go rogue, tokens burn, no circuit breaker.
I'm building an open-source Go reverse proxy. Point one env var at it (OPENAI_BASE_URL=http://localhost:8080/v1) and it detects death spirals and physically cuts the connection.
Local-only, no SDK, no cloud, single binary. I'll ship in 2 weeks if enough people need it.
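Rough sketch of the core idea, nothing more: sit between the agent and the API as a plain reverse proxy, and refuse traffic once a budget is spent. The `budgetBreaker` type, the flat per-call cost, and the numbers are all placeholders; the real thing would count actual token usage from responses.

```go
package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

// budgetBreaker is a hypothetical circuit breaker: once the budget is
// spent, every further request is refused.
type budgetBreaker struct {
	remaining int64
}

// allow atomically deducts cost and reports whether the budget still holds.
func (b *budgetBreaker) allow(cost int64) bool {
	return atomic.AddInt64(&b.remaining, -cost) >= 0
}

func main() {
	upstream, _ := url.Parse("https://api.openai.com")
	proxy := httputil.NewSingleHostReverseProxy(upstream)
	breaker := &budgetBreaker{remaining: 100_000} // placeholder budget

	http.HandleFunc("/v1/", func(w http.ResponseWriter, r *http.Request) {
		// Flat cost per call keeps the sketch short; a real version
		// would parse the response's usage field, and would also need
		// to rewrite the Host header for the upstream.
		if !breaker.allow(1_000) {
			http.Error(w, "budget exhausted: connection cut", http.StatusTooManyRequests)
			return
		}
		proxy.ServeHTTP(w, r)
	})
	http.ListenAndServe(":8080", nil)
}
```

The agent never knows the proxy exists until the breaker trips, which is the whole point of the one-env-var setup.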
Have you been burned by runaway agents?
veloryn•2h ago
wuweiaxin•1h ago
That's what I'm building: treat agent-to-LLM calls like microservice calls. Detect loops, enforce budgets, fail-open if the firewall itself crashes. Same patterns, new domain.
The interesting part is that loop detection needs to be semantic, not just frequency-based. Two agents can politely argue in circles without ever tripping a rate limit.
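One way to sketch that semantic check, under big assumptions: compare each new message against a sliding window of recent ones and flag when everything stays near-identical. Bag-of-words cosine stands in here for real embedding similarity, and `loopDetector`, the window size, and the threshold are all made up for illustration.

```go
package main

import (
	"math"
	"strings"
)

// loopDetector flags a conversation where every recent message stays
// semantically close to the new one, even if each message is worded
// politely enough to dodge a plain rate limit.
type loopDetector struct {
	window    []map[string]float64 // recent message vectors
	size      int                  // how many messages to compare against
	threshold float64              // similarity above this counts as "same"
}

// vectorize builds a simple term-frequency vector from a message.
func vectorize(msg string) map[string]float64 {
	v := map[string]float64{}
	for _, w := range strings.Fields(strings.ToLower(msg)) {
		v[w]++
	}
	return v
}

// cosine computes cosine similarity between two sparse vectors.
func cosine(a, b map[string]float64) float64 {
	var dot, na, nb float64
	for k, av := range a {
		dot += av * b[k]
		na += av * av
	}
	for _, bv := range b {
		nb += bv * bv
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// Observe records a message and returns true when the full window of
// recent messages is all near-duplicates of it — agents circling.
func (d *loopDetector) Observe(msg string) bool {
	v := vectorize(msg)
	looping := len(d.window) == d.size
	for _, prev := range d.window {
		if cosine(v, prev) < d.threshold {
			looping = false
		}
	}
	d.window = append(d.window, v)
	if len(d.window) > d.size {
		d.window = d.window[1:]
	}
	return looping
}
```

Swapping the bag-of-words vectors for embedding vectors is the same interface; the window-plus-threshold shape is what matters.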