AgentReady is an OpenAI-compatible proxy. You swap your base_url, and every prompt gets compressed before hitting the LLM — 40-60% fewer tokens, same responses, same streaming.
It uses a deterministic rule-based engine (not another LLM call): removes filler words, simplifies verbose constructions, strips redundant connectors. ~5ms overhead.
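To make the idea concrete, here's a toy sketch of what a deterministic, rule-based compressor looks like. The rules below are illustrative guesses on my part, not AgentReady's actual rule set:

```python
import re

# Toy rule-based prompt compressor: filler-word removal plus
# verbose-phrase substitution. Illustrative rules only -- not
# AgentReady's real engine.
FILLERS = re.compile(r"\b(?:actually|basically|just|really|very|quite)\b\s*",
                     re.IGNORECASE)
VERBOSE = {
    "in order to": "to",
    "due to the fact that": "because",
    "at this point in time": "now",
}

def compress(prompt: str) -> str:
    out = FILLERS.sub("", prompt)
    for long_form, short_form in VERBOSE.items():
        out = re.sub(re.escape(long_form), short_form, out, flags=re.IGNORECASE)
    # Collapse any double spaces left behind by deletions.
    return re.sub(r"\s{2,}", " ", out).strip()
```

Because it's pure string rewriting with fixed rules, it's deterministic and fast, which is what lets the real engine stay in the ~5ms range instead of paying for a second model call.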
Works with any OpenAI-compatible SDK: Python, Node, LangChain, LlamaIndex, CrewAI, Vercel AI SDK.
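Since the proxy speaks the standard OpenAI wire format, you don't even need an SDK. Here's a stdlib-only sketch of hitting it directly; the base URL is a placeholder for illustration, use whatever your dashboard gives you:

```python
import json
import urllib.request

BASE_URL = "https://api.agentready.cloud/v1"  # placeholder -- use your real proxy URL

def build_chat_request(prompt: str, model: str = "gpt-4o-mini",
                       api_key: str = "YOUR_KEY") -> urllib.request.Request:
    """Build (but don't send) a standard /chat/completions request
    aimed at the proxy instead of api.openai.com."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Sending it is the usual:
#   urllib.request.urlopen(build_chat_request("hi"))
```

With an SDK it's the same one-line change: point `base_url` at the proxy and leave the rest of your code alone.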
Free during beta, no credit card: https://agentready.cloud/hn
Python: pip install agentready-sdk && agentready init
Happy to answer any technical questions.