So I built CostLens. It's a drop-in replacement for your existing OpenAI client that automatically routes requests to cheaper models when possible, but falls back to premium ones when quality matters.
How it works:

```js
// Just swap this:
const openai = new OpenAI({ apiKey: 'sk-...' });

// For this:
const costlens = new CostLens();
const openai = costlens.openai({ apiKey: 'sk-...' });

// Everything else stays exactly the same
```
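After the swap, calls look exactly like they would against the official SDK. A minimal sketch of that, assuming the wrapped client mirrors the standard `chat.completions` surface (that method name comes from the OpenAI Node SDK, not from CostLens docs):

```js
// Illustrative only: assumes the wrapped client exposes the same
// chat.completions.create() method as the official OpenAI Node SDK.
const completion = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Fix this typo: "teh quick brown fox"' }],
});

// The routing layer decides which model actually served the request.
console.log(completion.choices[0].message.content);
```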
Real savings:
• Simple tasks: GPT-4 → GPT-4o-mini (95% cheaper)
• Complex tasks: Still uses GPT-4 when needed
• My bills dropped ~70% with zero code changes
Features:
• Quality detection (auto-retries with better models if response is bad)
• Works with existing code - no prompt changes needed
• Caching with Redis (see the sketch below)
• Instant mode (no signup required)
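The caching piece is the standard pattern: hash the request, check Redis before calling any model, and store the response with a TTL. Here's a sketch of that pattern using the node-redis client; it illustrates the idea, not CostLens's internals:

```js
import { createClient } from 'redis';
import { createHash } from 'node:crypto';

const redis = createClient(); // assumes a local Redis instance
await redis.connect();

// Cache key derived from the full request payload, so identical requests hit the cache.
async function cachedCompletion(client, params) {
  const key = 'llm:' + createHash('sha256').update(JSON.stringify(params)).digest('hex');

  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);

  const response = await client.chat.completions.create(params);
  await redis.set(key, JSON.stringify(response), { EX: 3600 }); // 1-hour TTL
  return response;
}
```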
Try it: npm install costlens
The core SDK is free and works locally. I'm also building a dashboard for teams to track their AI spending.
NPM: https://www.npmjs.com/package/costlens
Anyone else tired of overpaying for AI APIs? What's your biggest cost pain point?
j_filipe•1h ago
I'm the dev behind this. Started as a weekend project because I kept getting sticker shock from my OpenAI bills. I'd use GPT-4 for literally everything - even "fix this typo" type requests that cost 20x more than they should.
The breakthrough was realizing most requests don't actually need the expensive models. So I built quality detection that tries the cheap model first, then upgrades only if the response is garbage.
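A rough sketch of that cheap-first, escalate-on-failure pattern, using the OpenAI Node SDK directly; the quality check here is just a length heuristic standing in for real scoring, and the model names are examples:

```js
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Placeholder quality gate: reject empty or suspiciously short answers.
// Real scoring would need to be much smarter than this.
function looksGoodEnough(text) {
  return typeof text === 'string' && text.trim().length > 20;
}

async function routedCompletion(messages) {
  // Try the cheap model first...
  const cheap = await client.chat.completions.create({ model: 'gpt-4o-mini', messages });
  const cheapAnswer = cheap.choices[0].message.content;
  if (looksGoodEnough(cheapAnswer)) return cheapAnswer;

  // ...and only escalate to the premium model when the response fails the check.
  const premium = await client.chat.completions.create({ model: 'gpt-4', messages });
  return premium.choices[0].message.content;
}
```

Tuning that check is exactly the tradeoff mentioned below: too strict and you escalate everything, too loose and bad answers slip through.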
Been using it in production for 3 months now. Went from ~$400/month to ~$120/month with zero changes to my actual prompts or code. The quality detection catches about 15-20% of requests that need the premium models.
Works with both OpenAI and Anthropic - Claude Opus → Claude Haiku saves even more than the OpenAI routing since the price gap is bigger.
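Under the hood that's just a cheaper-tier mapping per provider. The exact model IDs CostLens targets aren't spelled out here, so treat these as illustrative:

```js
// Guessed tier mapping; substitute whatever model versions you actually use.
const CHEAPER_TIER = {
  'gpt-4': 'gpt-4o-mini',                               // OpenAI: ~95% cheaper per the numbers above
  'claude-3-opus-20240229': 'claude-3-haiku-20240307',  // Anthropic: even bigger price gap
};
```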
Happy to answer any questions! The trickiest part was getting the quality scoring right: too aggressive and you get bad responses; too conservative and you don't save money.
I'm also working on a team dashboard, but I wanted to get the core SDK out there first since it's been so useful for me.