GitHub: https://github.com/glama-ai/lightport
Live: https://glama.ai/ai/gateway
Why?
We're going all-in on the MCP ecosystem – it's what we're best at. Open-sourcing the gateway is both a thank-you to the community that helped us grow and a way to keep us focused.
The short backstory:
Lightport began as a fork of Portkey. We needed a way to make various LLM providers OpenAI-compatible, and Portkey provided a solid foundation. But it also came with many higher-level features (guardrails, billing, etc.) that we didn't think belonged at this layer – and that made it hard to iterate on provider compatibility. So we slimmed it down, fixed bugs, added integration tests for 80+ providers, and shaped it into one thing: a reliable, lightweight layer that makes any LLM provider OpenAI-compatible.
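To make the idea concrete: because the gateway normalizes providers to the OpenAI API shape, a client can target any backing provider with the same OpenAI-style request, only the base URL changes. A minimal sketch, assuming a hypothetical local deployment at `http://localhost:8080/v1` and an illustrative model name (check the actual docs for real endpoints and routing):

```python
import json
import urllib.request

# Hypothetical base URL -- substitute your own gateway deployment.
GATEWAY_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request aimed at the gateway.

    Since the gateway exposes every provider behind the OpenAI API shape,
    the same payload works regardless of which provider backs `model`.
    """
    payload = {
        "model": model,  # illustrative; actual model identifiers depend on the gateway
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{GATEWAY_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello!", api_key="example-key")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

The point is the client side: swapping providers means changing the `model` string, not the request format, which is what an OpenAI-compatible layer buys you.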
What's next:
More modules will follow – guardrails, billing, retries, telemetry – each open-sourced as standalone middleware. We'll continue to maintain Lightport in the open, with a focus on OpenAI compatibility across LLM providers.
For Glama users:
The Glama AI gateway will continue to function as a privacy-first gateway, but we will support only a curated set of providers (OpenAI, Anthropic, Google, Grok, Groq, DeepSeek, Alibaba, Moonshot, and a few others). For everything else, OpenRouter is a great alternative.
Try it, break it, build with it. I can't wait to see what you make.