We've known for quite some time that many existing REST APIs are simply not suitable as LLM tools. We wanted to solve this problem by letting users just write code while we take care of deploying it and hosting the MCP servers. We call this Gram Functions, and you can get started with it today by running `pnpm create @gram-ai/function` (or `bun create ...`, `npm create ...`).
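To give a rough sense of what that code looks like, here's a minimal TypeScript sketch of a tool definition. The package name, the `Gram` builder, the `tool()` method, and the exported handler are assumptions based on what the starter template scaffolds rather than confirmed API; the generated project is the source of truth.

```ts
// Illustrative sketch only: the package name and API shape below are
// assumptions, not the confirmed Gram Functions interface.
import { Gram } from "@gram-ai/functions";
import * as z from "zod";

const gram = new Gram().tool({
  name: "lookup_order",
  description: "Fetch an order by ID from an internal API",
  // Input schema the LLM fills in when it calls the tool.
  inputSchema: { orderId: z.string() },
  async execute(ctx, input) {
    // Whatever HTTP or business logic you'd normally hide behind a REST route.
    const res = await fetch(`https://api.example.com/orders/${input.orderId}`);
    return ctx.json(await res.json());
  },
});

// The hosted server routes incoming tool calls to this handler.
export const handleToolCall = gram.handleToolCall;
```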
Under the hood, when you send us code, we provision fly.io machines that run it behind a Go server listening for tool calls. Once you've deployed, you can use the Gram dashboard to create any number of MCP servers with tools generated from this code and from any OpenAPI documents you upload. The key idea is that you build small MCP servers for your different agents by cherry-picking from these sources. This kind of curation helps manage the context bloat that massive MCP servers tend to introduce.
Hope you give it a try! The team will be around to respond to questions and feedback.
PS: If you'd like to poke at some code, Gram's code is up here: https://github.com/speakeasy-api/gram