So we built our own MCP server on top of Postgres and added a set of context-engineering tools to make prompt-driven workflows more predictable. Over time it grew into a full Postgres-based backend-as-a-service (BaaS).
Key features:
1. Authentication with prebuilt UI components
2. Postgres database with typed SDK
3. Serverless functions with a secret manager
4. S3-compatible file storage
5. AI model integration with a unified inference API and simple GUI
6. MCP server with context-engineering endpoints (`fetch-docs`, `get-backend-metadata`, `get-table-schema`, etc.)
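For the MCP endpoints, here's a rough sketch of what a call looks like on the wire. MCP uses JSON-RPC 2.0 with a `tools/call` method; the tool name `get-table-schema` is from the list above, but the `table` argument name is an assumption for illustration:

```typescript
// Minimal sketch of an MCP tools/call request (JSON-RPC 2.0 wire format).
// The tool name is real; the "table" argument is an assumed parameter name.
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): McpToolCall {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: ask the server for a table's schema so the agent can ground
// its next query against the real column definitions.
const request = buildToolCall(1, "get-table-schema", { table: "users" });
console.log(JSON.stringify(request, null, 2));
```

The idea is that the agent fetches real schema/metadata through these tools before writing queries, instead of hallucinating column names.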
We're still early (4 months in). Some parts are rough, but we're shipping daily and improving quickly :)
[1] Launch blog: https://insforge.dev/blog/insforge-launch
[2] Open-source repo (self-hosting): https://github.com/InsForge/InsForge
[3] Benchmark: https://github.com/InsForge/mcpmark
[4] Website (cloud hosting): https://insforge.dev/
[5] Our article on context engineering: https://insforge.dev/blog/why-context-is-everything-in-ai-co...
[6] Public roadmap: https://feedback.insforge.dev/roadmap