I've been building Sovant, a universal memory layer for AI that lets applications and agents remember user context across chats, apps, and even different LLM providers.
Most AI systems forget everything once the session ends. Sovant makes memory portable and model-neutral: you can tell something to GPT and have Claude or Gemini recall it later. Instead of storing full conversations, it extracts and stores structured facts, preferences, and traits (the kind of context that stays useful and reusable).
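To make that concrete, here's a rough sketch of what one of those stored memories might look like. The field names are my own illustration of the idea, not Sovant's actual schema:

```ts
// Illustrative only: hypothetical shape of a structured memory record,
// not the real Sovant data model.
interface MemoryRecord {
  id: string;
  type: "fact" | "preference" | "trait";
  content: string;   // the distilled statement, not the raw conversation
  source: string;    // which app or model the memory came from
  createdAt: string; // ISO timestamp
}

const example: MemoryRecord = {
  id: "mem_123",
  type: "preference",
  content: "Prefers concise answers with code examples",
  source: "demo-chat",
  createdAt: new Date().toISOString(),
};
```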
It's all exposed through a simple API + SDK, with built-in rate limits and a visual dashboard.
You can try the memory persistence and cross-model recall in the Demo Chat, explore stored memories in the Dashboard, or build with the TypeScript/Python SDKs.
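For a feel of the developer experience, here's roughly what building with the TypeScript SDK could look like. The package name and method signatures below are illustrative assumptions, not the verified SDK surface, so check the docs for the real API:

```ts
// Hypothetical usage sketch; "@sovant/sdk", SovantClient, and the
// memories.add/search methods are illustrative, not confirmed API.
import { SovantClient } from "@sovant/sdk";

const sovant = new SovantClient({ apiKey: process.env.SOVANT_API_KEY! });

async function main() {
  // Store a structured memory for a user, regardless of which model produced it.
  await sovant.memories.add({
    userId: "user_42",
    type: "preference",
    content: "Works in TypeScript and prefers strict typing",
  });

  // Later, from any app or model, recall relevant context for a new prompt.
  const relevant = await sovant.memories.search({
    userId: "user_42",
    query: "How should I format code examples for this user?",
  });

  console.log(relevant);
}

main().catch(console.error);
```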
I'd love feedback from this community, whether on the UX, the technical side, or the concept itself.
You can test the beta here: [https://sovant.ai](https://sovant.ai)
Still early, so things may break, but I'll do my best to fix them fast.
Thanks!