So I built llmswap v5.1.0 with a workspace system that gives you persistent, per-project AI memory.
How it works:
- cd ~/work/api-platform → AI loads enterprise patterns, team conventions
- cd ~/learning/rust → AI loads your learning journey, where you struggled
- cd ~/personal/side-project → AI loads personal preferences, experiments
Each workspace has independent memory (context.md, learnings.md, decisions.md) that persists across sessions. Your AI mentor actually remembers what you learned yesterday, last week, last month.
Key features:
• Auto-learning journals (AI extracts key learnings from every conversation)
• 6 teaching personas (rotate among Guru, Socrates, Coach, and others for different perspectives)
• Works with ANY provider (Claude Sonnet 4.5, IBM Watsonx, GPT-4, o1, Gemini, Groq, Ollama)
• Python SDK + CLI in one tool
• Zero vendor lock-in
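To make the workspace idea concrete, here is a minimal sketch of per-project memory, assuming a layout where each project maps to its own folder of markdown memory files. The function names and directory scheme are illustrative assumptions, not llmswap's actual implementation:

```python
# Hypothetical sketch of per-project AI memory (illustration only, not
# llmswap internals): the current project's path is mapped to a dedicated
# workspace folder holding independent markdown memory files.
from pathlib import Path

MEMORY_FILES = ("context.md", "learnings.md", "decisions.md")

def workspace_dir(project_path: Path, root: Path) -> Path:
    """Derive a stable per-project memory directory under `root`."""
    # Use the resolved project path as the key, so each project
    # gets its own independent memory store.
    key = str(project_path.resolve()).strip("/").replace("/", "_")
    return root / key

def load_memory(project_path: Path, root: Path) -> str:
    """Concatenate the project's memory files into one context blob."""
    ws = workspace_dir(project_path, root)
    parts = []
    for name in MEMORY_FILES:
        f = ws / name
        if f.exists():
            parts.append(f"## {name}\n{f.read_text()}")
    return "\n\n".join(parts)

def append_learning(project_path: Path, root: Path, note: str) -> None:
    """Append one extracted learning to the project's journal."""
    ws = workspace_dir(project_path, root)
    ws.mkdir(parents=True, exist_ok=True)
    with (ws / "learnings.md").open("a") as f:
        f.write(f"- {note}\n")
```

Because memory lives on disk keyed by project path, cd-ing into a different project naturally loads a different context, and nothing is lost between sessions.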
Think of it as "cURL for LLMs" - universal, simple, powerful.
The workspace system is what makes this different. No competitor (Claude Code, Cursor, Continue.dev) has per-project persistent memory with auto-learning tracking.
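The "cURL for LLMs" framing can be sketched as one uniform call shape dispatched to interchangeable backends. The class and method names below are assumptions for illustration, not llmswap's real SDK API; real adapters would wrap the vendor SDKs:

```python
# Illustrative sketch of a provider-agnostic LLM client (assumed names,
# not llmswap's actual API): one query() signature, swappable backends.
from dataclasses import dataclass
from typing import Callable, Dict

# A provider is just a function from prompt -> text; in practice each
# would wrap a vendor SDK (Anthropic, OpenAI, Gemini, Ollama, ...).
Provider = Callable[[str], str]

@dataclass
class Response:
    provider: str
    text: str

class Client:
    def __init__(self) -> None:
        self._providers: Dict[str, Provider] = {}

    def register(self, name: str, fn: Provider) -> None:
        """Plug in a backend under a short name."""
        self._providers[name] = fn

    def query(self, prompt: str, provider: str) -> Response:
        """Same call shape regardless of backend: zero vendor lock-in."""
        fn = self._providers[provider]
        return Response(provider=provider, text=fn(prompt))
```

Swapping providers then means changing one string argument, not rewriting call sites.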
Built for developers who:
- Manage multiple projects and lose context when switching between them
- Are tired of re-explaining their tech stack every session
- Want AI that builds on previous learnings, not starts from zero
- Need different "modes" for work/learning/side projects
Open to feedback! Especially interested in:
1. What other workspace features would be useful?
2. How do you currently manage AI context across projects?
3. Would you use auto-learning journals?
GitHub: https://github.com/sreenathmmenon/llmswap
PyPI: pip install llmswap==5.1.0
Docs: https://llmswap.org