Just launched the beta of AI Nexus and wanted to share what I'm building.
The problem I'm solving: I was paying for multiple LLM subscriptions (ChatGPT Plus + Claude Pro) and constantly switching tools, losing context, and re-uploading the same files to every new chat. I realized I wasn't alone - lots of devs and power users are hitting the same wall.
What I built: A professional workspace for AI that gives you:
- 100+ models through one interface (GPT-5, Claude, Gemini, etc. via OpenRouter - rough sketch of the routing call below)
- Projects with shared context files (upload once, use across all conversations in that project)
- Conversation branching (explore different approaches in parallel without losing your thread)
- Visual token management (see what's eating your context, prevent "context too long" errors - token-budget sketch below)
- Prompt library, MCP server integration, full parameter control
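For the curious, the multi-model piece is less magic than it sounds: OpenRouter exposes an OpenAI-compatible endpoint, so switching models is mostly a string change. Here's a minimal sketch of that idea - the model IDs, env var name, and `ask` helper are illustrative, not the actual AI Nexus code:

```python
# Minimal sketch: routing one prompt to different models via OpenRouter's
# OpenAI-compatible API. Names here are illustrative, not AI Nexus internals.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",    # OpenRouter endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],   # BYOK: the user's own key
)

def ask(model: str, prompt: str) -> str:
    """Send the same prompt to any model OpenRouter exposes."""
    resp = client.chat.completions.create(
        model=model,  # e.g. "anthropic/claude-3.5-sonnet"
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# Same interface, different providers - only the model string changes.
print(ask("openai/gpt-4o", "Summarize this design doc."))
print(ask("anthropic/claude-3.5-sonnet", "Summarize this design doc."))
```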
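And the token visualization boils down to counting tokens per message against the model's context window before sending. A rough sketch of that check, using tiktoken's cl100k_base encoding as an approximation (exact counts differ per model, and `context_usage` is a hypothetical helper, not the shipped code):

```python
# Rough sketch of the token-budget check behind the context visualization.
# cl100k_base is an approximation; real tokenizers vary by model.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def context_usage(messages: list[dict], context_window: int = 128_000) -> dict:
    """Return per-message token counts and how much of the window is left."""
    counts = [len(enc.encode(m["content"])) for m in messages]
    used = sum(counts)
    return {
        "per_message": counts,      # which messages are eating the context
        "used": used,
        "remaining": context_window - used,
        "over_budget": used > context_window,
    }

print(context_usage([
    {"role": "system", "content": "Project context file contents..."},
    {"role": "user", "content": "Refactor the auth module."},
]))
```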
I'm still figuring out the pricing model:
- Option A: BYOK (bring your own OpenRouter key) at €5/month for platform access
- Option B: Managed subscriptions with included credits (€15-75/month range)
- Option C: Hybrid - offer both and see what people prefer. Leaning toward BYOK for developers, but worried it's too technical for a broader audience
Happy to answer questions about the tech stack, design decisions, or why I thought building yet another AI tool was a good idea!