TL;DR Memex is a cross-platform desktop app for vibe coding. Think ChatGPT + Claude Code rolled into one.
Why we built it: We love chat tools like Perplexity and ChatGPT. We also love coding agents like those in Cursor and Windsurf. We don’t like that web-based app builders are opinionated about tech stack and can’t be run locally. So we built Memex to be a chat tool + coding agent that supports any tech stack.
What it can do today: Claude Code-like coding. Agentic web search / research. Pre-built templates (fullstack, iOS, Python + Modal, etc.). Inline data analysis + viz. Checkpointing via a shadow git repo. Privacy mode.
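For the curious, a checkpoint with a shadow git repo can be sketched roughly like this. This is illustrative only, not our actual implementation: the paths, identity config, and error handling are made up, and the real thing does more bookkeeping.

```rust
use std::process::Command;

// Illustrative sketch: a second git repo (separate --git-dir) tracks the project
// working tree without touching the user's own .git. All names/paths are assumptions.
fn checkpoint(project_dir: &str, shadow_git_dir: &str, label: &str) -> std::io::Result<()> {
    let git = |args: &[&str]| -> std::io::Result<()> {
        let status = Command::new("git")
            .current_dir(project_dir)
            // identity config so commits work without relying on the user's global gitconfig
            .args(["-c", "user.name=checkpoint", "-c", "user.email=checkpoint@local"])
            .arg(format!("--git-dir={shadow_git_dir}"))
            .arg(format!("--work-tree={project_dir}"))
            .args(args)
            .status()?;
        if !status.success() {
            return Err(std::io::Error::new(
                std::io::ErrorKind::Other,
                format!("git {args:?} failed"),
            ));
        }
        Ok(())
    };
    git(&["init", "--quiet"])?;                                  // no-op if the shadow repo already exists
    git(&["add", "-A"])?;                                        // stage the entire working tree
    git(&["commit", "--allow-empty", "--quiet", "-m", label])?;  // one commit per checkpoint
    Ok(())
}
```

Restoring a checkpoint would then be a "git checkout <commit> -- ." run against the same shadow repo, which keeps rollbacks independent of the user's own git history.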
How it works: Written in TS + Rust + Python, using Tauri for the cross-platform build (macOS, Windows, Linux). It ships a bundled Python environment for data analysis. The agent uses a mix of Claude 3.7 Sonnet + Haiku.
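As a rough sketch of how a bundled Python interpreter gets wired into a Tauri app (hypothetical, not our actual code: the command name, resource path, and use of Tauri 1.x's path_resolver API are all assumptions):

```rust
use std::process::Command;

// Hypothetical Tauri command (Tauri 1.x API): resolve a Python interpreter shipped
// as an app resource and run generated analysis code with it. Names and paths are
// illustrative assumptions, not Memex's real API.
#[tauri::command]
fn run_analysis(app: tauri::AppHandle, script: String) -> Result<String, String> {
    let python = app
        .path_resolver()
        .resolve_resource("python/bin/python3")    // bundled interpreter, packaged as a Tauri resource
        .ok_or_else(|| "bundled python not found".to_string())?;
    let out = Command::new(python)
        .arg("-c")
        .arg(&script)                              // run the generated analysis snippet
        .output()
        .map_err(|e| e.to_string())?;
    Ok(String::from_utf8_lossy(&out.stdout).into_owned())
}
```

The TS frontend would then call this with invoke('run_analysis', { script }); on Tauri 2 the path-resolution API is different, so the exact call would change.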
Status & roadmap: Free download with free tier and paid plan: https://memex.tech. Up next: [1] Additional model support (e.g. Gemini 2.5). [2] MCP support. [3] Computer use.
Ask: Kick the tires. Give us feedback on product + roadmap. If you love it – spread the word!
Thanks! David
diego_moita•6h ago
And only on the enterprise plan are you allowed to use other models.
Thanks but I'll stick with Aider.
davidvgilmore•5h ago
Regarding (2): we haven't supported other models yet because each comes with its own peculiarities around system prompting, tool use, etc. Focusing on just Sonnet + Haiku has let us spend more time on other features (e.g. checkpointing ...).
Regarding BYOK: a lot of our beta users didn't actually have keys set up, so it was easier for them to get started without bringing their own. The folks who have asked for BYOK have mainly wanted to bring their Bedrock/Vertex keys and are also interested in enterprise/team features, hence structuring it this way.
But we're posting here to get feedback and we are willing to make changes :)
ehsanu1•5h ago
LiteLLM is what we use internally, so we can support any LLM backend with any open-source tool and create virtual keys for each developer to monitor and manage usage limits, etc.
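Concretely, clients just hit the LiteLLM proxy's OpenAI-compatible endpoint with a per-developer virtual key; the proxy maps the model name to whatever backend it's configured for and enforces the usage limits. A minimal sketch (the URL, key, and model name below are placeholders; needs reqwest with the "blocking" and "json" features plus serde_json):

```rust
use serde_json::json;

// Sketch of the LiteLLM-proxy pattern described above: authenticate with a
// per-developer "virtual key" issued by the proxy admin, not a provider key.
fn main() -> Result<(), reqwest::Error> {
    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://litellm.internal:4000/v1/chat/completions") // placeholder proxy URL
        .bearer_auth("sk-virtual-key-for-this-developer")          // placeholder virtual key
        .json(&json!({
            "model": "claude-3-7-sonnet",                          // routed by the proxy's config
            "messages": [{ "role": "user", "content": "hello" }]
        }))
        .send()?
        .json()?;
    println!("{resp}");
    Ok(())
}
```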
davidvgilmore•5h ago
Yeah, we want BYOK support to be self-service; we just prioritized other things based on user feedback.
Thanks again for the context.
diego_moita•5h ago
Primarily this. Models are evolving fast; every couple of months a new model emerges with interesting features. I want to be able to switch and try them easily.
davidvgilmore•5h ago
Our roadmap is essentially this: [1] Additional model support (e.g. Gemini 2.5). [2] MCP support. [3] Computer use.
So in the near future we aim to support the top agentic coding models.
uyzeqzer•4h ago
davidvgilmore•2h ago
We're cooking