So I built a standalone desktop app that handles indexing locally. It acts as a secure bridge: your files stay on your disk, and the AI only sees the specific snippets it needs to answer you.
What it does
Local RAG (Chat with Docs) – Index PDF, DOCX, and Markdown files directly on your device.
Data Sovereignty – The database is a local file. You can put it on a USB stick and run it on an air-gapped machine.
Eternal Memory – The AI retains context across different sessions, so you don't have to re-explain yourself.
Hybrid/Offline Mode – A dedicated switch to cut off internet access and work strictly with local tools.
Built-in Editor – A full WYSIWYG editor (not just markdown), so you can write and organize inside the app.
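The "only sees the specific snippets it needs" part is the core of local RAG: rank your chunks against the question and send just the top matches to the model. Here's a minimal sketch of that retrieval step in TypeScript; the toy bag-of-words embedding and the `retrieve` helper are illustrative stand-ins, not Loomind's actual code (a real app would use a local embedding model):

```typescript
// Toy bag-of-words "embedding": term -> count. A stand-in for a real
// local embedding model, so the sketch stays self-contained.
function embed(text: string): Map<string, number> {
  const v = new Map<string, number>();
  for (const t of text.toLowerCase().match(/\w+/g) ?? []) {
    v.set(t, (v.get(t) ?? 0) + 1);
  }
  return v;
}

// Cosine similarity between two sparse term-count vectors.
function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [t, x] of a) { na += x * x; dot += x * (b.get(t) ?? 0); }
  for (const x of b.values()) nb += x * x;
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Rank chunks against the query and keep the top k. Only these
// snippets would ever leave the machine, never the whole document.
function retrieve(chunks: string[], query: string, k = 2): string[] {
  const q = embed(query);
  return chunks
    .map(c => ({ c, score: cosine(embed(c), q) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map(x => x.c);
}
```

The same shape works offline or online: only the retrieval output crosses the bridge to the model.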
Tech stack
Electron (for cross-platform desktop support)
Next.js (both the client UI and its server side, running locally inside the app)
Local Vector Store (File-based storage)
And this was important to me: zero vendor lock-in. Since the database is just a local file, you physically own your data.
Links
Download / Site: https://loomind.me
I’d love feedback! Are there any specific file formats you'd like to see supported for indexing?