Hi HN,
I’m a B.Tech student from India. I built Recall because my desktop was a disaster of downloaded PDFs and screenshots, but I didn't feel comfortable uploading my personal documents to a cloud-based AI just to get them organized.
What it is: A desktop app that runs 100% locally. It uses Ollama (Llama 3.2) to analyze file content, generate context-aware folder names, and move files automatically. I also recently added a local RAG (Retrieval-Augmented Generation) chat so I can query my notes without internet access.
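To make the sorting step concrete, here is a minimal sketch of the analyze-and-move loop. The `suggest_folder` stub and its keyword heuristic are placeholders so the sketch runs anywhere; in the real app that call goes to Llama 3.2 via Ollama, which returns the context-aware folder name.

```python
import shutil
from pathlib import Path

def suggest_folder(text: str) -> str:
    """Stand-in for the LLM call: map file content to a folder name.
    The real app sends `text` to a local model via Ollama and parses
    the suggested label; this keyword heuristic is just for illustration."""
    if "invoice" in text.lower():
        return "Finances"
    if "lecture" in text.lower():
        return "Coursework"
    return "Misc"

def organize(downloads: Path) -> None:
    """Read each file, get a folder name for it, and move it there."""
    # Snapshot the listing first, since we create subfolders as we go.
    for f in list(downloads.iterdir()):
        if not f.is_file():
            continue
        folder = downloads / suggest_folder(f.read_text(errors="ignore"))
        folder.mkdir(exist_ok=True)
        shutil.move(str(f), str(folder / f.name))
```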
The Stack:
Backend: Python
Inference: Ollama (running Llama 3.2 or Mistral)
GUI: CustomTkinter
Storage: Local ChromaDB (for the chat vectors)
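For the RAG chat, the flow is embed → retrieve → build the augmented prompt. This sketch swaps ChromaDB and the Ollama embedding model for a toy bag-of-words store so it runs without any dependencies; the structure of the loop, not the components, is the point.

```python
import math

def embed(text: str) -> dict:
    """Toy bag-of-words vector; the real app uses a proper embedding model."""
    vec: dict = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, notes: list, k: int = 1) -> list:
    """Rank stored notes by similarity to the query
    (this is the part ChromaDB handles in the actual app)."""
    q = embed(query)
    return sorted(notes, key=lambda n: cosine(q, embed(n)), reverse=True)[:k]

def build_prompt(query: str, notes: list) -> str:
    """Stuff the retrieved notes into the prompt that goes to the local LLM."""
    context = "\n".join(retrieve(query, notes))
    return f"Context:\n{context}\n\nQuestion: {query}"
```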
Installation Note for Mac Users: Since I'm a student developer and the app isn't signed with a paid Apple Developer certificate, macOS may flag it as "Unverified." This is a standard Gatekeeper check for unsigned apps.
The Fix:
GUI: Go to System Settings > Privacy & Security and click "Open Anyway" next to the app.
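Terminal (a common alternative if the "Open Anyway" button doesn't appear): macOS marks downloaded files with a quarantine attribute, which `xattr` can remove. The app path below is an assumption; point it at wherever you actually placed Recall.

```shell
# Remove Gatekeeper's quarantine flag from the app bundle.
# The path is an example -- adjust it to your actual Recall.app location.
xattr -d com.apple.quarantine /Applications/Recall.app
```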