Demo: https://youtu.be/i6pzHbdh0nE
Architecture approach:

- AI processing: 100% local (Ollama, etc.)
- Conversations: never leave your machine
- Web search: only when needed, via direct API calls
- Privacy: your data stays local; only search queries go online
This was the first feature on my roadmap after launching the basic version. The hybrid approach gives you the privacy benefits of local AI while solving the "knowledge cutoff" problem.
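To make the hybrid idea concrete, here is a minimal sketch of what the routing could look like. This is an illustration, not the project's actual code: the `needsWebSearch` heuristic, the `searchFn` hook, and the model name are assumptions; only the Ollama endpoint (`http://localhost:11434/api/generate`) is its documented default.

```javascript
// Hypothetical sketch of the hybrid routing described above.
// Assumption: a search function is injected; only its query leaves the machine.

const OLLAMA_URL = "http://localhost:11434/api/generate"; // Ollama's default local endpoint

// Heuristic (illustrative): send a query to web search only when it
// likely needs information past the model's knowledge cutoff.
function needsWebSearch(query) {
  const freshnessHints = [/\blatest\b/i, /\btoday\b/i, /\bcurrent\b/i, /\bnews\b/i, /\b20\d{2}\b/];
  return freshnessHints.some((re) => re.test(query));
}

async function answer(query, searchFn) {
  let context = "";
  if (needsWebSearch(query)) {
    // Only the search query goes online; the conversation itself does not.
    context = await searchFn(query);
  }
  const prompt = context
    ? `Context:\n${context}\n\nQuestion: ${query}`
    : query;
  // All generation happens locally via Ollama's HTTP API.
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  return (await res.json()).response;
}
```

The key design point is that the search hook is optional and narrow: most queries never trigger it, so by default nothing leaves the machine.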
Built with Node.js. It started as a learning project but is turning into something useful.
Since everything runs locally and collects no analytics, I have no usage data to learn from, so I'd love to connect on LinkedIn to hear feedback and discuss the technical approach with the community.
GitHub: https://github.com/mylocalaichat/mylocalai
LinkedIn: https://www.linkedin.com/in/raviramadoss/ (connect to discuss technical ideas!)