For the past year, I’ve been building a personal AI assistant from scratch. I was frustrated by two things: cloud-based LLMs using my conversations for training, and the lack of persistent, cross-chat memory in most UIs.
I wrote Aru Ai entirely in Vanilla JS as a PWA. There is no backend, no telemetry, and no data collection. Everything lives in your browser.
Here is how it works under the hood:
- Local Storage: All chats, settings, and vector embeddings are stored locally in a SQLite database on your device.
- Bring Your Own Model: It connects directly to Gemini, OpenRouter, or local models via Ollama/LM Studio.
- Semantic Memory Module: It extracts facts about you (e.g., allergies, preferences) and dynamically injects them into the context window only when relevant.
- Canvas & Artifacts: Built-in support for generating code, charts (Chart.js), documents, and mini-games in a dedicated canvas. You can save these to a local library.
- Heuristic Module: A small math-based logic system that adjusts the AI's "mood" (sarcasm, humor, tone) based on how you interact with it. It also offers different modes (Child/Teen/Adult), protected by a database password, to restrict output and prevent API changes.
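The "bring your own model" part works because Ollama and LM Studio both expose an OpenAI-compatible chat endpoint locally. Here is a rough sketch of how a provider-agnostic request can be built; the base URL, port, and model name are illustrative examples, not values hard-coded in Aru Ai:

```javascript
// Sketch: one request builder for any OpenAI-compatible provider.
// Swapping providers is just a different baseUrl and API key.
function buildChatRequest(baseUrl, model, messages, apiKey = "") {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Local servers usually need no key; cloud providers do.
        ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Example: a local Ollama instance on its default port.
const req = buildChatRequest(
  "http://localhost:11434",
  "llama3",
  [{ role: "user", content: "Hello" }]
);
// fetch(req.url, req.options).then(r => r.json()) ...
```

The same call then targets OpenRouter or Gemini's OpenAI-compatible endpoint simply by changing the base URL and passing a key.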
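The "only when relevant" gating of the memory module can be sketched as a cosine-similarity search over the stored fact embeddings. This is a minimal illustration, assuming each fact already carries an embedding vector (e.g. from the provider's embedding endpoint); the 0.75 threshold is an arbitrary example, not Aru Ai's actual value:

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return only facts similar enough to the current query embedding,
// so the context window isn't flooded with every stored memory.
function relevantFacts(queryEmbedding, facts, threshold = 0.75) {
  return facts
    .map(f => ({ ...f, score: cosine(queryEmbedding, f.embedding) }))
    .filter(f => f.score >= threshold)
    .sort((a, b) => b.score - a.score)
    .map(f => f.text);
}
```

The selected facts are then prepended to the system prompt for that turn only, which keeps token usage flat as the memory store grows.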
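A "small math-based" mood system like the one described could be as simple as an exponential moving average over a per-message sentiment score, mapped to a tone label. The alpha value, score range, and labels below are invented for illustration; they are not Aru Ai's actual parameters:

```javascript
// Illustrative mood heuristic: smooth a per-message sentiment score
// in [-1, 1] with an exponential moving average, then bucket it.
class MoodTracker {
  constructor(alpha = 0.5) {
    this.alpha = alpha; // how fast mood reacts to the latest message
    this.mood = 0;      // start neutral
  }
  // score could come from a cheap keyword heuristic on the user's text.
  update(score) {
    this.mood = this.alpha * score + (1 - this.alpha) * this.mood;
    return this.mood;
  }
  tone() {
    if (this.mood > 0.3) return "playful";
    if (this.mood < -0.3) return "dry";
    return "neutral";
  }
}
```

The resulting tone label can then be injected into the system prompt ("respond in a playful tone") without any model fine-tuning.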
It’s completely free. The code isn’t open-source yet, as I’m planning a major refactor, but I wanted to share the functional PWA with the community and get feedback on the architecture and the local-first approach.
Link: https://chat.aru-lab.space/
I'd be happy to answer any questions about running SQLite in the browser, semantic extraction, or building complex PWAs without frameworks.