Most local LLM setups are not portable.
This bundles the model, embeddings, text chunks, and metadata into a single export.
The exported package runs entirely in the browser, with no internet connection or installation required.
It aims to solve reproducibility and deployment in restricted environments.
https://github.com/muthuishere/offline-llm-knowledge-system