I built Living Memory Dynamics (LMD), a Python framework for simulating biologically inspired "living" episodic memory directly in embedding space (no external LLM is required for the core dynamics).
Memories evolve over time like living entities: they have metabolic energy states (vivid → active → dormant → fading → ghost), emotional trajectories, and resonance fields that let them influence each other. The central piece is a new differential equation I derived (the Joshua R. Thomas Memory Equation) that drives continuous-time evolution:
dM/dt = ∇φ(N) + Σⱼ Γᵢⱼ R(vᵢ, vⱼ) + A(M, ξ) + κη(t)
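To make the terms concrete, here is a minimal explicit-Euler integration sketch. Everything below is a hypothetical stand-in, not the repo's actual operators: `grad_phi` models ∇φ(N) as mean reversion toward the neighborhood centroid, `resonance` models R(vᵢ, vⱼ) as a cosine-weighted attraction, the Γᵢⱼ weights are arbitrary, and the adaptation term A(M, ξ) is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

def grad_phi(M, neighborhood):
    # hypothetical stand-in for the narrative-potential gradient ∇φ(N):
    # a mean-reversion pull toward the neighborhood centroid
    return neighborhood.mean(axis=0) - M

def resonance(v_i, v_j):
    # hypothetical stand-in for R(vᵢ, vⱼ): cosine-weighted attraction toward v_j
    cos = v_i @ v_j / (np.linalg.norm(v_i) * np.linalg.norm(v_j) + 1e-9)
    return cos * (v_j - v_i)

def euler_step(M, neighbors, gamma, kappa=0.01, dt=0.1):
    """One explicit-Euler step of dM/dt = ∇φ(N) + Σⱼ Γᵢⱼ R(M, vⱼ) + κη(t).
    The adaptation term A(M, ξ) is omitted for brevity."""
    drift = grad_phi(M, neighbors)
    coupling = sum(g * resonance(M, v) for g, v in zip(gamma, neighbors))
    noise = kappa * rng.standard_normal(dim)   # κη(t): white-noise perturbation
    return M + dt * (drift + coupling + noise)

M = rng.standard_normal(dim)
neighbors = rng.standard_normal((3, dim))
gamma = np.array([0.5, 0.3, 0.2])   # Γᵢⱼ coupling weights (illustrative)
M_next = euler_step(M, neighbors, gamma)
```

In practice you would batch this over all memories and recompute neighborhoods as the embeddings drift; this toy version only shows the per-memory update shape.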
This enables emergent behaviors: automatic narrative arcs (setup → conflict → climax → resolution), creative leaps via four operators (analogical transfer, manifold walking, orthogonal composition, void extrapolation), and hierarchical idea grafting.
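As an example of what one of these operators might look like in plain vector terms, here is an illustrative reading of "orthogonal composition": keep vector a and add only the component of b that is orthogonal to a, i.e. the part carrying genuinely new direction. This is my own sketch of the idea, not the repo's implementation.

```python
import numpy as np

def orthogonal_composition(a, b):
    """Illustrative sketch of orthogonal composition: combine a with the
    component of b orthogonal to a, then renormalize."""
    a_hat = a / np.linalg.norm(a)
    b_orth = b - (b @ a_hat) * a_hat   # remove b's projection onto a
    combined = a + b_orth
    return combined / np.linalg.norm(combined)

a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])
c = orthogonal_composition(a, b)   # b's overlap with a is stripped out
```

On unit embeddings this guarantees the composed idea is never a rescaled copy of either parent, which matches the "creative leap" framing.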
Key points:
- Pure vector operations (PyTorch, plus optional Triton CUDA kernels for speed)
- Optional lightweight language grounding with sentence-transformers
- Pip-installable: pip install living-memory-dynamics (extras for language/cuda/all)
- Benchmarks on consumer hardware show real-time evolution for dozens of memories
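The metabolic ladder from the intro (vivid → active → dormant → fading → ghost) can be pictured as exponential energy decay crossing thresholds. The thresholds and decay rate below are hypothetical illustration values, not the package's actual parameters:

```python
import math

# hypothetical thresholds; the package's actual values may differ
STATES = [(0.8, "vivid"), (0.5, "active"), (0.25, "dormant"),
          (0.05, "fading"), (0.0, "ghost")]

def state(energy):
    # return the first state whose threshold the energy still clears
    for threshold, name in STATES:
        if energy >= threshold:
            return name
    return "ghost"

def energy_at(t, e0=1.0, decay=0.3):
    # simple exponential metabolic decay; recall would refresh e0 (not shown)
    return e0 * math.exp(-decay * t)

for t in (0, 1, 3, 5, 10):
    print(t, state(energy_at(t)))   # walks the ladder from vivid to ghost
```

A refresh-on-recall rule on top of this decay is what would make frequently revisited memories stay vivid while untouched ones fade.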
It's very new (released yesterday), and I'm submitting to arXiv soon (waiting on an endorsement). The full research paper and derivations are in the repo docs.
Would love feedback—try the examples and let me know what kinds of narratives or ideas it generates for you!
GitHub: https://github.com/mordiaky/LMD
PyPI: https://pypi.org/project/living-memory-dynamics/