The core idea: Most animation libraries work in milliseconds. Music works in beats. This creates a mismatch: hardcode a bounce at 500ms (exactly one beat at 120 BPM), switch to 90 BPM, and everything drifts because 500ms is now only 0.75 beats.
Emotive Engine uses musical time as the atomic unit. Specify an animation as one beat, and it automatically becomes:
- 500ms at 120 BPM
- 667ms at 90 BPM
- 353ms at 170 BPM
Change tempo, everything adjusts. No recalculation needed.
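To spell out the arithmetic, here's a minimal sketch of the beat-to-millisecond conversion (the function and property names here are hypothetical illustrations, not the engine's actual API):

```typescript
// One beat lasts 60000 / bpm milliseconds.
function beatsToMs(beats: number, bpm: number): number {
  return beats * (60_000 / bpm);
}

// An animation declared once, in beats:
const bounce = { durationBeats: 1 };

// The rendered duration follows whatever tempo is currently playing.
console.log(beatsToMs(bounce.durationBeats, 120)); // 500 ms
console.log(beatsToMs(bounce.durationBeats, 90));  // ~667 ms
console.log(beatsToMs(bounce.durationBeats, 170)); // ~353 ms
```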
Built for AI interfaces (chatbots, voice assistants) but works for any real-time character animation. Pure Canvas 2D, 60 FPS on mobile, 2,532 tests passing.
Live demo at https://emotiveengine.com/demo - the hero banner on GitHub was generated with the engine itself.
Happy to answer any technical questions! MIT licensed.