This is really cool! Live music, game shows, holiday light displays, and anything in between can hugely benefit from this kind of tech.
The whole Who Wants To Be a Millionaire sequence comes to mind (where, on an arbitrarily timed cue, the lights physically rotate downwards, synchronized with the electronic score and floor panel animations, to put pressure on the contestant). And from a bit of research, they needed to do a fair amount of work for that, which arguably could have been "orchestrated" from software like this: https://www.tpimeamagazine.com/robe-rig-lights-who-wants-to-...
> Synching the lighting consoles to receive MIDI triggers from the show’s gaming computer which activates specific commands for sound and video related to screen content was an intense task that took plenty of work and lateral thinking. Additionally, more signals from the lighting console were used to access the media server operating a series of pixel SMD effects inbuilt in the set – so there was a lot of synching happening!
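For a sense of what that MIDI-to-cue plumbing might look like at hobby scale, here's a minimal Python sketch (assuming the mido library; the note numbers and the fire_cue stub are purely illustrative, not any real console's API):

    import mido

    # Hypothetical mapping of MIDI note -> named lighting cue;
    # a real show would map many more triggers.
    CUES = {
        60: "lights_rotate_down",   # the "pressure" moment
        62: "floor_panel_pulse",
    }

    def fire_cue(name):
        # Placeholder: a real rig would emit DMX/Art-Net or OSC
        # toward the console or media server here.
        print("firing cue:", name)

    # Listen on the default MIDI input and fire cues on note-on events.
    with mido.open_input() as port:
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                if msg.note in CUES:
                    fire_cue(CUES[msg.note])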
I'm also aware of software like https://lightkeyapp.com/en - but ossia score seems to treat temporal flexibility/coding/behavior as its primary focus, whereas Lightkey centers on the physical layout of the lighting at any given time. Arguably the feature sets should merge - Blender's ability to have multiple views that emphasize or de-emphasize the timeline comes to mind!
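To make that distinction concrete, here's a toy sketch of the two emphases (hypothetical names only, not either tool's actual data model):

    from dataclasses import dataclass

    @dataclass
    class Cue:                      # temporal view: *when* things happen
        at_seconds: float
        action: str

    @dataclass
    class Fixture:                  # spatial view: *where* lights physically sit
        name: str
        x: float
        y: float
        intensity: float = 0.0

    timeline = [Cue(0.0, "blackout"), Cue(12.5, "lights_rotate_down")]
    layout = [Fixture("spot_1", 2.0, 4.0), Fixture("wash_left", -3.0, 1.0)]

A merged tool would let you edit both views of the same show side by side, much like Blender's multi-window layouts.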
These things shouldn't be locked behind massive investments. Anyone who can put a few cheap tablets on stands and plug in a MIDI keyboard should have best-in-class visualization capabilities, and be able to iterate on that work as more professional hardware becomes available. It's one of the things I love about open source.