website: https://staccato.ai/
The AI music race has hit maturity for casual users, but for experienced music producers it’s just starting.
I’ve spoken with many producers, and their two most common complaints about AI tools are 1) a lack of creative control and 2) tools that do all the work for them instead of working with them.
AI song generators are impressive, but you’re stuck with a non-editable audio file. Or you end up hitting generate over and over until it’s “close enough” to your vision.
Back in the 80s, a protocol called MIDI solved the editability problem. MIDI represents music as digital note events rather than audio: you can move notes around, change their durations, and swap instruments in and out on individual tracks.
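To make the editability point concrete, here is a minimal sketch of a note sequence as MIDI-style event data. The tuple layout and helper functions are illustrative, not any particular library's API, but they show why discrete note events can be edited in ways a rendered audio file can't:

```python
# Each note as an event: (start_tick, duration_ticks, pitch, velocity).
# 480 ticks per beat is a common MIDI resolution; pitch 60 = middle C.
notes = [
    (0,   480, 60, 90),   # C4, one beat
    (480, 480, 64, 90),   # E4, one beat
    (960, 960, 67, 90),   # G4, two beats
]

def transpose(notes, semitones):
    """Shift every note's pitch -- impossible with a flat audio file."""
    return [(s, d, p + semitones, v) for (s, d, p, v) in notes]

def stretch(notes, factor):
    """Rescale start times and durations without touching pitch."""
    return [(int(s * factor), int(d * factor), p, v) for (s, d, p, v) in notes]

up_a_third = transpose(notes, 4)   # C major shape moved up four semitones
half_time  = stretch(notes, 2.0)   # same notes, twice as slow
```

Swapping an instrument is similarly just metadata in MIDI (a program-change message on the track), independent of the notes themselves.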
There are many MIDI generators today, but their clunky workflows undermine the creative control MIDI is supposed to provide.
That’s the gap I saw, which led me to build the first chat-based AI MIDI tool.
You talk to it like any LLM tool, giving precise instructions about any aspect of the music, which puts you in full control. If you don’t like the result, you tell it what to change. You can refine, swap instruments, and shape an idea until it matches what you hear in your head. And unlike AI audio tools, the model understands music theory: you can upload your own chord progressions and ask for melodies that fit.
This takes what early MIDI adopters loved, total editability compared to audio, and pushes it to the limit.
If there are any music producers here, I would love to know what you think.