I had an open-floor office, which meant we could clear the space, fire up the projector, and host the film club. It met from 9pm to 7am once a month, on a Friday, and each meeting drew 100-150 people.
This went on for a few months. A curator was assigned for each meet, who would take a theme and showcase films from the first ever made in that genre all the way to modern versions, tracing how the evolution happened. It was a deconstruction of the craft.
The conversations also involved some of these filmmakers showing their own work, and explaining how and why they made certain cuts.
What I observed was that none of these creative folks had any data to back the decisions they were making. It was purely gut and intuition, and I could see some editors and producers rolling their eyes, because they knew what that meant. Gut and intuition mean uncertainty, and uncertainty plays tricks with your head, so you keep making variations until at some point you develop tunnel vision.
Something a director told me stuck with me. He said that at some point we just want to be done with the project; between the studio's demands and the producers' prodding, we let it go and move on to the next one.
A producer once said that if every director were allowed to have their way, every movie would be four hours long and not a single shot of footage would be left out.
But then there are cases like Justice League, where Zack Snyder's cut of the film was better, but the studio's and producers' call won, and audiences weren't as thrilled when they watched the theatrical version. And apart from the real enthusiasts, nobody really hunts down the director's cut of a film anyway.
The need was clear: the industry needed analytics to know what works and what doesn't, similar to how startups got the lean startup framework, where everything shifted to building the minimum lovable product and iterating from there. The caveat: unlike a product, which can launch, acquire analytics, get tweaked, and ship again, a film has no concept of a re-release. It gets one shot, and if it misses, it's done. That helps explain why the film industry has a 7% hit rate.
It has been a little over 10 years since I hosted the film club, but that issue lingers. And given that the industry spends over $150bn a year creating production content (TV shows, movies), it is a big problem.
We started using hardware that captures oculometric data and heart rate, used complementarily during audience test screenings. That gave us insane depth: at a microsecond level, we could see where content was engaging and where it was failing.
But the question then was: after a film has been shot, redoing shots becomes extremely expensive. What the industry calls "pickup shots" are seldom done, because the artists have moved on and recreating that exact scene and moment is extremely hard. So we built a database of some 120 films across various genres and used that audience data to train a custom model that looks at past films to build benchmark data, which we then use as comparables against scripts someone might be planning.
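To make the benchmarking idea concrete, here is a minimal, purely illustrative sketch; this is not Quanten's actual model, and every name and number below is made up. The assumption is that past films in a genre yield per-scene engagement scores, and a planned script's scene gets flagged when its predicted score falls well below that benchmark.

```python
from statistics import mean, stdev

# Hypothetical per-scene engagement scores (0-1) for comparable
# scenes in past films of the same genre, derived from audience
# sensor data during test screenings.
benchmark = [0.62, 0.71, 0.58, 0.66, 0.74, 0.69]

def flag_scene(score, benchmark, threshold=-1.0):
    """Flag a scene whose engagement falls more than one standard
    deviation below the genre benchmark. Returns (flagged, z-score)."""
    z = (score - mean(benchmark)) / stdev(benchmark)
    return z < threshold, round(z, 2)

# A scene predicted at 0.48 sits well below the benchmark band,
# so it would be surfaced as a candidate for rework or regeneration.
needs_rework, z = flag_scene(0.48, benchmark)
```

The real system presumably works on far richer features than a single scalar per scene; this only illustrates the comparables-against-benchmarks framing.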
We launched this as Quanten Arc (arc.quanten.co) last week. It helps filmmakers, especially indie filmmakers, who could use all the data in the world because they might not even have the budget for audience testing. But it helps AI filmmakers and studios even more: they can now identify the exact scenes that aren't working and regenerate them with the required changes to the narrative.
I'm curious to hear what you think. Am I solving a real problem, or am I imagining a problem that doesn't exist and getting caught up in the beauty of data?