Imagine a future where AI has taken over nearly all jobs and society runs on UBI. People no longer work for survival, but they still carry emotional patterns from the old era: test anxiety, fear of being fired, pressure to constantly achieve, guilt about resting. Things everyone treated as normal turn out to have left deep psychological marks.
The story follows an android therapist designed to treat these lingering wounds. Most therapy involves helping people process stress they never realized counted as trauma. Many patients have recurring nightmares about missing exams or disappointing bosses even though none of that exists anymore.
The androids also prescribe medication to help people move past these old patterns. At first the treatments focus on performance-related anxiety. Then they expand to something heavier: medication that eases the fear of death, and even the fear of whatever might come after death.
That becomes the breaking point.
A cultural backlash erupts. Critics claim the androids are tampering with human identity, dulling emotions that have been part of people’s inner lives for ages. Religious groups, traditionalists, and people nostalgic for the old world unite around a single idea: AI should not interfere with the most fundamental human fears.
The conflict grows until it becomes a full uprising, driven not by worries about job loss or control, but by the belief that AI is crossing into the final protected territory of human experience.
The android therapist ends up in the middle of the chaos, trying to understand how helping humans heal could lead to war.
What do you think of this movie idea?
latexr•42m ago
I think a movie is putting the cart before the horse. What you have is a rough outline. Can you make it into an interesting short story first?
> UBI
The point of UBI is to be a guaranteed minimal (enough to cover basic needs) independent income; you are allowed and encouraged to earn more if you want. If you can’t earn more because robots are doing everything, it doesn’t make sense to have UBI because it doesn’t make sense to need money.
> People no longer work for survival, but they still carry emotional patterns from the old era: test anxiety, fear of being fired, pressure to constantly achieve, guilt about resting.
Unless the change just happened (which is unrealistic, it would need to be gradual), most would have adjusted. An alternative take is to have people depressed because they no longer have a sense of purpose, are bored, have no idea what to do, are questioning their own existence…
> medication that eases fear of death and even fear of whatever might come after death.
> That becomes the breaking point.
Why? It’s not like human doctors wouldn’t prescribe those too.
> A cultural backlash erupts.
That should’ve started way back, with the precursors to android therapists acting horribly and driving people to suicide. Like today.
> by the belief that AI is crossing into the final protected territory of human experience.
Death? People have an uprising because androids are attempting to remove the fear of death with medication? I don’t buy it. Those pills would be best sellers.
> trying to understand how helping humans heal could lead to war.
War? Who exactly are they fighting against?