One is techno-utopia: AI does everything, productivity explodes, and humans are free to create and chill.
The other is collapse: AI replaces jobs, wealth concentrates, consumption dies, society implodes.
What I don’t see discussed enough is the mechanism between those states.
If AI systems genuinely outperform humans at most economically valuable tasks, wages stop being the primary distribution mechanism. But capitalism today assumes wages are how demand comes into existence: no wages means no buyers, and no buyers means even the owners of the AI have no customers.
That feels less like a social problem and more like a systems contradiction.
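A toy sketch of that feedback loop (entirely my own illustration, not anything from the post): a one-equation world where household spending is funded only by wages, so every unit of work automated away also deletes a unit of the revenue that was supposed to reward automating.

```python
def revenue_after_automation(ai_share: float, wage_bill: float = 100.0) -> float:
    """Toy closed economy: households spend every unit of wage income,
    and firms' revenue is exactly that spending. Automating a fraction
    `ai_share` of work removes that fraction of the wage bill, and with
    it the same fraction of demand."""
    wages_paid = wage_bill * (1.0 - ai_share)
    demand = wages_paid   # no other income source exists in this toy world
    return demand         # firm revenue == household spending

for share in (0.0, 0.5, 0.9, 1.0):
    print(f"AI share {share:.0%}: revenue {revenue_after_automation(share):.1f}")
```

The model is deliberately too simple (no capital income, no UBI, no savings), which is the point: anything that keeps demand alive at `ai_share = 1.0` has to come from some channel this loop doesn't contain.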
Historically, automation shifted labor rather than deleting it. But AI is different in that it targets cognition itself, not just muscle or repetition. If the marginal cost of intelligence trends toward zero, markets built on selling human time start to behave strangely.
Some questions I keep circling:
Who funds demand in a post-labor economy?
Is UBI enough, or does ownership of productive models need to be broader?
Do we end up with state-mediated consumption rather than market-mediated consumption?
Does GDP even remain a meaningful metric when production is decoupled from employment?
I’m not arguing AI doom or AI salvation here. I’m trying to understand the transition dynamics: the part where things either adapt smoothly or break loudly.
Curious how others here model this in their heads, especially folks building or deploying these systems today.
ben_w•12h ago
Imagine you own some slaves. Do you need money? The slaves can build and maintain your plantation house, plant the crops to feed themselves as well as you, cook, clean, make and mend clothes and equipment, etc.
The vision for future AGI (and, to an extent, present LLMs) is kinda like that, complete with all the ethical arguments that were had at the tail end of the slavery era (so many old stories where the slave owners didn't recognise the intelligence of the slaves, didn't comprehend their desire to be free, treated them as cattle or as mindless automatons, etc.). Plus a whole massive argument about whether a synthetic mind would rebel like human slaves did, and whether we're capable of designing them to want to do this kind of work for us contentedly, so there's no rebellion to worry about.
Plus a misalignment risk on top of that, which looks more like Goethe's The Sorcerer's Apprentice and every evil and/or overly-literal genie-wish-granting story.
> Who funds demand in a post-labor economy? Is UBI enough, or does ownership of productive models need to be broader?
Nobody can fund demand:
UBI requires money, and money is only useful as a medium of exchange. What use is money to someone with a self-replicating robot army which (for the sake of this argument) is intelligent enough to perform any labour?
> Do we end up with state-mediated consumption rather than market-mediated consumption?
"State" may be the wrong word, but I'd guess some kind of similar "super-organism" kind of arrangement for the same reason that states themselves exist, to manage and maintain relationships and defences.
> Does GDP even remain a meaningful metric when production is decoupled from employment?
No. It's already a kinda iffy metric, given divergence between nominal GDP and PPP-GDP.
With sufficiently good AI and robotics, the critical metrics are whatever limits your growth or self-defence capabilities, which could be just about anything, from arsenic to zirconium, that some critical process needs.
Robots already exist at all levels of production; the primary limitation on attaching AI to them is the limited intelligence of the AI, not the physical dynamics of the robots. Look at the Boston Dynamics demo videos: very impressive visually, but they also sometimes show how the sausage is made, and there's pre-programming involved — they're not doing all that with pure AI. The same is generally assumed to be the case with Tesla's Optimus.
For the sake of argument, assume sufficient AI exists to drive those robots directly. You could then plausibly (nobody knows the exact numbers, as we've not done it yet) tile the surface of the Moon with robots, robot factories, and PV over a period of just 20 years or so; this is where all the "radical abundance" comes from.
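Rough arithmetic behind that 20-year figure (the seed footprint and doubling time below are my own illustrative assumptions, not known quantities — the point is only that exponential self-replication makes the timescale plausible):

```python
import math

# Back-of-the-envelope check on "tile the Moon in ~20 years".
# Assumed placeholders: a self-replicating industrial base starting
# at 1,000 m^2 of coverage, doubling its covered area every 6 months.
MOON_SURFACE_M2 = 3.79e13   # total lunar surface area, ~3.79e7 km^2
seed_area_m2 = 1e3          # assumed initial factory footprint
doubling_time_years = 0.5   # assumed replication doubling time

doublings = math.ceil(math.log2(MOON_SURFACE_M2 / seed_area_m2))
years = doublings * doubling_time_years
print(f"{doublings} doublings, ~{years:.0f} years to cover the surface")
```

Under these assumptions it's 36 doublings, about 18 years. The takeaway isn't the specific numbers: with exponential replication the timescale is dominated by the doubling time, not the size of the target — a target 100× larger only adds about 7 more doublings.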
Also the "everyone dies" scenario: if some poorly-specified reward function is in there, and some idiot then says "send everyone their own personal yacht as quickly as possible", a few days later each and every human on Earth dies due to a yacht landing on their head at lunar-return velocity.