And they always, always forget that it's not about "work", it's about whether a particular person will be able to contribute work that someone is willing to pay for. It's definitely NOT true that there'll always be more paid work to do that can be done by a particular person.
But this is what you get when these authors are wondering if something is good for "the economy" instead of thinking about actual people.
Pedantically, I think you mean Isaiah Berlin's foxes and hedgehogs[0].
> assuming we dont have some incredibly unlikely massive mobilization towards post-work post-scarcity thinking with active social safety nets
The problem being that we're not actually heading toward post-work or post-scarcity. We're heading towards post-knowledge-work. Any chance of UBI or similar will be summarily shot down by the Epstein class, most likely by using their ownership of 90+% of the media to drive a class war between the ascendant blue collar workers and the collapsing white collar workers.
thx lol -- why is this on the front page?
I take the negative comments here as expressions of anxiety about the future - we are, after all, this industry's auto workers in what may be the Rust Belt.
But in an AI post-work future, all the sideways moves have also been taken over by AI and robots. After all, “knowledge work” as a discipline will not be there, right? Whether I can write code, manage teams or copywrite. All of them automated.
When the complexity-vs-cost of automation tips in favor of humans, that's where I'll have to re-skill to. You said it: trucking, welding, ... That I have a PhD in knowledge work is just worthless paper now.
Even the 'safe' jobs are going to suffer a lot relative to today because it'll be a race to the bottom as more and more people try to shift into a reduced number of jobs with less demand.
E.g. being a welder is safe from AI at least until the robots are perfected, but even welders have two huge problems to contend with in the nearer term: reduced demand for their services as ~40% of the current workforce loses their income, plus an influx of competition for their own jobs as those same displaced workers look to shift into safer ones.
People who assume everything will be alright post-AI because everything (mostly) worked itself out in the past are underestimating the extent to which the scale of so much changing so fast will negatively impact every aspect of our economy for anyone who isn't already a wealthy asset owner.
The economy can absorb buggy whip makers being obsoleted, or car factory workers being offshored, but even though those situations sucked for the people impacted by them, the scale of them was tiny (and the time to adjust was so much longer) compared to what is coming with AI displacement.
Unless the author is talking about learning how transformer architectures work, and I don't think they are (and if they are, it won't help the vast majority of people anyway), this is the dumbest advice I keep seeing.
You don't have to "learn AI". "Learning AI" will not be a moat, for anyone. The power of "AI" is that you prompt it in plain language. And it just goes and does the thing. Using AI is not really a skill. It arguably was a little bit when the models were a lot dumber, but now it isn't.
This "transition" is going to be way worse and way more disruptive than even people who think they are thinking about this problem assume.
I've been a daily user of LLMs since ChatGPT came out and I'm still figuring out new ways to use them on a daily basis.
I'm certainly aware of fun things I can do with local models, which takes setup, and if you're into e.g. ComfyUI those workflows can get very complicated. But, that's more a hobby—I don't actually think I get better results this way vs naively prompting a SoTA model.
There are some more advanced workflows for e.g. Claude Code, but I feel like all of that is likely to go away once the underlying models get better (for example, longer context windows mean less need to manage context).
>> The popular horse-switching fantasy answer is retraining. “Go back to school and become an engineer.” In theory, yes. In practice, rarely. The jump from an assembly-line worker to an engineer requires years of schooling and a different educational foundation.
Sure... and after "years of schooling" that work will also get taken by AI, since learning is accelerating. Remember six years ago, when they told laid-off people to learn to code? Then three years ago, when they said to become prompt engineers? Unfortunately, the tech moves faster than retraining does for many people.
>> So many things that we could do to help our customers.
The author assumes the customer is still in good shape. Not a great assumption: the value chain is being squeezed and disintermediated.
>> Which is why the idea that we’re somehow going to run out of work strikes me as absurd. It feels like a theory written by people who haven’t actually spent much time doing the work in the first place — serving customers, building products, and running businesses. There is always more that could be done.
Sure, there is always work. But what's the ROI on that work? Is it worth paying someone to do? And if so, why wasn't it done before?
Nothing has been so simple until now, and it seems strange that we'd reach a certain point and then all of our problems are just solved, completely. In my experience so far, at my current start-up, it has reduced our need to hire a tad, but not much. I've also seen early-stage start-ups needing to hire because they started out building a product and it became too much to handle. This is anecdotal and current, but I'd find it strange if we just end up automating ourselves away. My own role has sort of turned into AI enablement for the rest of the start-up: mostly C-level and business folks, pretty much everyone other than the SWEs. There is potential, but mixed success for now. Agents are good enough to build something that works, but not good enough to build the right solution.
I had a guy who ended up building a local dashboard in Perl (the only thing Claude could find on his Mac) and wanted to distribute it to his colleagues. Engineers sometimes forget that normal people don't usually work in the unknown; they will solve problems in any way they know. In this case, an airdropped folder of Perl code sent to each other.
The way people use AI is the opposite of focus. It's throw as much as you can against the wall, because it is fast and cheap and possible. It is peanut-buttering. Companies today mandate AI use so indiscriminately that you might as well call it a comparative disadvantage.

It's not that the AI is bad, but that people haven't figured out that the core competency of an organization is focus and coordination to achieve a goal, and that victory is external: users, customers. But AI is used to unfocus, to spread, and to meet internal goals (build more, etc.). The challenge was never writing more code or creating more content (all of which is ultimately a long tail of debt that needs to be cleaned up and managed by someone else), which could be done cheaply enough by other paths. It was figuring out what is worth doing, and aligning everyone around that. So in a sense, I welcome company AI mandates today, because they are so misdirected as to make "doing things the inefficient way they were done before" a relative advantage.
thatmf•2h ago
> I am an EIR at Balderton Capital and principal of my own eponymous consulting business.
> I bring an uncommon perspective to enterprise software, having more than ten years’ experience in each of the CEO, CMO, and independent director roles in companies from zero to over $1B in revenues.
First, what the hell is an EIR?

Second, the fact that you are one at some Bertie-Wooster-ass venture capital firm means that you could probably retire tomorrow, if not necessarily in the manner to which you are accustomed.

So yes, it must be nice.
gerdesj•1h ago
Is this really leading edge ... whatever it is supposed to be:
"The popular horse-switching fantasy answer is retraining. “Go back to school and become an engineer.” In theory, yes. In practice, rarely. The jump from an assembly-line worker to an engineer requires years of schooling and a different educational foundation."
They might as well pat the person who is losing their livelihood on the head and say "there, there, it will all come good in the wash".
gerdesj•1h ago
There are some uncommonly long ... em long ... dashes sprinkled in para six and again later on. Perhaps our hero has a charmap app handy, or has a remarkable keyboard, or remembers a carefully curated handful of compose sequences.
The system prompt for this beastie must surely have started with: You are a complete wanker, riff on the eighties "loadsa money" theme.