I'm sure some people can be replaced by AI. I'm also sure a lot of these stories are just marketing for their crappy GPT wrapper
roboboffin•2h ago
I think workplaces will have to allow people time to adapt, so that if your particular skill set is replaced by AI, you have the ability to retrain for one that isn't.
Ultimately, a large part of many jobs is repetitive and can be replaced by pattern matching. The other side, creating new patterns, is hard and takes time. So employers will have to take this into account. There may be long periods of “unproductive” time, or riskier experimentation with new ideas.
izacus•2h ago
Who's doing all the prompting for the people being laid off, though? What does that transition look like?
Does the middle manager who previously bugged people to do the work now write prompts, commit code, and file documents themselves?
roboboffin•1h ago
I’m not saying people will be laid off, although that is what the article is about. I think people will still be prompting, but if you can prompt an agent and it can happily code away, what are you supposed to be doing? Watching it do its work? The only option is that you will have to constantly generate ideas for new work to drive value. That generally happens over time now, but as implementation becomes quicker, idea generation will have to accelerate.
izacus•1h ago
> So, I think people will still be prompting, but if you can prompt an agent and it can happily code away, what are you supposed to be doing? Watching it do its work?
Well, what do managers do once they prompt junior developers? ;)
Also, "tell a prompt and wait for it to finish without intervention" is not something that happens even with magical Claud Code.
I'd really like to see an actual theory surfaced about which positions and people can be laid off due to AI, and about who then actually runs the LLMs in the company after the layoffs, and how. I've never been at a company where new work wouldn't fill up the available developer capacity, so I'm really interested in what this new world would look like.
fragmede•1h ago
> Also, "tell a prompt and wait for it to finish without intervention" is not something that happens even with magical Claud Code.
That is how you interact with OpenAI's Codex though.
roboboffin•1h ago
I just had a thought. Complex C++ systems used to take so long to compile that developers would go and have a coffee while they waited. This was before distributed compilation.
Maybe it will return to that: the job will involve a lot of waiting around, and “free” time.
joshstrange•2h ago
It’s the responsibility of individuals to continue learning. Choosing to stop learning, and to be clear, it is a choice, can have dire consequences.
We are now a few years into LLMs being widely available and widely used, and if someone has chosen to stick their head in the sand and ignore what’s happening around them, then that’s on them.
> I think workplaces will have to allow people time to adapt.
This feels like a very outdated view to me. Maybe we are worse off for it, but by and large that will not happen. The people who take initiative and learn will advance, while the people who refuse to learn anything new or change how they’ve been doing the job for XX years will be pushed out.
roboboffin•2h ago
The difference for me is that things are changing too fast to keep up. If a large part of your job is taken away seemingly overnight by a new model, your whole job could change in a heartbeat.
What preparation are you supposed to do for this? Previously, change was relatively slow and it was reasonable to keep up in your own time. I believe that is no longer possible.
bluefirebrand•1h ago
> It’s the responsibility of individuals to continue learning
Using AI is the opposite of learning.
I'm not just trying to be snarky and dismissive, either.
That's the whole selling point of AI tools. "You can do this without learning it, because the AI knows how"
joshstrange•1h ago
> That's the whole selling point of AI tools. "You can do this without learning it, because the AI knows how"
I'm sure we are veering into "No true Scotsman" territory, but that's not the type of learning or tooling I'm suggesting. "Vibe coding" is a scourge for anything more than a one-off POC, but LLMs themselves are very helpful for pinpointing errors and writing common blocks of code (Copilot autocomplete style), and even tools like Aider and Claude Code can be used in a good way, if and only if you are reviewing _all_ the code they generate.
As soon as you disconnect yourself from the code it's game over. If you find yourself saying "Well it does what I want, commit/ship it" then you're doing it wrong.
On the other hand, there are some people who refuse to use LLMs for a wide range of reasons ranging from silly to absurd. Those people will be passed by and have no one to blame but themselves. LLMs are simply another tool in the tool box.
I am not a horse cart driver, I am a transportation expert. If the means of transport changes or advances, then so will I. I will not get bogged down in "I've been driving horses for XX years and that's what I want to do till the day I die"; that's just silly. You have to change with the times.
bluefirebrand•49m ago
> As soon as you disconnect yourself from the code it's game over
We agree on this
The only difference is that I view using LLM generated code as already a significant disconnect from the code, and you seem to think some LLM usage is possible without disconnecting from the code
Maybe you're right, but I have been trying to use them this way, and so far I find it leaves me completely detached from what I'm building.
harimau777•1h ago
I'm not sure that many people have time to continue learning. Certainly when one is young it is easy, but at some point people are spending their time outside of work building a life, raising a family, etc.
hulitu•1h ago
> So that if your particular skill set is replaced by AI
Is AI capable of any skill?
I mean, Microsoft or Google software looks like it was written by AI, but it has been that way for 20 years.