When replacing SWEs in the future gets discussed, a lot of people say things like "Oh someone has to review the code" and "there'll always need to be a human in the mix". But when are these humans supposed to acquire this knowledge?
Claude Code can help me create things a lot faster. I can vibe code stuff that would take me a lot of time to learn and build. But I understand none of it. When people talk about productivity, most seem to gloss over the fact that those who already know how to do things & have experience are going to be the most productive. Yet I rarely hear any discussion of how people are supposed to bridge the knowledge gap.
I am sure others make a deliberate effort to learn while they leverage these tools, but human beings are lazy. With the constant pressure to increase velocity & productivity at all costs, people aren't going to prioritize learning things. At work I already see SWEs & people in technical roles taking the path of least resistance:
- Asking copilot in agent mode to run a command instead of literally typing it themselves
- Claiming a mermaid diagram of a large legacy system written in COBOL is accurate because "that's what the LLM said"
- Making the statement that "we really won't need to understand data structures & algorithms in the future, those problems are already solved"
Doesn't it seem like the era of AI is devaluing knowledge and talent, and dressing that up as making people more productive than they otherwise would be?
dataviz1000•1h ago
Likely the way mechanical calculators replaced human calculators (people who sat in a room and calculated), AI will replace some jobs. Economics and history have shown that this tends to create more jobs, of different sorts, than the ones lost to automation.
There is no reason to riot and burn Murdoch's printing presses to the ground.