I’ve thought about psychology. I know LLMs can work as pseudo-therapists but I feel like that’s a field where the human connection / human element will remain important.
I think applying AI to other white-collar roles that also require problem solving but have far less training data available will prove much more difficult. Even coding in proprietary-dominated domains is a much, much worse experience than people have with more accessible code. Using it for electronics has been hit or miss, embedded software is a bit shaky, game development is also challenging to use it for, etc.
LLMs suit some jobs more than others. It's quite possible SWEs are the only profession massively affected - whether that means an evolution of the role or its decline/death is another question.
I could have said the same thing about software engineers as recently as the middle of last year; things can change very quickly.
Up until about December 2025, the fact that LLMs would replace us all (SWEs) was the punchline to a joke for most working developers I know. But most of them aren't laughing anymore, unless it's a nervous laugh.
LLMs may (likely will) disrupt software developers first, but I don't think we are particularly unique and I don't see any reason why the same risks won't spread to virtually all knowledge work, especially if executives in those fields see a significant amount of SWEs being replaced by LLMs as an initial test case.
We'll see.
They may never reach that point.
But even if they never get good enough to replace all software developers, they can still cause massive job losses by allowing companies to do the same work with far fewer developers.
I think many of us who have been in software for a while will fantasize about low-tech jobs, I imagine there will be a bunch of hobby farms...
A condo costs $2500/month so I will either be homeless and freeze to death or be euthanized.
Maybe I'm a contrarian but I don't think there's hope for anyone that doesn't control resources.
Best choice would be moving up north and slaving in a mineral mine along with everyone else that lost their jobs. Like the 1920s.
I don't see myself being qualified for such a role since I am too short and don't have the physical leverage.
On one hand some jobs with human element are safe, at first. Think of artists being made obsolete by the camera. Portrait artists became mostly obsolete, but we still pay for art. It's the story behind the art that became important. Or, I still go to cafes with nice atmosphere and friendly staff. There are restaurants with robot staff here in Japan, much cheaper. After the meal you pay at the table without ever talking to a person. But it does not feel nice to sit in there, so I gladly pay a premium for the nice coffee.
On the other hand, it is not only software jobs in danger, but all office jobs. So a lot of people may suddenly be out of money. Let's say you open a cafe, but no one has money to come and pay. Society has to change a lot from the current model to be able to handle this.
It was manual labour first. Then there were tractors. Now robots join in - does that mean the personnel cutting grass are obsolete? No, you need all of them. It means the city becomes nicer.
With software and AI I somehow feel the same will happen. How many features have you skipped just because they would only help some niche set of users and the PM or management would not approve the spending? It's low priority. Or bugs that were annoying but didn't carry much financial value.
I hope that, by switching some work to AI, some companies will capture the opportunity to make software better while others will make the same software cheaper.
Most importantly, there's often a period of general uncertainty and adoption during which a new law is already in force, but LLMs will rely on whatever was there previously.
Most people find this job stressful and boring, but the same can be said about software engineering. Regular people pay money to have it dealt with.
Overall I think there will always be demand for handling the messiness of the real world, and humans have the upper hand here because they learn as they go, not via release cycles costing a sizeable sum and taking months.
Seriously if the future manifests, all of these standard effort based jobs would become redundant...
The issue with outdated information is way overstated: you'd just add the current rules to the context when evaluating and be done with it. We're already at 1-million-token context windows... that's enough for a lot of rules, and the number will likely go higher as time progresses.
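The "add the current rules to the context" approach can be sketched as a simple prompt-assembly step. This is a minimal illustration, not any particular vendor's API: `build_prompt` and the example rules are hypothetical, and the assembled string would be sent to whatever model client you actually use.

```python
def build_prompt(current_rules: list[str], question: str) -> str:
    """Prepend the up-to-date rules so the model answers against
    them rather than whatever was in its training data."""
    rules_block = "\n".join(f"- {rule}" for rule in current_rules)
    return (
        "Answer using ONLY the rules below; they supersede any "
        "prior knowledge.\n\n"
        f"Current rules:\n{rules_block}\n\n"
        f"Question: {question}"
    )

# Hypothetical example: two rules that changed after the model's
# training cutoff, injected at evaluation time.
prompt = build_prompt(
    [
        "The filing deadline is now 1 March.",
        "Form 12-B replaces Form 12-A.",
    ],
    "Which form do I file, and by when?",
)
print(prompt)
```

The trade-off is that someone still has to curate the current rules and notice when they change; the context window only solves the capacity half of the problem.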
One thing I will add: while AI is getting really good at _doing_ the software building bits, I haven't yet seen it well integrated into the decision-making and political structure of organizations. Right now, it seems best in the hands of a high-agency individual empowered and able to make changes or 'ship' something, with them acting as the bridge.
This, of course, is not a technical challenge, but I would expect the organizational restructuring needed to make this more efficient to proceed more slowly than the pace of improvement we've seen over the last few years.
However, I am not quite so defeatist. I think developers will continue to find employment in tech even as AI augments the role. Experienced developers are the obvious hire to run agentic AI development tools, and even the obvious pick for managing a no-code endgame scenario, as they are smart technologists with strong problem-solving skills.
I think the devs who were only here for the paycheck and would not reasonably pick software if it didn't pay so much, will probably be happy to retrain into something else but be disappointed by the paycut.
I am also excited by the prospect of being able to take on bigger scale side projects solo as that's really where my passion lies.
I think general purpose technologists will really excel in this new ecosystem as the industry will be back to moving fast and breaking stuff for a while, for better or worse. A lot of them call themselves programmers right now but will evolve pretty quickly.
Pragmatism, small teams and fast pace will best deliver software based projects, and the bottleneck in big orgs will become (or already was) the bureaucracy and communication layers. Small team, greenfield projects have a huge advantage in getting an MVP to market, which is pretty exciting for someone excited mostly by solving problems with technology.
Time will tell though, this is not career advice and times are chaotic. At the end of the day, there are other careers, and you were smart enough to get into software. You will be smart enough to find a new career.
this could be one of the silver linings to AI disrupting the industry. tech was better for the world when it was run by nerds that were in it for the love of the game.
But I don't think the demand will ever be zero, or that laypersons will ever write (useful) software using AI, because most people do not understand what software is, what it does, what it can do, where to start, what to ask, what is data, what is input vs. output, etc. They are incredibly clueless, and it's not a problem of intelligence. Some of the most clever people I know have no idea about this. (Maybe they don't care enough to understand, or maybe it's a mindset that you either have or don't have, IDK.)
I just don't see how we could do without people to think things through.
The way to survive it was to 1) move to a village/small town where you could have a garden for fruits, vegetables, corn, chickens, maybe a pig or two for winter. 2) Young people lived with their parents while the parents saved up to build/buy their children their own flat or house. Children whose parents saved up enough would often start a family after getting their own place. Those who didn’t, co-lived with parents.
The secure middle class corporate employment in the US is getting severely downgraded by AI. While there is talk of universal basic income the reality is that many many companies depend on the surplus that middle class families enjoy spending and without it, vast swaths of industries will get starved as well. The solution is to show people quickly how to hunt and gather and farm as makers instead of just employees. Figuring out what is needed, taking on a small corner of needs somewhere in meat space or online, and planting there. AI has been fantastic at helping even solo founders with that. They need to encourage a cornucopia of ideas and experimenters as early as k12. They need to set up more favorable conditions for handling the admin side as well.
If the US government does not encourage a cornucopia of AI-powered small-business entrepreneurship and lets monopolies squash it early, we will end up with far, FAR worse conditions. Any monopoly that keeps pitching "universal basic income" while actively avoiding taxes will end up forced into more taxes.
Big tech needs to make room for people to build and grow businesses (looking at you, Apple, for copying every successful app with a native, and you Meta for eating every social competitor) or they will end up paying for everyone’s universal basic income and then some.
If this country wants to survive the AI era, it needs to remove the pink glasses of “secure corporate job” and teach people how to plant, hunt, and gather as independent players in the market really fast.
I don't really feel like it's a "bad" thing; I've said for a long time if a job can be automated, then it should be automated. I still do believe that, even if I am probably on the losing end of that in the not-too-distant future.
I think I am reasonably good at software, and I think I write code that's still a bit better than what Claude does. In fact, I suspect that will remain true for quite a while, but the problem is that "writing code 20% better" isn't exactly a selling point when my competition costs $100/month and takes like 1/20th the time. Most software, even before AI, wasn't optimal and was kind of shitty, and good engineers were always replaceable with shittier, cheaper ones when it was economically viable.
I tend to land on my feet for this stuff, so I still think I'll be ok; I know how to use the tools and there will still need to be some humans who understand how this shit works, so I'm not worried about becoming homeless or anything. What I'm mostly worried about is that I won't ever have fun at work anymore. I liked solving problems, I liked thinking of clever solutions to avoid a mutex or increase concurrency, I liked figuring out how to squeeze a few percent more performance out of my given limitations. It's something I'm good at, and it's basically the only way to get decent money while doing math.
Since the ceiling for writing software has been significantly lowered, I think eventually the cushy yuppie status of software is going to shrink.
Maybe I should learn to weld or something.
I built significant pieces of the Copilot onboarding, purchasing, billing and settings flow. For eight months I headed up the Copilot anti-abuse effort. I then led the launch of GitHub Models, and am now working on other Copilot projects.
Tay.ai, Zoe, and Copilot bots get deployed and wreck the platform, which is unable to fix infrastructure issues while the humans just tweak the tiniest issues.
They should instead focus on GitHub actions and improving the uptime of the whole platform first before doing anything AI.
If so, why didn’t you personally fix them so that nobody could associate you as an individual with a broken CX?
If not, please let me know where to apply, because that sounds like a unicorn organization.
Humans in large groups do amazing and crappy things at the same time. Playing gotcha with someone’s resume is a shitty thing to do.
For example, the claim that the software engineer role is about automating people away - often it is not.
That just indicates a lack of rigor. And if it were so, who will make the AI that automates people? God? People think poorly understood theory and gradient descent will produce God.
I wouldnt be so sure. They'll keep the people who can do what needs to be done with new tools. Current title is irrelevant.
In addition, losing one engineer at 400k TC vs. two at 200k TC makes more sense if they are all prooompters and AI handlers anyway.
Most IC6/7+ would not code anyway - in fact a friend of mine said "we had our own agents, we just called them IC4/5" - which was ironic but funny.
I am curious if we would ever get a new programming language like rust or go, without this creativity.
In a way, we have different products that do more or less the same things (Postgres vs MySQL, for example). The reason is that there's a difference of thought in the process. I doubt this will go away.
This is what bugs me the most. Those who are now at IC6/7 rose through the daily grind of coding and debugging from L1. But now that those jobs are getting automated, how will anyone rise to IC6? It's as if the first 10 rungs of a ladder are missing, and only someone exceptionally athletic can jump up and start from the 11th rung.
I think in the coming decades we will see the IKEA effect in woodworking. It's extremely easy to build cheap furniture whose individual parts are really just compressed paper. There are hardly any good carpenters left to do real solid-wood carpentry. Those who are left will cost a bomb (rightly so) and can only be afforded by rich people.
Maybe companies haven't seen it yet, but most office jobs can and will be eliminated in the next decade or two.
Don't ruminate on the future too much, folks; you won't die of hunger.
Software validation was always the interesting part vs software verification. Validation asks the question, did you build what was actually necessary?
The job is changing, and I don't like it in many ways, but there we go. It's not the first time new tech has nuked my dev job and I had to change.
I have personal projects that I hand-code, and personal projects I hand to Claude. Depends on how boring the project is. If it's stuff I've already solved a bunch of times, I hand it off. If I have room for good learning, I code it myself.
Due to a text predictor?
I'm a daily user of the most recent Claude, and while it's amazing at presenting other people's knowledge and reducing cognitive load by filling in the gaps, it's still just a machine that predicts text. That is a limitation which won't be overcome in this generation of such tools, which, counting research demonstrations, are close to a decade old already.
To me the main issue is that investors are not aware of these limitations and will keep pouring money into this way beyond everyone's breaking point. But really that's a failing of the world's economic system, which relies too much on their whims.
The "just" here is minimizing what has been the crux of the problem for the past ~5 years.
This technology has been capable of producing code all this time. The end result has been improving due to massive scaling efforts, and some relatively trivial engineering ("reasoning", "agents", etc.).
And yet reliability is still a massive problem. The tools still hallucinate, still lead the user in dead-end directions, and still do so confidently and randomly, without any discernible reason. Expert users are able to guide them to a certain extent, but whether the prompting incantation is done manually or via the trendy Markdown file of the week, it's all guesswork based on feelings and anecdata.
I'm personally not too worried about being replaced by these tools, even though my skillset is nothing remarkable. My opportunities might shrink, but this is a two-way street. Companies that use "AI" indiscriminately don't interest me either. The demand for quality human work and ingenuity will always exist, even within a sea of mediocrity.
I'm much more concerned with the societal impact of the mountains of shoddy software being produced, deployed into increasingly more critical infrastructure, and put into hands of incompetent and malicious people. There is very little thought and discussion on this topic, let alone any guardrails. "AI" companies are now attracting governments and advertisers, both full of malicious and incompetent people. The next decade is going to be interesting, that's for sure.
Most of the startups that get the attention are attempting to be the next big thing but a startup can just be a startup. It doesn't have to be big or glorious.
Someone who sells hot dogs (on a small scale) can't really hire a programmer, but if he could (or could write it himself), there would be plenty of software to write. You could make a nice interface with all of the sales statistics, inventory management, maps with competition and demographic data, work schedules, etc. There is infinite complexity to even the simplest job. You could hire help and have an app talk them through every step in great detail with pictures, videos and animations. You could encode all of the little tricks that would normally take decades to learn. Say, on a busy spot you might not have time to spend 8 minutes properly cleaning the grills every 47 minutes, but you could wipe down the glass every 4 minutes, clean part of the grills with alcohol every 11 minutes, then clean them properly every 3 hours. The app might instruct you to google location-related news or other topics to talk about with the customers. If people are walking their dog, they expect you to guess what breed it is and where it comes from, then ask how old it is.
You might build a tech stack to help recognize their face, remember their name, what they ordered last time, and how long ago. You're not supposed to, but you know you want to.
You won't be coding for a glorious salary but will earn depending on the sector you chose. The software will be pure dog food of the finest quality in the world.
Grilling hot dogs is also very relaxing, can let the mind float a bit and have software ideas the way you should. Lots of bad ideas will come, I can show people pictures of themselves eating my hotdogs!
You can basically look at programming as the new literacy. You might want a fancy job writing letters for a nobleman but it is hardly the only application.
The wheel of industry rolls forward and crushes everyone underneath it
EDIT: also, so-called "breaking your back" has the same effect as going to the gym. Sure, I am aware that there are really back-breaking jobs and they should be helped by machines. But there's no rule that says the helping machines need to do all of the work. A moderate amount of physical work is just beneficial to everyone.
AI-based job displacement will do wonders for raising class consciousness when it's too late.
the best time to unionize is when you don't need a union.
Unions and indeed any bargaining organization only have leverage when their people are needed, but what happens when the people themselves are needed no longer?
You can't even raise class awareness in other professions that have been undergoing job displacement for decades. Good luck trying to do it among software engineers, where self-worth rides high and empathy is nonexistent. They will still be arguing on HN that unionization is a bad idea.