But snark away. It’s lazy. And yes it is so damn tedious.
> Finally, LLM-generated prose undermines a social contract of sorts: absent LLMs, it is presumed that of the reader and the writer, it is the writer that has undertaken the greater intellectual exertion. (That is, it is more work to write than to read!) For the reader, this is important: should they struggle with an idea, they can reasonably assume that the writer themselves understands it — and it is the least a reader can do to labor to make sense of it.
https://rfd.shared.oxide.computer/rfd/0576#_llms_as_writers
The heavy use of LLMs in writing makes people rightfully doubt whether it's worth putting in the time to read what's written.
Using LLMs for coding is different in many ways from writing, because the proof is there in the pudding: you can run it, you can test it, and so on. But the writing _is_ the product, and the only way to know it's correct is to put in the work.
That doesn't mean you didn't put in the work! But I think it's why people are distrustful and have a bit of an allergic reaction to LLM-generated writing.
Looks like this comment is embracing the tools too?
I'd take cheap snark over something somebody didn't bother to write but expects us to read.
People put out AI text, primarily, to run hustles.
So its writing style is a kind of internet version of "talking like a used car salesman".
With some people that's fine, but anyone with a healthy epistemic immune system is not going to listen to you.
If you want to save a few minutes, you'll just have to accept that.
I mean, obviously you can't know your actual error rates, but it seems useful to estimate a number for this and to have a rough intuition for what your target rate is.
Did chatGPT write this response?
I can hate LLMs for killing my craft while simultaneously using them to write a "happy birthday" message for a relative I hate, or some corpo speak.
The post in the same vein, "We mourn our craft", did a much better job of communicating the point without the AI influence.
"Upgrading your CPU wasn’t a spec sheet exercise — it was transformative."
"You weren’t just a user. You were a systems engineer by necessity."
"The tinkerer spirit didn’t die of natural causes — it was bought out and put to work optimising ad clicks."
And in general a lot of "It's not <alternative>, it's <something else>", with or without an em dash:
"But it wasn’t just the craft that changed. The promise changed."
It's really verbose. One of those in a piece might be eye-catching and make someone think, but an entire blog post made up of them is _tiresome_.
(2) Phrasing like this seems to come out of LLMs a lot, particularly ChatGPT:
"I don’t want to be dishonest about this. "
(3) Lots of use of very short catch sentences / almost sentence fragments to try to "punch up" the writing. Look at all of the paragraphs after the first in the section "The era that made me":
"These weren’t just products. " (start of a paragraph)
"And the software side matched." (next P)
"Then it professionalised."
"But it wasn’t just the craft that changed."
"But I adapted." (a few paragraphs after the previous one)
And... more. It's like the LLM latched on to things that were locally "interesting" writing but applied them globally, turning the entire thing into a soup of "ah-ha! hey! here!", completely ignorant of the terrible harm this does to the narrative structure and global readability of the piece.
I wish people would stop doing that. AI writing isn't even particularly good. It's not like it makes you into Dostoevsky; it just sloppifies your writing with the same lame mannerisms ("wasn't just X — it was Y"), the same short paragraphs, the same em dashes.
Bullshit. While abstraction has increased over time, AI is no mere incremental change. The almost-natural-language interaction with an agent is not the same as TypeScript over assembly (not to mention you could very well write C or Rust and the like and know most of the details of the machine by heart; and no, microcode and low-level abstractions are not a real counter-argument to that). Even less so if agents turn autonomous and you just herd them to completion.
My experience so far is that to a first approximation, the quality of the code/software generated with AI corresponds to the quality of the developer using the AI tool surprisingly well. An inexperienced, bad dev will still generate a sub-par result while a great dev can produce great results.
The choices involved in using these tools are also not as binary as they are often made out to be, especially since agents have taken off. You can very much still decide to dedicate part of your day to chiseling away at important code to make it just right and make sure your brain is engaged in the result and exploring and growing with the problem at hand, while feeding background queues of agents with other tasks.
I would in fact say the biggest challenge of the AI tool revolution in terms of what to adapt to is just good ol' personal time management.
I don't think that's what people are upset about, or at least it's not for me. For me it's that writing code is really enjoyable, and delegating it to AI is hell on earth.
What I'm finding is that it's possible to integrate AI tools into your workflow in a big way without giving up on doing that, and I think there's a lot to be said for a hybrid approach. The result of a fully-engaged brain (which still requires being right in there with the problem) using AI tools is better than the fully-hands-off way touted by some. Stay confident in your abilities and find your mix/work loop.
It's also possible to get a certain version of the rewards of coding from instrumenting AI tools. E.g. slicing up and sizing tasks for background agents, tasks you can intuit from experience they'll actually hand in a decent result on, feels similar to the structuring/modularization exercises (e.g. with the goal of readability or maintainability) involved in writing code.
Reality: Promoted to management (of AI) without the raise or clout or the reward of mentoring.
I really feel this. Claude is going to forget whatever correction I give it, unless I take the time and effort to codify it in the prompt.
And LLMs are going to continue to get better (though the curve feels like it's flattening), regardless of whatever I do to "mentor" my own session. There's no feeling that I'm contributing to the growth of an individual, or the state-of-the-art of the industry.
I do use these tools, clearly see their potential, and know full well where this is going: capital is devaluing labor. My skills will become worthless. Maybe GP is right that at first only skilled developers can wield them to full effect, but it's obviously not going to stop there.
If I could destroy these things - as the Luddites tried - I would do so, but that's obviously impossible.
For now I'm forced to use them to stay relevant, and simply hope I can hold on to some kind of employment long enough to retire (or switch careers).
Related, the word “meritocracy” was coined in a book which was extremely critical of the whole concept. AI thankfully destroys it. Good riddance, don’t let the door hit your ass on the way out.
You'd destroy all industrial machinery? Do you realize what a huge negative impact this would have on modern quality of life?
Thankfully I started down the FIRE route 20 years ago, and now I'm more or less continuing to work because I want to.
Which will end for my employer if they insist on making me output generative excrement.
Don't take this the wrong way, but this is more of an age thing than a technology-advancement thing.
Kids growing up nowadays who are interested in computers feel the same magic. That magic partly derives from not truly understanding the thing you are doing and building a mental "map" by yourself. There is nothing intrinsic to computing nowadays that makes it less magical than fiddling around with config.sys. In 50 years there will be old programmers reminiscing: "Remember when new models were coming out every few months and we could fiddle with the vector dimensionality and chunking length to get the best out of gpt-6.2 RAG? Those were the times."
I am much younger than the poster you are replying to, but I feel much the same.
When people first encounter ML, they fool themselves into believing it is intelligent... rather than a massive plagiarism and copyright/IP theft machine.
Fun is important, but people who think zero-workmanship generated content is sustainable are still in the self-delusion stage that marketers promote.
https://medium.com/ideas-into-action/ikigai-the-perfect-care...
I am not going to cite how many fads I've seen cycle in popularity, but many have seen the current active cons before. A firm that takes a dollar to make a dime in revenue is by definition a con, kids. =3
"The Ice King"
There definitely is: the rent-seeking behavior is out of control. As a kid I could fiddle with config.sys (or rather autoexec.bat) while nowadays wrestling a file path out of my phone is a battle and the system files of my phone are kept from me.
Your last point is probably correct though, because AI will also allow systems to become orders of magnitude more complex still. So like the early days of the internet, these are still the fun days of AI, when the tool is overpowered compared to its uses.
Maybe that just means it's a maturing field and we gotta adapt?
Yes, the promise has changed, but you still gotta do it for the love of the game. Anything else doesn't work.
Generally, I get that feeling from work projects that I've self-initiated to solve a problem. Fortunately, I get the chance to do this a lot. With the advent of agentic coding, I am able to solve problems at a much higher rate.
Quite often, I'll still "raw dog" a solution without AI (except for doc lookups) for fun, kind of as a way to prove to myself I can still do it when the power's out.
But everybody on this site lived through the first half of a logistic curve so that perspective seems strange to us.
It isn't all funeral marches and group crying sessions.
And don't let the blog post fool you: it is a rant about AI. Otherwise we would have heard complaints about the last 200 paradigm shifts in the industry over the past thirty years.
Sure, we got our share of Dilbert-style agile/waterfall/TDD jokes shoved in our faces, but no one wrote a blog post about how their identity was usurped by the waterfall model.
>And different in a way that challenges the identity I built around it and doesn’t satisfy in the way it did.
Everyone should do their own thing, but might I suggest that it is dangerous for anyone in this world to use a single pillar as the foundation of all their identity and the plinth of their character.
Today iron is produced by machines in factories by the mega-tonne.
We just happened to live in the age when code went from being beaten out by hand to being a mass-produced product.
And so the change of technology goes.
Think of the wonderful world we could have if everyone just got their shit together and became paper trillionaire technocrats.
Go back 10 years and post "SWEs should form labor unions".
Then watch as your post drops to [dead] and people scream "How dare you rob me of theoretical millions of dollars I'll be making".
I wonder how many of these same downvoters are now worried about getting replaced with AI.
I'm probably 7 or 8 years from an easy retirement myself, so I can appreciate how that feels. Nobody really wants to feel disruption at this age, especially when they're the breadwinner for a family.
AI can't produce code yet with 100% predictability. If that day ever arrives, the blacksmith analogy will be apt.
Not sure what world you're from, but lots of products get sent back to the manufacturer because they break.
a) They asked an LLM
b) "This is what all our competitors are doing"
c) They saw a video on Youtube by some big influencer
d) [...insert any other absurd reason...]
True story:
In one of our recent Enterprise Architecture meetings, I was lamenting the lack of a plan to deal with our massive tech debt, and used as an example a 5000-line regulatory-reporting stored procedure, written 10 years ago, that no one understood. I was told my complaint was irrelevant because I could just dump it into ChatGPT and it would explain it to me. These are words uttered by a so-called Senior Developer, in an Enterprise Architecture meeting.
I wouldn't keep a ball of mud just because LLMs can usually make sense of one, but refactoring such code debt is becoming increasingly trivial.
Yes. I mean... of course he was? Firstly, I had already gone through this process with multiple LLMs, from various perspectives, including using Deep Research models to find out whether any other businesses faced similar issues, and/or whether products existed that could help with this. That led me down a rabbit hole of data-science products related to regulatory reporting of a completely different nature, which was effectively useless. tl;dr: Virtually all LLMs, after understanding the context, recommended the thing we had already been urging the business to do: hire a Technical BA with experience in this field. And yes, that's what we ended up doing.
Now, to give you some idea of why his suggestion was obviously absurd:
- He had never seen the SP
- He didn't understand anything about regulatory reporting
- He didn't understand anything about financial derivatives
- He didn't understand the difference between Transact SQL and ANSI SQL
- No consideration given to IP
- etc etc
Those are the basics. Let's jump a little bit into the detail. Here's a rough snippet of what the SP looks like:
    SELECT
        CASE
            WHEN t.FLD4_TXT IN ('CCS', 'CAC', 'DEBT', ..... 'ZBBR') THEN '37772BCA2221'
            WHEN t.FLD4_TXT IN ('STCB') AND ISNULL(s.FLD5_TXT, s.FLD1_TXT) = 'X' THEN 'EUMKRT090011'
        END AS [Id When CounterParty Has No Valid LEI in Region]
        -- remember, this is around 5000 lines long ....
Yes, that's a typical column name that has rotted over time, so no one even knows if it's still correct. Yes, those are typical CASE statements (170+ of them at last count, and no, they are not all equal or symmetric). So... you're not just dealing with incredibly unwieldy and non-standard SQL (omitted); no one really understands the business rules either.
So again... yes, he was entirely wrong. There is nothing "trivial" about refactoring things that no one understands.
I'm turning 50 in April and am pretty excited about AI coding assistants. They make feasible a lot of personal projects I've wanted to do but never had the time for.
What does it mean to be a productive developer in an AI tooling age? We don't quite know yet, and it's also shifting all the time, so it's difficult to place yourself on that spectrum with any stability. For a lot of accomplished folks this is the first time they've felt that level of insecurity in a while, and it takes some getting used to.
Which also makes me refute the idea that AI coding is just another rung up on the programming abstraction ladder. Depending on how much you delegate to AI, I don't think it's really programming at all. It's project management. That's not a bad thing! But it's not really still programming.
Even just in the context of my human team, I feel less mentally engaged with the code. I don't know what everything does. (In principle, I could know, but I don't.) I see some code written in a way that differs from how I would have done it. But I'm not the one working day-in, day-out with the code. I'll ask questions, make suggestions, but I'm not going to force something unless I think it's really super important.
That said, I don't 100% like this. I enjoy programming. I enjoy computer science. I especially enjoy things more down the paths of algorithm design, Lisp, and the intersection of programming with mathematics. On my team, I do still do some programming. I could delegate it entirely, but I indulge myself and do a little bit.
I personally think that's a good path with AI too. I think we're at the point where, for many software application tasks, the programming could be entirely hands-off. Let AI do it all. But if I wish to, why not indulge in doing some myself also? Yeah, I know, I know, I'll get "left behind in the dust" and all of that. I'm not sure that I'm in that much of a hurry to churn out 50,000 lines of code a day; I'm cool with 45,100.
You can indulge even more by letting AI take care of the easy stuff so you can focus on the hard stuff.
I think that's very true. But... there's a reason I'm not a team lead or manager. I've done it in the past and I hate it. I enjoy doing the work, not tasking others with doing work.
I think it's healthy for everyone to evaluate whether one's personal reaction to AI is colored by this trend, or whether it's really being evaluated independently. Because while I share many of the negative feelings listed earlier, to me AI does still feel different; it has a lot more real utility.
It’s literally the same argument over and over and it’s the same comments over and over and over
HN will either get back to interesting stuff or simply turn into a support group for aging “coders” that refuse to adapt
I’m going to start flagging these as spam
I feel like the conversation does a good job of capturing the situation we find ourselves in.
It also lets me focus more on improving things, since I feel more liberated to scrap low-quality components. I'm much braver about taking on large refactors now: things that would have taken days now take minutes.
In many ways AI has made up for my growing lack of patience and inability to stay on task until 3am.
That is called...programming.
Last year I found out that I always was a creator, not a coder.
Why ask though?
If I’m familiar with a project, more often than not I have a very good idea of the code I have to write within minutes of reading the ticket. Most of the time is taken up finding the impact of the change, especially with dependencies that are present in the business domain but are not reflected in the code.
I don’t need to ask what to code. I can deduce it as easily as doing 2+2. What I’m seeking is a reason not to write it the way I envisioned it. And if those reasons are technical, it’s not often a matter of code.
Steve Yegge recently did an interview on vibe coding (https://www.youtube.com/watch?v=zuJyJP517Uw) where he says, "arch mage engineers who fell out-of-love with the modern complexity of shipping meaningful code are rediscovering the magic that got them involved as engineers in the first place" <-- paraphrased for brevity.
I vividly remember staying up all night to hand-code assembler primitive-rendering libraries, and the first time I built a voxel rendering engine, thinking it was like magic what you could do on a 486... I remember the early days at Relic, working on Homeworld and thinking we were casting spells, not writing software. Honestly, that magic faded and died for me. I don't personally think there is magic in building a Docker container. Call me old-fashioned.
These days, I've never been more excited about engineering. The tedium of the background wiring is gone. I'm back to creating new, magical things - I'm up at 2 AM again, sitting at my desk in the dark, surrounded by the soft glow of monitors and casting spells again.
I'm 45 years old and also started programming quite early, around 1988. In my case it was GW-BASIC games, then C Mode X, and later Allegro-based games.
Things got so boring in the last 15 years that I found some joy in doing AI research (ML, agents, genetic algorithms, etc.).
But now, it's so cool how I can again think about something and build it so easily. I'm really excited about what I can do now. And I'm not talking about the next billion-dollar startup and whatnot, but the small hacky projects that LLMs make it possible to build in no time.
AI development actually feels like a similar rate of change. It took 8 years to go from the Atari 2600 to the Amiga.
An 8 year old computer doesn't quite capture the difference today.
This seems like a false dichotomy. You don't have to do this. It is still possible to build magical things. But agents aren't it, I don't think.
It is honestly extremely depressing to read this coming from the founder of Relic. Relic built magic. Dawn of War and Company of Heroes formed an important part of my teenage years. I formed connections, spent thousands of hours enjoying them together with other people, and pushed myself hard to become one of the top 100 players on the CoH leaderboards. Those competitive multiplayer games taught me everything there was to know about self-improvement, and formed the basis of my growth as an individual - learning that if I put my mind to it, I could be among the best at something, informed my worldview and led me to a life of perpetually pushing myself to further self-improvement, and from there I learned to code, draw, and play music. All of that while being part of amazing communities where I formed friendships that lasted decades.
All of this to say, Relic was magic. The work Relic did profoundly impacted my life. I wonder if you really believe your current role, "building trust infrastructure for AI agents", is actually magic? That it's going to profoundly impact the lives of thousands or millions?
I'm sorry for the jumbled nature of this post. I am on my phone, so I can't organize my thoughts as well as I would like. I am grateful to you for founding Relic, and this post probably comes off stupidly combative and ungrateful. But I would simply like to pose to you, to have a long think if what you're doing now is really where the magic is.
Edit: On further consideration, it's not clear the newly-created account I'm responding to is actually Alex Garden. The idea of potentially relating this personal anecdote to an impersonator is rather embarrassing, but I will nonetheless leave this up in the hope that if there are people who built magical things reading this, regardless of whether they're Alex Garden or someone else, that it might just inspire them to introspection about what building magic means, about the impact software can have on people's lives even if you don't see it, and whether this "agent" stuff is really it.
With AI, it is like coding is on GOD mode and sure I can bang out anything I want, but so can anyone else and it just doesn't feel like an accomplishment.
I think it's possible that we'll get to the point where "so can anyone else" becomes true, but it isn't today for most software. There's significant understanding required to ask for the right things and understand whether you're actually getting them.
That said, I think the accomplishment comes more from the shaping of the idea. Even without the typing of code, I think that's where most of the interesting work lies. It's possible that AI develops "taste" such that it can sufficiently do this work, but I'm skeptical it happens in the near term.
I'm so excited about gardening again. Can't wait to do some. Employing a gardener to do my gardening for me is really making me enjoy gardening again!
I think it's hard for some people to grasp that programmers are motivated by different things. Some are motivated by shipping products to users, others are motivated to make code that's a giant elegant cathedral, still others love glorious hacks to bend the machine into doing things it was never really intended to do. And I'm sure I'm missing a few other categories.
I think the "AI ain't so bad" crowd are the ones who get the most satisfaction out of shipping product to users as quickly as possible, and that's totally fine. But I really wish they'd allow those of us who don't fall into that category to grieve just a little bit. This future isn't what I signed up for.
It's one thing to design a garden and admire the results, but some people get into their "zen happy place" by pulling up weeds.
It's your studio now. You have a staff of apprentices standing by, eager for instructions and commands. And you act like it's the worst thing that ever happened to you.
Last night I was thinking about this "xswarm" screen saver I had in 1992 on my DEC Ultrix workstation. I googled for the C source code and found it.
I asked Claude to convert it to Java, which it did in a few seconds. I compiled and ran it, and there it was again, like magic.
I love messing about with computers still. I can work at the byte level on ESP-32s on tiny little devices, and build massive computation engines at the same time on the same laptop. It's amazing.
I feel for those who have lost their love of this space, but I have to be honest: it's not the space that's the problem. Try something new, try something different and difficult or ungainly. Do what you rail against. Explore.
That's what it's always been about.
I've been staying up late, hacking away at stuff like I used to, and it's been a blast.
Finally, Homeworld was awesome and it felt magical playing it.
The author is right. The magic has faded. It's sad. I'm still excited about what's possible, but it'll never create that same sense of awe, that knowledge that you can own the entire system from the power coming from the wall to the pixels on your screen.
All other professions had their time when technology came and automated things.
For example wood carvers, blacksmiths, butchers, bakers, candlestick makers, etc. All of those professions have been mostly taken over by machines in factories.
I view AI as the new machines in the factories for producing code. We have reached the point where we have code factories which can produce things much more efficiently and quickly than any human can alone.
Where the professions still thrive is in the artisan market. There is always demand for hand crafted things which have been created with love and care.
I am hoping this stays true for my coding analogy. Then people who really care about making a good product will still have a market from customers who want something different from the mass produced norm.
Very, very few of those professions are thriving. Especially if we are talking true craftsmanship and not stuffing the oven with frozen pastries to create the smell and the corresponding illusion of artisanal work.
This does not seem true for AI writing software. It's neither reliable nor rigid.
IMO that is exactly what is happening here. AI is making coding apps possible for the normal person. Yes, they will need to be supervised and monitored, just like workers in a factory. But groups of normal low-skilled workers will be able to create large pieces of software via AI, which has only ever been possible for skilled teams of professionals before.
It's so strange to read, because to me it's never been more fun to make software, and it's especially never been easier for an individual. The boring parts are being automated so I can work on the bespoke and artistic parts. The feedback loop to making something nice and workable is getting shorter. The investigation tools for profiling and pinpointing performance bottlenecks are better than ever, and Claude is just one new part of that.
I had my first paid programming job when I was 11, writing a database for the guy we rented our pirated VHS tapes from.
AI is great.
As model costs come down, that $20,000 will become a viable number for entirely AI-generated coding. So more than ever you don't want to be doing work that the AI is good enough at: either take jobs where performance matters, or be able to code the stack of agents needed to produce high-quality code in an application context.
Another commenter mentioned embedded, and after a brief phase of dabbling in that, mainly with nRF5x micros, I tend to agree. Less training data and obtuse tooling.
The culture change in tech has been the toughest part for me. I miss the combination of curiosity, optimism, creativity, and even the chaos that came with it. Nowadays it's much harder to find organizations like that.
I feel like I turned around and there seem to be no jobs now (500+ applications deep is a lot when you've always been given the first role you'd applied to) unless you have 2+ years commercial AI experience, which I don't, or perhaps want to sit in a SOC, which I don't. It's like a whole industry just disappeared while I had my back turned.
I looked at Java in Google Trends the other day, it doesn't feel like it was that long ago that people were bemoaning how abstracted that was, but it was everywhere. It doesn't seem to be anymore. I've tried telling myself that maybe it's because people are using LLMs to code, so it's not being searched for, but I think the game's probably up, we're in a different era now.
Not sure what I'm going to do for the next 20 years. I'm looking at getting a motorbike licence just to keep busy, but that won't pay the bills.
If vendors can't be bothered to use a C compiler from the last decade, I don't think they'll be adopting AI anytime soon.
At my work, as of 2026, we only now have a faction riled up about evangelizing clean code, OOP, and C++ design patterns. I hope the same delay keeps for all the rest of the "abstraction tower".
The anxiety I have, which the author might not be explicitly stating, is that as we look for places where we add genuine value in the crevices of frontier models' shortcomings, those crevices are getting narrower by the day and a bit harder to find.
Just last night I worked with Claude and at the end of the evening I had it explain to me what we actually did. It was a "Her" (as in the movie) moment for me where the AI was now handholding me and not the other way around.
That's exactly it. And then people say "pivot to planning / overall logic / high-level design," but how long do we have before upper management decides that AI is good enough at that stuff, too, and shows us all the door?
If they believe they can get a product that's 95% of what an experienced engineer would give them for 5% of the cost, why bother keeping the engineer around?
I started programming in 1980, and I'm having just as much fun now as I did then. I literally cannot wait to sit down at my IDE and start writing.
But that was not always true. When I worked for a larger company, even some startups, it was not always fun. There's something about having full control over my environment that makes the work feel like play.
If you feel like programming isn't fun anymore, maybe switching to a consulting gig will help. It will give you the independence and control that you might be craving.
I’ve seen the code current tools produce if you’re not careful, or if you’re in a domain where training data is scarce. I could see a world where a couple of years from now companies need to bring outside people to fix vibe coded software that managed to gain traction. Hard to tell.
Working in AI startups, strangely enough, I see a lot of the same spirit of play and creativity applied to LLM-based tools. I mean, what is OpenClaw but a fun experiment?
The kids these days are going to reminisce about the early days of AI, when prompts were handwritten and LLMs would hallucinate.
I’m not really sure 1983, 1993, or 2003 really was that golden an age; we just look at it through rose-colored glasses.
Bad times to be a programmer. Start learning business.
About a decade ago, I went through a career crisis where I couldn't decide what job to do - whether technology was really the best choice for my particular temperament and skills.
Law? Too cutthroat. Civil service? Very bureaucratic. Academia? Bad pay. Journalism? An industry in decline.
It is a shame, what is happening. But I still think, even with AI hollowing out the fun parts, tech remains the best job for a smart, motivated person who's willing to learn new things.
https://medium.com/ideas-into-action/ikigai-the-perfect-care...
Fact is, the tech sector is filled with folks that find zero joy in what they do, chose a career for financial reasons, and end up being miserable to everyone including themselves.
The ex-service people would call these folks entitled Shitbirds, as no matter the situation some will complain about everything. Note, everyone still does well in most large corporate settings, but some are exhausting to be around on a project. =3
Bertrand Russell literally wrote a book called “In Praise of Idleness” because he knew that heavy hitters like him had to defend work abolitionism. The “work is good” crowd is why we can’t have nice things. You guys are time thieves and ontologically evil. May all work supporters reincarnate as either durian fruits or cockroaches.
I observe that the way we taught math was not oriented around the idea that everyone would need to know trigonometric functions or how to do derivatives. I like to believe the math curriculum was centered around standardizing a system of thinking about math, so that those of us who were serious about our educational development would all speak the same language. It was about learning a language and laying down processes that everyone else could understand. And that shaped us, and it's foolish to challenge or complain about that or, God forbid, radically change the way we teach math, because doing so damages our ability to think alike. (I know the above is probably completely idealistic, verging on personal myth, but that's how I choose to look at it.)
In my opinion, we never approached software engineering the same way. We were so focused on the compiler and the type calculus that we never taught people what makes code valuable and robust. If I had FU money to burn today, I'd start a Mathnasium-style company focused on turning kids into systems integrators with great soft skills and the ability to produce high-quality software. I would pitch this business on the assumption that the Jenga tower is going to be collapsing pretty much continuously for the next 25-50 years, and civilization needs absolute-unit super developers coming out of nowhere who will be able to make a small fortune helping companies dig their way out of 75 years of tech debt.
> …Not burnout…
Then maybe fade away? ;)
The sense of nostalgia that can too easily turn into a lament is powerful and real. But for me this all came well before AI became all-consuming... It's just the latest manifestation of the process. I knew I didn't really understand computers anymore, not in the way I used to. I still love coding and building, but it's no longer central to my job or life. It's useful, I enjoy it, and at the same time I also marvel at the future I find myself living in. I've done things with AI that I wouldn't have dared to start for lack of time. It's amazing and transformative and I love that too.
But I will always miss the Olden Days. I think more than anything it's the nostalgia for the 8-bit era that made me enjoy Stranger Things so much. :)
I feel that LLMs have finally put the ball in MY court. I feel sorry for the others, but you can always find puzzles in the toy section of the bookstore.
and they still call themselves 'full stack developers' :eyeroll:
* "Then it professionalised. Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible."
* "The machines I fell in love with became instruments of surveillance and extraction. The platforms that promised to connect us were really built to monetise us. The tinkerer spirit didn’t die of natural causes — it was bought out and put to work optimising ad clicks."
* "Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this."
* "They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of... But sure. AI is the moment they lost track of what’s happening."
* "Typing was never the hard part."
* "I don’t have a neat conclusion. I’m not going to tell you that experienced developers just need to “push themselves up the stack” or “embrace the tools” or “focus on what AI can’t do.” All of that is probably right, and none of it addresses the feeling."
To relate to the author: a lot of what's going on I feel the same way about, but about other parts I feel differently than they do. There appears to be a shallowness to this... Yes, we can build faster than ever, but for so much of what we are building we should really be asking ourselves why we have to build it at all. It's like sitting through the meeting that could have been an email, or using hand tools for 3 hours because the power-tool purchase/rental is obscenely expensive for the ~20 minutes you need it.
I'm 49... Started at 12... In the same boat.
My first 286 machine had a loose CMOS battery, so I had to figure that out to make it boot into MS-DOS.
This time it does feel different, and while I'm using AI more than ever, it feels soulless and empty even when I 'ship' something.
I think there may be a counterpoint hiding in plain sight here: back in 1983 the washing machine didn't have a chip in it. Now there are more low-level embedded CPUs and microcontrollers to develop for than before, but maybe it's all the same now. Unfathomable levels of abstraction, uniformly applied by language models?
I still enjoy the physical act of programming so I'm unsure why I should do anything that changes that. To me it's akin to asking a painter to become a photographer. Both are artists but the craft is different.
Even if the AI thing is here to stay, I think there will be room for people who program by hand for the same reason there's still room for people who paint, despite the invention of the camera.
But then, I'm somebody who doesn't even use an IDE. If I find an IDE obtrusive then I'm certain I'll find an AI agent even more so.
And the part of programming that wasn't your projects, whether back in the days of TPS reports and test coverage meetings, or in the age of generative AI, that bit was always kinda soul draining.
This is a huge one for me. Claude is significantly better at Googling than I am.
I don't know if I am the only one, but in my experience developing with chatbots turns developing software into something that feels more akin to filling out forms or answering emails. I grieve for the day we'll lose what was once a passion of mine, but unfortunately that's how the world has always worked. We can only accept that times change, and follow them instead of complaining.
Same. It scratches my riddle-solving itch in a way that the process of "prompt-honing" has yet to do.
I stuck with C and C++ as my bread and butter from 1996-2011 with other languages in between.
I don’t miss “coding” because of AI. My vision has been larger than what I could do myself without delegating for over a decade - before LLMs.
“Coding”, and later coordinating with people reporting to me (dotted line), was a necessary evil to see my vision through to implementation, until a year or two ago.
I absolutely love this new world. For loops and while loops and if statements don’t excite me in my 50s. Seeing my vision come to life faster than I ever could before, and having it well architected, does.
I love talking to “the business”, solving XY problems, and getting to a solution 3x faster.
The difference is that the first camp is re-experiencing that feeling of wonder while the second camp is lamenting it. I thankfully fall in the first camp. AI is allowing me to build things I couldn't, not due to a lack of skills, but a lack of time. Do you want to spend all your time building the app user interface, or do you want to focus on that core ability that makes your program unique? Most of us want the latter, but the former takes up so much time.
I'm enjoying it to a point, but yes, it does eliminate that sense of accomplishment - when you've spent many late nights working on something complex, and finally finish it. That's pretty much gone.
That's exactly what it is.
Doom does not use Mode X :P! It uses Mode Y.
That being said, as a 47-year-old having given 40 years to this thing as well, I can relate to the feeling.
AI has just vastly extended your reach. No sense crying about it. It is literally foolish to lament the evolution of our field into something more.
I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.
Where we came from and where we're going: this whole time in my career, those things have been kind of hard to pinpoint. Abstraction is killing us for sure. Time to market above all else. It's no wonder that software in cars, appliances, and medical equipment is a factor that is killing people.
The people who are anti-AI are largely building other people's ideas, for work. And they have no desire to ramp up velocity, and it's not helpful to them anyway because of bureaucratic processes that are the real bottleneck to what they're building.
Not everyone falls into these silos, of course.
Not sure how that relates to LLMs, but they do become an unblocker for regaining some of that "magic". I also know that a deep dive requires an investment I cannot shortcut.
The new generation of devs is already playing with things few dinosaurs will get to experience fully, having sunk decades into the systems we built and being afraid to let them go. Some of that is good (leaning on experience) and some of it is holding us back.
But you would not be able to make anything anywhere near as complex as you can with modern tools.
I think that's one of the biggest things that gives me pause about AI: the fact that, if they prove to be a big productivity boost, you're beholden to huge corporations, and not just for a one-time purchase, but on an ongoing basis.
Maybe the open-source models will improve, but if progress keeps being driven by raw compute power and big numbers, it seems to tilt things very much in favor of those with lots and lots of capital to deploy.
40+ years later, having been through many BASICs, C, C++ (CFront onwards), and now NodeJS, I still love writing code.
Tinkering with RPi, getting used to having a coding assistant, looking forward to having some time to work on other fun projects and getting back into C++ sooooon.
What's not to love?
At the time, I didn't know the LWP::Simple module existed in Perl, so I ended up writing my own socket-based HTTP library to pull down the posts, store them in a database, etc. I loved that project, as it taught me a lot about HTTP, networking, HTML, parsing, and regexes.
Nowadays, I use Playwright to scrape websites for things I care about (e.g. rental prices at the Jersey Shore). I would never think to redo my old HTTP library today, while still loving the speed of modern automation tools.
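A minimal sketch of what I mean nowadays (the URL and CSS selector here are hypothetical placeholders, not my actual targets):

    // Hypothetical Playwright scrape: pull listing prices from a page.
    import { chromium } from "playwright";

    async function scrapePrices(url: string): Promise<string[]> {
      const browser = await chromium.launch();
      const page = await browser.newPage();
      await page.goto(url);
      // ".listing-price" is a placeholder selector for this sketch.
      const prices = await page.locator(".listing-price").allTextContents();
      await browser.close();
      return prices;
    }

    scrapePrices("https://example.com/shore-rentals").then(console.log);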
Now, I too have felt the "but I loved coding!" sense of loss. I temper that with the above story that we will probably love what comes next too (eventually).
I got moved up the chain to management and later worked to get myself moved back down to a dev role because I missed it and because I was running into the Peter Principle. I use AI to learn new concepts, but mostly as a search engine. I love the tech behind it, but I don't want it coding for me any more than I want it playing my video games for me. I was hoping AI would show up as robots doing my laundry, not doing the thing I most enjoy.
And I feel like an old man grumbling about things changing, but... it's not the same. I started programming in BASIC on my Tandy 1000 and went to college and learned how to build ISA cards with handwritten oscilloscope software in the Computer Engineering lab. My first job was writing firmware. I've climbed so far up the abstraction chain over a thirty year career and I guess I don't feel the same energy from writing software that first got me into this, and it's getting harder to force myself to press on.
Welcome to the human condition, my friend. The good news is that a plurality of novels, TV shows, country songs, etc. can provide empathy for and insight into your experience.
Claude is a godsend to me, but fuck, it is sometimes dumb as a door, loves to create regressions, and is a fucking terrible designer. Small, tiny changes? Those are actually the worst: it is easy for Claude, at the first setback, to decide to burn the whole world down and start from zero again. Not to mention when it gets stuck in an eternal loop where it increasingly degenerates the code.
If I care about what I deliver, I have to actively participate in coding.
What a poetic ending. So beautiful! And true, in my experience.
It feels as though a window is closing upon the feeling that software can be a powerful voice for the true needs of humanity. Those of us who can sense the deepest problems and implications well in advance are already rare. We are no more immune to the atrophy of forgetting than anyone.
But there is a third option beyond embrace or self-extinguish. The author even uses the word, implying that consumers wanted computers to be nothing more than an appliance.
The third option is to follow in the steps of fiction, the Butlerians of Dune, to transform general computation into bounded execution. We can go back to the metal and create a new kind of computer; one that does have a kind of permanence.
From that foundation, we can build a new kind of software, one that forces users to treat the machine as appliance.
It has never been done. Maybe it won't even work. But, I need to know. It feels meaningful and it has me writing my first compiler after 39 years of software development. It feels like fighting back.
For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements; its thermal limits are well known and computed ahead of time, and those limits are baked into thermal management so it doesn't melt but still goes as fast as we understand to be possible for its size. We make billions of these every year and have done so for decades.
Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.
Or JavaScript JIT compilation, where the JS engine watches code run and emits faster versions of it that make assumptions about the types of variables, with escape hatches if the code stops behaving predictably, so you don't get confusing bugs that only happen if the browser JITted some code. Python has something similar. Thanks to these JIT engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
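To make that concrete, here is a rough hypothetical sketch of the kind of type-stable code a JIT rewards (illustrative only, not any engine's actual machinery):

    // A call site the JIT can specialize: it only ever sees numbers,
    // so the engine can emit machine code assuming number arguments.
    function add(a: number, b: number): number {
      return a + b;
    }

    let total = 0;
    for (let i = 0; i < 1_000_000; i++) {
      total = add(total, i); // monomorphic call site: stays on the fast path
    }
    // In untyped JS, a later call like add("a", "b") would hit the escape
    // hatch: the specialized code is discarded and execution falls back to
    // the generic, slower path. Correct either way, just not equally fast.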
Let's talk about the decades of research that went into autoregressive transformer models, instruction tuning, and RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with "User: ", triggering latent capabilities in the model to hold up its end of a conversation. Scale that up, call it a "low-key research preview", and you have ChatGPT. Wildly simple idea, massive implications.
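A minimal sketch of that harness idea (the role labels and formatting here are illustrative, not any specific vendor's actual format):

    // The base model only continues text, so the harness renders the
    // conversation as a transcript and leaves a trailing "Assistant:"
    // for the model to complete.
    type Turn = { role: "User" | "Assistant"; text: string };

    function buildPrompt(history: Turn[], message: string): string {
      const turns: Turn[] = [...history, { role: "User", text: message }];
      const transcript = turns.map((t) => `${t.role}: ${t.text}`).join("\n");
      return transcript + "\nAssistant:";
    }

    // buildPrompt([], "Why is the sky blue?")
    //   => "User: Why is the sky blue?\nAssistant:"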
These abstractions take you further from the machine, and yet despite that they were adopted en masse. You have to account for the ruthless competition out there: each one would have been eliminated if it hadn't proven to be worth something.
You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).
Or to take a moment to marvel.
Surveillance and Extraction
"We were promised flying cars", and what we got was "investors" running the industry off the cliff into cheap ways to extract money from people instead of real innovation.
1. I shouldn't be so tied to what other people think of me (craftsman, programmer, low-level developer).
2. I shouldn't measure my satisfaction by comparing my work to others'. Quality still matters, especially in shared systems, but my responsibility is to the standards I choose to hold, not to whether others meet them. Plus, there are still communities of people who care about this (Handmade Network, OpenBSD devs, languages like Odin) that I can be part of if I want to.
3. If my values are not being met, either in my work or in my personal life, I need to take ownership of that myself. The magic is still there; I just have to go looking for it.
A lot of people started building projects like mine when the EVM was newer. Some managed to get a little bit of popularity, like Dark Forest. But most were never noticed. The crypto scene has distracted everyone from the work of tinkerers and artists who just wanted to play with a new paradigm. The whole thing became increasingly toxic.
It was like one last breath of fresh, cool air before the pollution of AI tools arrived on the scene. It's a bittersweet feeling.