As a teacher or in many other professions? Forget about it. You need to either marry someone with a more lucrative career or move somewhere more affordable.
Working on fixing our housing shortage has felt extremely meaningful to me.
I'd like to find some of that idealism in software again.
Disclosure: I work at govstream.ai - we work in this space [we're hiring!]
They have a business tax rate above 106% of profit[1]. That is, it is effectively illegal for a business to make a profit.
Yet there is apparently a video out there of a black market cart seller selling wares right in front of the Argentine tax office, totally unbothered.
It made me wonder if this was just an allegory of what's in store for us.
[1] https://archive.doingbusiness.org/content/dam/doingBusiness/...
Under that model, 10% of your time is completely up to you (within reason) to work on things that aren't your main project or scope.
Works out well for R&D or more open ended positions, since you can have the flexibility to explore hunches without having to justify a whole project around it.
Unfortunately this is the consequence.
Sure, I was a hobbyist in the 80s, programming in assembly language on four separate architectures by the time I graduated college in CS in 1996. But absolutely no one in my graduating class did it for the “passion”; we all did it because we needed to exchange money for food and shelter.
The people jumping into tech during the first tech boom were definitely doing it for the money, and the old heads I met back then programming on DEC VAX and Stratus VOS mainframes clocked in, did their job, and clocked out.
When I went into comp-sci, it wasn't the cool path to riches it's been marketed as for the past 15 years.
EDIT: you graduated around the same time as me. Sure, everyone wanted jobs. But there were easier paths to getting a good job in '95/96 than cramming late-night comp-sci work. Almost everyone in my class had grown up in the age of hacking around on Apple IIs, their first PCs, etc. No one just randomly ended up in the comp-sci part of the university because they just wanted a job to make money.
And most of the people in my class at my state school in south GA had their first exposure to computer programming in college.
I had a focus on programming languages in my undergrad studies and went into an academic R&D programming job, basically making tools for computational sciences. I've basically spent my whole career using and writing open source software.
I've definitely seen a kind of culture shift where the frontier nature of our old R&D culture is getting drowned out by boring process. To me, it is largely the influx of cybersecurity compliance that is killing the old culture. The bureaucratic compliance overhead is antithetical to the small team prototyping approach that drove progress during most of my career. It inspires a sort of cargo cult risk-management process that seems more about appearances and plausible deniability than actual secure systems.
Most developers from the beginning worked at banks, government, defense, etc., doing boring enterprise work. That has always been the case.
They weren’t doing “research” writing COBOL for banks and the government.
In 30 years, I’ve worked at 10 jobs for startups, boring big enterprise companies, and BigTech, and I’ve been working in consulting for five years (3 of those working full time in AWS’s consulting department), working with developers from startups, enterprises, and government.
I think my exposure to a wide swath of the industry is a little bit more than working in California for 30 years…
Even when I was younger and single, all of us would hang out after work - males and females - and go to the bar, the club, the strip club (yes, the women too - it was their idea; they were mostly BAs and one programmer), and just really enjoy the money we were making. We were all making $50K-$80K back when you could easily get a house built in the burbs for $150K-$170K.
Even as I got older and changed jobs in my mid 30s, by then my coworkers were mostly involved with other hobbies and our families. We weren’t even thinking about computers after work.
But, I do think HN has a cultural fixation on the fabled Silicon Valley experience. This includes an attachment to (nostalgia for?) the old university/startup axis. This used to be a more fluid exchange, rather than just the regular enterprise hiring pipeline.
The tech startup scene was a thing between the mid 90s and 2000 with the first dot com boom, and even most of those companies - outside of hardware companies like Sun, Cisco, and Intel, and software at Netscape - were not doing cutting-edge-for-the-time development. The startups were throwing dumb (or sometimes premature) things at the wall with no business plan, backed by VC money. The people at startups were definitely there for the money. No one had a “passion” to deliver pet food or groceries (Webvan) or early web advertising.
Tech before the mid 90s was mostly programmers building software on mainframes, doing boring things.
You had the dark days between 2001-2008 before mobile, web apps, SaaS and high speed internet took off where all most people really could find were boring enterprise jobs. Back then, I was living in Atlanta and the dot com bust didn’t affect the local market at all. The banks, the airlines, the credit card processing companies were hiring boring Microsoft devs and Java devs like crazy.
I’ve just seen it too often: 90% of the devs who spend their careers at boring old enterprises have no idea what it’s like for the top 10% working at BigTech and adjacent, while at the same time that 10% can’t fathom the fact that there are “Dark Matter developers” living their lives in tier 2 cities, with their big houses in the burbs, treating a job as just a job, the way most people always have.
https://www.hanselman.com/blog/dark-matter-developers-the-un...
I’ve been on both sides (and now in the middle doing strategy cloud consulting). I had my first house built in 2003 for $170K and even my second house built in the “good school system” in the burbs of Atlanta in 2016 for $335K. We moved and downsized three years ago.
There was also a Small World effect where we kept in touch across these various R&D spaces. I don't mean to sound grandiose, but I think my cohort built up a lot of the open source that props up the current web world. I don't quite agree with the article this HN post links to, as I know a lot of that open source was written on salary. It wasn't all hobbyists in moms' basements. Whether we worked in government, university, or corporation, we had figured out ways to work on these things we wanted to work on and release to the world and be paid a wage to do it.
I do feel like our microcosm is dying out. I'm not sure if it is a net change where the tech world is reverting to just the dominant corporate tech you describe, or if there is some replacement microcosm bubbling into existence and I'm just out of the loop now.
And whether startup founders had passion or not, once they took outside funding, money was all that mattered.
Took me half a year to get them to value Sentry, lol.
I’ll just collect my check and go do something else.
Fighting the Product Manager, fighting the Designer, sometimes even fighting some micromanaging stakeholder that won't leave you alone.
These are definitely fights I can win, but do I even have the energy anymore? Development work involves more than meets the eye. While there are some folks who understand the technical intricacies, it's tiring having to join discussions you know won't go anywhere.
I wish it were 2000-2010 again, when my biggest problems were Sales promising features we didn't have and then having fun with the other devs coding them.
I used to have a lot more mental bandwidth and energy to be "curious" and to tinker once upon a time. But now the world is so literally and figuratively on fire and every executive is so rabidly frothing at the mouth over AI that now I just want things to "just work" with the least amount of bullshit so I can just go home on time and forget about work until the next business day.
I just want this fucked decade to be over already.
The world will not end if a project is delayed by a few weeks. You get time for your own tinkering (never tinker on company stuff, even if that would improve things, unless you are a shareholder).
Apart from maybe a few core infrastructure primitives at public cloud providers, most IT stuff today is an API calling an API calling an API, and so on.
It will stay that way until humans are out of the loop for most IT work.
For me it feels like I have to spend all day fighting with folks who are constantly “holding it wrong” and using libraries and frameworks in really weird/buggy ways, and who then seem completely uninterested in learning.
In my free time I love working on my own projects because I don’t have to deal with this bullshit and I can just craft things finely without any external pressure.
Honestly some of my best jobs were at places that had a nicely balanced practice in place and the backbone to remind execs that if they interrupt makers daily with new shiny asks they will in effect get nothing (because nothing would ever be completed)...
But obviously we can both have worked at places with those labels with vastly different implementations and thus experiences :)
Any non-small company has plenty of people that need to justify their salaries.
Meetings are one of the most effective ways to pretend to be working.
Final straw for me was RTO. Silently quitting and getting my ticket punched (laid off) was the best thing for me.
Subsequent governments turned the profession into a captive market, where you can only realistically work for corporations who fix the wages by following so-called "market rates", and you cannot create your own job if you disagree with the rates.
Jobs in London pay less than peanuts and if you earn six figures in the UK, income tax takes half of it anyway even if you go to FAANG.
The average university grad would be better off in law/finance/medicine by income in London. This isn't to say the top software devs don't get paid a lot, but they're a minority compared to the legions of highly paid people in finance in London and the surrounding industries.
Many consultant friends and business owners I know have moved away from the UK to low-tax areas in Portugal or the UAE.
I got started in the 1980s, and super-curious and technical people were the norm. We were incredibly strongly attracted to computers.
The first real growth in computers in that kind of era was Wall Street and banks. Wall Street in particular started paying huge bonuses to developers because it was clear that software could make huge piles of money. Then we started seeing more people joining in for the money who were not necessarily passionate about technology.
Then came the dot com era and bust, and then the rise of social media, FAANG, and absurd corporate valuations allowing ridiculous total comp to developers, and the needle moved even more towards money.
The net result is the curious and the passionate are still here, but our numbers are massively diluted.
I come to places like here to find that passionate niche.
This has been happening since the 2008 financial crash: a lot of people would have normally gone into careers on Wall Street, but the shrinking Wall Street job market led them into tech as a high-performing, decent-paying career instead..... (U.S.-biased opinion, of course)
That's not entirely true. We (society, definitely the US) pushed going to college HARD for the last 3-4 decades, glamorizing how much money you'll make. Now we have an overabundance of people with college degrees and thousands of dollars in debt from those degrees.
There's plenty of career paths where you could make decent money that don't require a college degree.
We should have been pushing people to figure out what they wanted to do, not "Make lots of money", and figure out the path that gets them there.
The sad reality is that "everyone learn to code" was by and large a marketing distraction from the severe structural unemployment the fast-and-loose economy is in. No, a coal miner can't just learn to code and get a job in WV, certainly not 1,000s of other miners in the same position, nor can the millions of people that corporations laid off over those same decades.
Coding was a way out of poverty, but for most people it was just a distraction to keep them from seeing how bad the economy is.
Americans are poor: PNC Bank's annual Financial Wellness in the Workplace Report shows that 67 percent of workers now say they are living paycheck to paycheck, up from 63 percent in 2024. https://www.newsweek.com/2025-rise-americans-living-paycheck...
It's wild how this site has turned into reddit over the last couple years.
I interviewed many people from top universities and they absolutely scream "I couldn't care less about the field, I'm just here to maximize the compensation".
At the same time I get 19 year old self taught kids who are miles better at programming, learning and are genuinely passionate.
My first trip through college I studied business, and then the economy collapsed. Most people my age eked their way through menial jobs (like me) and survived, found a way to break through, or (like me) went back to school years later when the economy improved to try to find another opportunity. For me the choices were CS or nursing at that time, and I have always been good at math and with computers, so I chose that.
I wouldn't say I ever "loved" development, especially not the current corporate flavor of it. I've had some side projects when I get time and energy. But there's never really been a point in my life where I could ever have afforded getting the level of expertise I possess now just for the "curiosity" of it. Not everyone has a trust fund or safety nets.
It's just that the bar is so high now, so much competition, so many cargo-culting startups that only do bad leetcode interviewing.
It's very hard to both find and get hired at places that want more than a coding monkey to just blindly move Jira tickets.
Here's an example from my perspective.
Recently, while developing a way to output a PAL video signal from two digital lines (an endeavour obviously driven by curiosity more than utility), I learned a great deal more than I would have if I had not used AI. I wasn't blind to what the AI was emitting. It helped me decide upon shifting one output to 0.285V and the other to 0.715V, and I had it write a program using PyTorch to learn a few resistor/capacitor values to smooth out the signal, along with a sample of 2-bit data that, when emitted through the filters, produced a close sine wave for the color burst. AI also enabled me to automatically emit SPICE code to test the waveform. I had never used SPICE before; now I know the pains of creating a piecewise linear voltage source in it.
Yesterday I used an AI to make a JavaScript function that takes a Float32Array of voltage-level samples and emits a list of timings for all parts of a scanline, along with the min and max voltage levels; it calculates the position and frequency of the color burst and uses the measured color burst to perform the quadrature decoding, producing a list of YUV values at a lower sample rate.
This should let me verify the simulated waveform so that I can compare the difference between what I intend to emit, and what should be a correct signal. If there turns out to be a discrepancy between what I am emitting and what I think I am emitting, this will come in quite handy.
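Roughly, that decoding step boils down to something like the minimal TypeScript sketch below. The fixed burst window, the nominal subcarrier constant, and the crude box-filter low-pass are illustrative assumptions on my part rather than the actual generated function, which also has to locate the sync and burst itself and handle PAL's line-by-line V-phase switch:

```typescript
// Minimal sketch: quadrature-decode the chroma of one PAL scanline.
// Assumes `samples` already holds exactly one scanline of voltage samples
// and that the colour-burst window is known. Names and constants here are
// illustrative; PAL's line-by-line V-phase switch is ignored for brevity.

const SUBCARRIER_HZ = 4433618.75; // nominal PAL colour subcarrier

interface YuvSample { y: number; u: number; v: number; }

function decodeScanline(
  samples: Float32Array,
  sampleRateHz: number,
  burstStart: number,   // index where the colour burst begins
  burstLength: number,  // number of samples in the burst
  outRate = 64          // YUV samples to emit for the scanline
): YuvSample[] {
  // 1. Estimate the burst phase by correlating the burst window against
  //    sine/cosine at the nominal subcarrier frequency.
  let sinAcc = 0, cosAcc = 0;
  for (let i = 0; i < burstLength; i++) {
    const t = (burstStart + i) / sampleRateHz;
    const w = 2 * Math.PI * SUBCARRIER_HZ * t;
    sinAcc += samples[burstStart + i] * Math.sin(w);
    cosAcc += samples[burstStart + i] * Math.cos(w);
  }
  const burstPhase = Math.atan2(sinAcc, cosAcc);

  // 2. Demodulate: multiply by phase-locked sine/cosine, then low-pass by
  //    averaging each output window (a crude box filter).
  const out: YuvSample[] = [];
  const win = Math.floor(samples.length / outRate);
  for (let o = 0; o < outRate; o++) {
    let y = 0, u = 0, v = 0;
    for (let i = 0; i < win; i++) {
      const idx = o * win + i;
      const t = idx / sampleRateHz;
      const w = 2 * Math.PI * SUBCARRIER_HZ * t - burstPhase;
      y += samples[idx];                   // luma: plain average here
      u += 2 * samples[idx] * Math.sin(w); // chroma, U axis
      v += 2 * samples[idx] * Math.cos(w); // chroma, V axis
    }
    out.push({ y: y / win, u: u / win, v: v / win });
  }
  return out;
}
```

A real decoder would band-pass the chroma and notch the subcarrier out of the luma instead of plain averaging, but for eyeballing a simulated scanline against the intended one, something this simple is usually close enough.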
Perhaps I might learn more if I did all of this myself unaided by an AI, but it would also take much longer, and likely not get done at all. The time I save writing tests, wrangling libraries and software is not stored in the bank. I used that time too, doing other things, learning about those as I did so.
They want to produce something without having the skills to produce it. Which, you know, probably isn't uncommon. I'd love to be able to rock out the guitar solo in Avenged Sevenfold's "Bat Country" [0] or "Afterlife" [1] or the first solo in Judas Priest's "Painkiller" [2], but to get to that skill level takes years of practice, which I'm quite frankly not willing to put in.
The difference is the honesty. A vibe coder produces something barely more than "Hello world" and brags about being able to produce software without learning to code. Nobody grabs a guitar, learns two chords, then claims to be a guitarist.
[0] (mildly nsfw) https://youtu.be/IHS3qJdxefY?t=137
That's off by a large factor.
What I could find quickly was an estimate that the top 400 own a little over 4%, not 50%.
What dampens the spirit is the same as for everyone - a treadmill you cannot get off, and punishment for independent thinking.
Dev culture is not one thing that is found in dozens of companies - dozens of companies have their own culture - and if that is a curious and empowering culture you have curious and empowered devs, and salespeople and operations and chemists and …
Culture is what we make it
You won't tour for long as a one-hit wonder, and I think what's being said by the OP is quite similar.
I think it depends on the circles you're in. For example, I see a lot of interest in the "Handmade" way of doing things, largely inspired by Handmade Hero. Almost feels like a comeback of what you consider to be dying. There are people who are interested, but one needs to look for them. I recommend it.
That wave feels definitively over now, making mobile apps in 2025 is much like doing WinForms in 2003. Hopefully something new will come along that shakes things up. In theory that's AI but as a developer I find AI tremendously unsatisfying. It can do interesting things but it's a black box.
For me personally... I'm older and married with kids. My free time is so much more valuable than it was back in the day. I still try to be a curious developer but at the end of the day I need to get my work done and spend time with my family. There's enough of a financial squeeze that if I did find myself with an excess of free time I'd probably try to spend it doing freelance work. So whenever this next wave does arrive I might not be catching it.
SWE culture was very different in a low interest rate environment. Teams were over staffed. No new tech came around for a long time so everyone was focused inward on how to improve the quality of the craft. At my big tech company some teams had 2-3 people dedicated to purely writing tests, maintainability, documentation, and this was for a <1m MAU product.
Then boom free money gone. Endless layoffs year over year. Companies pushing “AI” to try and get SWEs to deprecate themselves. It’s basically just trying to survive now.
That wizard that used to nag everyone about obscure C++ semantics or extremely rare race conditions at distributed scale has disappeared. There’s no time for any of that stuff.
Like all cultures, this was all performative. People astutely observed how to say and care about the things that they saw the people above them saying and caring about, and mimicked them all the way to promotions. That doesn’t work anymore, so that wizard culture is long gone.
There's still people taking on new frontiers... even if you don't love crypto (and I don't!), a lot of very curious developers found a home there. AI is tougher (due to the upfront costs of building a model), but discovery is still happening there.
I don't think curious developers are gone... there's just an increase of un-curious developers looking for a paycheck. You just have to look harder now (although I think it only seems like we had a cohort of curious devs because we're looking at it in hindsight, where the outcomes are obvious).
TFS was introduced in 2005 for Microsoft shops for instance.
I'm rather envious of kids today who have access to Google, Wikipedia, YouTube, and (with caveats) ChatGPT when they're truly interested in a topic. They can dive a lot deeper than I had the opportunity to without bringing in adult assistance.
"Can" is a critical word in your comment.
But after acknowledging this you just said that most people want to be lazy. Which is something I was actually agreeing with. Though, I guess I should add that I don't think people are doing that so much because they are lazy by nature but rather that they are overwhelmed. There's definitely a faster pace these days and less time given to have fun and be creative. One might say that it's work and work isn't meant to be fun, but this is mental work and critically, it is creative work. In those domains, "fun" is crucial to progress. It is more a description of exploration and problem solving. If we're all just "wanting to go to the pub and watch TV" (nothing wrong with that) then we'll just implement the quickest dirtiest solution to get things done. I think this can work in the short run but is ineffective in the long run. A little shortcut here and there isn't a big deal, but if you do that every day, every week, every year, those things add up. They are no longer little shortcuts, but a labyrinth. Instead of creating shortcuts, people are actually just navigating this labyrinth they made. It's the difference between being lazy by sitting on the couch all day and being lazy by "work smarter, not harder."
My main concern is with the environment we've created these days. It's the "free time" that davidw mentions. As several people have mentioned, things shifted towards money. My honest belief is that by focusing too much on money we've actually given up on a lot of potential profits. Take Apple. Instead of thinking different and trying new things, they've really mostly concentrated on making things smaller and thinner. That's good and all, but honestly, I'd more than happily have a thicker laptop to give my screen and keyboard more clearance. It's just so fucking difficult to avoid those smudge marks on my screen. We take fewer risks because we're profit-focused. The risks just seem far riskier than they are. We've created walled gardens, which undermine what made the computer and smartphone so successful in the first place! (The ability to program them!) We hyper-fixate on spreadsheets. We dismiss our power users because they are small in quantity, ignoring the role that they play in the ecosystem. In every ecosystem it is a small portion that does the most. Everything is just so myopic.
I'd agree that all these problems existed to some extent. But the difference now is scale. I think what changed is the population of the developers. In "the old days" there was a good balance between the coding monkeys and business monkeys, where they pushed back against one another. The business people couldn't do it without the coders and the coders could do it without the business people, but were more effective with them. But I think these days the business monkeys just took over and dominated. The paradigm shifted from wanting to build good products and being happy to get rich while doing so to focusing on the getting rich part. We lost the balance. I think people are still creative, but I think we do not give them room to breathe, I think we do not give them enough room to take their chances. In some ways things are far easier than they've ever been, but in many ways they're also far harder. So are we measuring success by the fact that we have multiple trillion dollar companies, something never seen in 2010, or are we measuring success by the number of products and technologies that have changed people's lives? We made Android, iPhone, Maps, YouTube, Twitch, WiFi, Bluetooth, and so much more in just such a short time. But in (more than) that same timeframe, what innovations have we made? There have been some good leaps, don't get me wrong, but even AI is only a small portion of that. During most of that time we saw more vaporware than actual products. For the love of god, there's bitcoin billionaires. Love or hate crypto, it hasn't changed the world in a huge way.
/rant
But since then, the Apple Watch has been an innovation both technologically and from a business standpoint. In 2010, I wouldn’t have imagined you could have a processor faster than the original iPhone’s, with WiFi, Bluetooth, GPS, cellular, satellite communication, and 32 GB of storage, in something that size with that battery life.
While I think the newly announced Meta glasses are ugly and don’t provide enough value for the money, it was risky and not just another social media platform to provide ads.
Gen AI is some real sci-fi shit that I wouldn’t have thought, back in 2020, would be as far along as it is.
Self driving cars are a real thing on the road right now. Even Uber itself is innovative and has made travel to different cities much better than dealing with taxi services. As much as I dislike Musk as a person, you can’t deny SpaceX and Starlink are game changers. All of the major tech companies are spending a lot of money on better custom processors and TSMC is doing some wild stuff on the manufacturing side.
The medical industry is also doing some life changing things.
Why could we make those great leaps then and not now? What changed?
But why nothing as groundbreaking? Physics.
At a certain point you come up against speed-of-light limitations and other sub-atomic interference that is way above our pay grade when it comes to producing smaller processors.
The industry keeps coming up with new techniques for wireless communications. But eventually you again come up against physics: certain spectrum is worse for communications, and there's the Shannon-Hartley theorem.
With cameras, you can only do so much in a phone form factor. You can also only do so much with current battery technology, heat dissipation vs power etc.
And a lot of the technology is “good enough”. There is only so much video and audio fidelity that you need to reach the limits of human perception.
What exact breakthrough technology are you looking for?
Probably it would have been possible to get something via inter-library loan, but I would have been 9 or 10, didn't know this was possible and didn't think to ask. The handful of topical books I obtained from parents and schoolfriends was a far cry from just scrolling on your phone to the information you want.
It's not better in every way. But it is better in some ways.
MySQL was available for free in 2000 and anyone could download any number of language runtimes for free like Perl and Java. If your corporate overlords weren’t cheap (or you were in college) an MSDN subscription was amazing.
JavaScript has been around for decades. But jQuery made it so much easier, and then React built on top of that even more. And jQuery wasn't the first DOM library, nor was React the first framework – but both were where it seemingly clicked between ideas, usability and whatever else made them successful.
(I will agree that Microsoft had a run of things where anyone who bought in to their ecosystem had a lot of things that worked well together.)
Especially today while the IDEs are free, people are paying for LLM coding assistants.
Not to mention the dark days of Windows GUI development. How exactly is Vim better than a modern IDE?
[1] Yes I know all bets are off when you are using reflection.
We (i.e., people who do not have a safety net) do not have this luxury of experimentation and curiosity that you people had in the 1990s. Boomers and leaders using shitty Reaganomics policies have decimated our safety nets so much that experimentation is now a luxury for the rich and powerful.
Cost of living is higher than ever. Inflation is higher than ever. We are handcuffed to this shitty system in America called “private health insurance.” Get sick? No job? You are fucked m8.
The risks of "curiosity" are much, much higher than they were during your time, buddy.
I was young and didn't have many responsibilities then, and lots of free time. Now I'm a dad with a mortgage and an interest in local politics because I want to 'leave it better than I found it'.
All that said... I do think there have been some shifts over time. I grew up in the era of open source taking off, and it was pretty great in a lot of ways. We changed the world! It felt like over time, software became mainstream, and well-intentioned ideas like PG's writing about startups also signaled a shift towards money. In theory, having F U money is great for a hacker in that they don't have to worry about doing corporate work, but can really dig into satisfying their curiosity. But the reality is that most of us never achieve that kind of wealth.
Now we find ourselves in a time with too much concentrated corporate power, and the possibility that that gets even worse if LLM's become an integral part of developer productivity, as there are only a handful of big ones.
Perhaps it's time for a new direction. At my age I'm not sure I'll be leading that charge, but I'll be cheering on those who are.
It's certainly true that IT has grown vastly since those good old days, but there has always been a proportion of people who're just... not that interested in what they're doing. For example I remember being mildly horrified in around 1998 that a colleague didn't know how to run his compiler from the command line; without an IDE he was lost - but I doubt he was the only one.
Meanwhile the idea that there's a dearth of cool new stuff seems quite quaint to me. There's a whole bunch of cool things that pop up almost daily right here on Hacker News². Just because they haven't spread to ubiquity doesn't mean they're not going to. Linux was not mainstream right out of Linus's Usenet announcement - that took time.
As to corporate power? They ebb and flow and eat each other (Data General, Compaq, DEC ... remember them? Remember when Microsoft was the major enemy? Or IBM?)
¹ https://en.wikipedia.org/wiki/Good_old_days
² Edit: Not to mention, there's also a whole bunch of crap that's not very interesting. But survivor bias means we'll have forgotten those in 20 years time when we're surveying this time period; as Sturgeon's law reminds us, "90 percent of everything is crap."
It just feels like "it's a job" is more of the zeitgeist these days.
And yes, I'm also well aware of what came before 'my time' - mainframes and such were definitely an era where the power was more with the large companies. One of the reasons Linux (and *BSD) was so cool is that finally regular people could get their hands on this powerful OS that previously was the exclusive purview of corporations or, at best, universities.
As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But if you're looking for that spark and excitement again, you need to get back out to the frontier. One frontier that is particularly exciting to me is using AI to speed up the tedious parts of the development process, and to tackle areas where I don't have specialist knowledge. Similarly to how Linux opened up a powerful OS to individuals, AI is enabling individuals to create things that would have previously required large teams.
Perhaps over time it'll get efficient enough to run outside of huge companies; that could be an interesting aspect to keep an eye on.
Though certain novel uses could lead to new individuals or entities gaining power.
I'd like to be hopeful and would like to hear good arguments for how this could happen - but it seems to me improved technology on the whole leads to increased concentration of power - with exceptions and anomalies but that being the dominant trend.
It was about how only big companies have the resources to make big computers that take up a whole room that are powerful enough to run smart AI models.
But if tech progress is any indication, in say 50 years or probably less, we will have the power of a modern-day datacenter in our pockets and be able to run smart AI models locally without it being a large-corp monopoly.
In all seriousness though there’s plenty of room for improvement both in current models and hardware.
> it seems to me improved technology on the whole leads to increased concentration of power
Which is why we are dominated by IBM, AT&T, Kodak and Xerox.
Or, you know, if AI is the mainstream hotness or just doesn't float your boat, look for what the iconoclasts are up to and go dive into that, not whatever the VCs are flinging their gold at today.
But... they're still there. They're a little diluted, but I've not yet worked somewhere where I had no like-minded tinkerers amongst my colleagues. I don't think I'd want to, but it just hasn't come up.
> As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But the free software movement dates back to the early 80s, not the 2000s that we're talking about. Open source itself was being seen as a dilution of the principles of free software in the late 90s/early 2000s. More to the point, free and open source software is still very much here - we're absolutely surrounded by it.
> mainframes and such were definitely an era where the power was more with the large companies
It's oscillated. DEC used to be the zippy young upstart snapping at IBM's heels you know. Microsoft didn't start out big and evil; nor did Google if it comes to that. Put not thy faith in shiny new companies for they shall surely betray thee once they devour the opposition... :D
edit: I hadn't scrolled down to https://news.ycombinator.com/item?id=45303388 when I wrote this
That isn't the case anymore. That sort of monoculture where everyone is reading the same stories, discussing the same topics, and reading about shared values and principles, is long gone.
That's a cheap dismissal. There's nothing wrong with "good old days" thinking if the old days were actually better.
>Meanwhile the idea that there's a dearth of cool new stuff seems quite quaint to me. There's a whole bunch of cool things that pop up almost daily right here on Hacker News²
Hardly of the breadth and ambition of the 1998-2012 or so period.
>As to corporate power? They ebb and flow and eat each other (Data General, Compaq, DEC ... remember them? Remember when Microsoft was the major enemy? Or IBM?)
Yes, and also remember when players like Sun did cool stuff in the UNIX space. Or when FOSS wasn't basically billion-dollar corporate owned wholesale, with mere corporate employees being the majority of contributors and IBM, Oracle, Google and co running the show. Even RedHat was considered too corporate and now it's IBM...
Hits so much harder as a middle aged adult than when I saw it on tv ~2 decades ago.
Old man yells at cloud services
But most of the people I went to uni to study computer science with at the end of the nineties were there for the money. Even back then it was all about money for most programmers.
And then there is a generation that grew up knowing that there was money in computers, so many of them learned to use them even if they didn't care about them per se. This generation also contains many hackers, but they are surrounded by at least 10x more people who only do it for money.
Twenty years ago, most programmers were nerds. These days, nerds are a minority among the programmers. Talking about programming during an IT department teambuilding event is now a serious faux pas.
Then again, I did spend some time in e.g. Lisp and Haskell just for the heck of it. And there are still plenty more unsolved problems outside of the mainstream today.
You can't keep that curiosity and at the same time see one of the most wonderful and awe-inspiring technologies of the last decades as something threatening.
I lamented when my career first started (2000 or so) that there were devs I worked with who didn't even own computers at home. While my bookshelves were full of books I aspired to learn and my hard drive was full of half-baked projects, they clocked out and their thinking was done.
I still know a few of those now 25 years after the fact. Some of them have made a career out of software. But they never got curious. It was a means to an end. I don't begrudge them that. But as someone who is internally driven to learn and improve and produce, I can't relate.
My primary frustration today is how many of my software peers are satisfied with updating a Jira status and not seeking to build excellent software. I've seen it at all levels - engineers, managers, and executives. I'm actualized by shipping good, useful software. They seem to be actualized by appearing busy. They don't appear to deliver much value, but their calendars are full. It has me at my professional wits' end.
Truth be told, the phenomenon of appearing productive without being productive is an epidemic across multiple industries. I've had conversations with people in manufacturing and agriculture and academia and they all echo something similar. Eventually, Stein's law indicates that the productivity charade will end. And I fear it will be ugly.
I have never in 30 years written a single line of code that I didn’t get paid for except a little work I did for charity.
And plenty who are not, it takes all kinds.
It's a matter of taste and still all tastes may not be satisfied anyway :)
For years I was a part time fitness instructor and runner. I loved hanging out with friends, being in front of people, meeting them at races and us training together. It’s completely different than being at a computer at home - after working all day on one.
You expect someone who writes software for 8+ hours a day professionally to go home and do more of it for fun?
Those who are interested in doing that are free to do so, but most people have more than 1 interest or would like to be compensated for the additional hours they are effectively working in their profession.
Not that one or the other is a less-respectable approach.
Sorry if my text can be misinterpreted so easily.
This is hardly a new phenomenon. Dilbert and its ilk have been lampooning this since the 80s.
In another case, I had recently moved to a new city and we were targeting an internal proprietary platform (again with Windows NT) and also targeting Solaris.
There was a time when you would go to work and you would be working with header files and libraries that were proprietary and for which your company was paying an exorbitant per-head license fee.
IMO this is the part that the author is missing. Back in the 2000's, software development was a much smaller field and your main focus was the "curiosity pond" where all the developers went to tinker.
Now software dev has expanded into an ocean. That pond is still there but the author missed the pond for the ocean.
Where is this ocean? Is it that I have all these big pre-cooked components I can use to make SaaS spaghetti?
The total workforce has expanded dramatically over time, so even if everybody in the started-40-years-ago cohort remained alive and employed, those (now much older) people would still be a tiny minority among the bigger and bigger cohorts that kept joining since then.
Couldn't agree more. Like many, I've had my honeymoon phase with AI and have now learned what it is good for and what it is not. What it has truly been good for is satisfying the nauseating number of topics I want to learn about. I can spend $20 a month and drill down into any topic I like for as long as I like in an incredibly efficient way. What a time to be alive.
I think a lot of people have lost faith that technology can improve the things that they care about. Even open source doesn’t seem to have made much of a difference in preventing, well anything bad in the last few years.
If we want to have a better dev culture there has to be a reason for people to believe that the software they make is actually going to improve people’s lives and not just accelerate the profits of multi billion dollar corporations.
Web and the whole cloud/backend scene has become toxic because of the work culture around them. I know of a therapist on the west coast that has become completely snowed under by a surge of software developers claiming mental problems on account of their working environments, and she was in such disbelief that she was asking around if what she was hearing was possibly real. Other professionals simply would not accept what has been going on.
Omarchy, Bitchat, Ghostty, Crush.
None of those are chasing metrics. And that’s just off the top of my head.
I’ve been there, looking for pennies in the couch to be able to afford a burger while I waited for my next contract gig deposit. Even if your project doesn’t become the next big thing, you’ll end up with something to show in your resume. That will open tons of doors.
I am doing more side projects, and finishing more projects, and feel a much greater level of confidence in starting new projects since I feel more confident that I will get at least an MVP working. These are not commercial efforts, I am just tinkering and scratching my own itches.
I attribute this change to 3 reasons:
- Vibe coding helps me do parts of the tech stack that I used to procrastinate on (UI, css)
- Gemini helps me solve all the inscrutable devops issues that used to block me in the past.
- A great open source tech stack that just works (Postgres, docker, node, ollama....)
AI helping me with the above has allowed me to focus on the "fun" parts of the side projects that I do. And the UIs also end up being much prettier than what I could create myself, which gives me the confidence to share my creations with friends and family.
-- a 28 year old
There was time when being a software developer was not a particularly prestigious or well-paying job in corporations, or maybe a weird hobby of developing games for the toy 8-bit entertainment computers of the day. It was mostly attracting people who enjoyed interacting with computers, were highly curious, etc.
Then there was a glorious time when the profession of software engineering was growing in importance by the day, hackers became heroes, some made fortunes (see e.g. Carmack or, well, Zuckerberg). But this very wave was the harbinger of the demise: the field became a magnet for people who primarily wanted money. These people definitely can be competent engineers! But the structure of their motivation is different, so the culture was shifting, too. Now programming is a well-paid skilled trade, like being a carpenter or a nurse.
If you want hacker ethos again, look for an obscure field which is considered weird, is not particularly well-paid, but attracts you.
In the past I made many mistakes, like pulling all-nighters because I found a way to make the checkout experience more pleasant. That resulted in a massive increase in revenue, and none of it benefitted me. Or unblocking another team who couldn't figure out why their app would randomly crash. The board was panicking because the client was going to pull out. I saved the day; the multi-million contract went through. "Thank yous" didn't help me pay off debts.
Only be curious for your own stuff. For corporations? Do bare minimum.
As far as knowing more: the best way to get promotions, raises, and job opportunities is via networking, the ability to market yourself inside and outside of the company, and soft skills.
Well the best way to make more money is to work for companies that pay more money - ie BigTech and adjacent [1] - and then learn the politics of promotions.
[1] Yes “grind leetCode and work for a FAANG” (tm r/cscareerquestions)
Luckily though, none of those places would ever even look at my resume.
I would hope there to be a healthy medium between "pulling all nighters" and "Do bare minimum" -- perhaps somewhere where we all try to do our best, but don't push ourselves too hard for no reason? I mean, that's more reasonable than imagining we'll one day overthrow our corporate overlords. Probably, I'm naive and idealistic. But I can't help but feel like the result of apathy is not satisfaction.
That isn't reality, however, and so most of that energy is consumed by my day job, and it feels wasteful to put what little remains into projects that have little chance of any practical return. Any time I start settling into work taken up out of pure personal interest, the "responsible adult" part of my personality starts stratching at the back of my mind and pushing me to go do something more productive.
Such is life.
Arguably there might be more curious tinkerers nowadays, but they might represent a smaller slice of the pie.
Maybe only possible once you could finally own a whole "system" single-handedly and do whatever you wanted, for the first time ever.
Perhaps the fundamental concepts of "owning" your own system and doing whatever you want with it have been allowed to dwindle so badly that it seems like there's no comparison.
There's a layer of pessimism to engineers and Hacker News that has been steadily growing (as, I assume, the average age increases).
To me it's hit a critical level, and I have to disregard most negative comments here by default because I can no longer trust the average commenter to have a productive balance of optimism and curiosity.
----
On a different note, the point the author is trying to make is massively undercut by the ad spam all over their page.
It was so grotesque (and out of character for a dev blog) that my first assumption was that I had a malicious extension somewhere.
Ownership, royalties, voting would be embedded in a block chain. Proof of work would be by vote. And votes given for proof of work. Or something like that. In music they have "royalties" and it seems like that could be used for contributors.
If you would like to be part of a discussion send an email to my firefox relay 3tdm026f9@mozmail.com
Feel free to use a relay.
I've already seen how people scratch other's backs in peer feedback during performance reviews, and I've heard plenty of description of negative aspects about promotion-oriented behaviors driving what people work on at companies notorious for that kind of stuff. Not to mention all the actual biases pervasive to the "meritocracy" crowd.
In modern society, if you're not trying to monetize all of your hobbies and every little thing you do you are seen as doing something wrong. Everything has to be a hustle these days. You're not allowed to do things simply because you enjoy it.
Now it's on github, and if you don't get enough followers or forks or it's not in a popular language or framework or you haven't updated it recently enough it's seen as a "dead project" or a failure. A project can never be "done" because then it's dead. That's demotivating.
Social media damages everything it touches.
The most social coding I've ever experienced was Bukkit, the old Minecraft server thing. I was noob in high school, made plugins for little things I wanted, people installed them, they gave good/critical feedback, I learned, it was great.
Personally, I am excited that AI is steering people who aren't actually interested in tech away from it. Reverting to the mean a bit. And like the downvoted comment below, I actually think a swath of "vibe coders" are much more in line with the hacker mindset than most developers. A lot of them are the "make a quick buck" types, but there is also a ton of insane tinkering going on, which is awesome.
But maybe we are talking about two different things. There is a distinction between "I want to hack on this to see how it works" and "I want to hack on this to see if this IDEA works". So product hackers are ascending while engineering hackers are starting to dwindle.
It reminds me of the shift in car culture when car computers meant you couldn't just rebuild a rusty car over a summer but a new culture of car hackers bubbled up mostly around modding cars for drifting or whatever. The people were different, the work was different, but the curiosity, excitement and subculture grew into something very similar.
This may be hitting developer culture hard but it's much broader than that.
We used to have to hack things together because nothing worked. There was no consistency, standards were all over the map, software solutions for most things didn't exist, and running software on the major vendor ecosystems was heavily silo'd.
Dozens and dozens of technologies changed that. Web protocols and virtual machines broke siloing. Search engines and community forums made discoverability much, much easier. We passed the tipping point where hardware was only valuable if it could be connected to an ecosystem, so engineers started building standards like USB, wifi, bluetooth, and a TCP-accessible interface into everything. And an army of open-source hobbyists wrote hundreds of thousands of libraries to "X but in Y."
So hacking itself has moved away from problems like "get a telephone multiplexer to translate a bitstream to colors on an analog TV" and towards "What nine libraries and five gadgets will you glue together to" (for example) "let your refrigerator track your shopping list," or "How can you make setting up email not feel like hacking your left arm off for the average non-computer person?" Because those are the kinds of problems that are still unsolved.
It's a different kind of hacking requiring curiosity at a different level and sometimes a different problem-solving skillset (less experimentation, more curation and cataloguing).
But these things are ignored by most people anyway. I guess my problem with the article is that it’s a bit confusing. It started talking about people not being curious and using tech they hate, then later he’s annoyed that people like to experiment and tinker with new frameworks. Then later it’s talking about some other thing, and there doesn’t seem to be anything tying all of the author’s complaints together. It’s as if it’s just freestyle rambling instead of trying to get to a point.
I think there needs to be a distinction between artist and artisan. Art exists for its own sake; code exists because it's useful. I don't want code that reads like poetry, I want code that works so I can read actual poetry later.
> Have a project in mind that you’ve always wanted to tackle but it never made sense to you to do it because it would never be used by anyone else or it would never make you any money?
I appreciate the tinkerers and hobbyists; software is endlessly interesting as a career, and I'm thankful to be here. But I only want to build code that is useful.
And anyway, how useful is your code, really? I will not generalize or make assumptions, but you’re also not going to tell me what it is, right? So scrutiny for thee, but not for me?
And if it’s like, “I make Dagger wrapped implementations 17 layers deep in a Google product you’ve heard of”: by now you should know that the thing sincere people say about insincere people, “We watch what Hollywood says is good,” applies to shit that Google, Apple, Amazon and all these super high paying job companies do too. If you are conflating many users with useful, that’s the problem. Facebook, TikTok and Instagram could vanish tomorrow, and literally nothing meaningful would be lost.
Is “useful” to you, “everything that I do is useful, and everything I don’t do, maybe”? You don’t get to decide if your POVs are reductive. They just are.
I appreciate exposing yourself for a contrarian point of view, noble if fatally flawed.
It is very impressive (in a disheartening way) how easy it was for The System to convince us to constantly spy on each other “for our own good”.
People play and tinker when they feel that they are in a secure enough environment to fritter away time without feeling like they've incurred risk by doing so.
Given the state of the climate, economy and politics today, I think a whole lot of people feel a whole lot less secure. When I look back at recent US history when there seemed to be the most innovation going on, it was the 90s after the fall of the Berlin Wall and before 9/11. That was probably the "OK-est" a lot of folks in the US felt in their lives.
You might rightly point out that people are wasting lots of time these days, staring at screens, binging TV shows, re-reading giant sci-fi and fantasy series. That's true. But there's a big difference between wasting time escaping the world versus "wasting" time creatively engaging with it.
Are you still trying for remote? A few years ago when I was down bad and RTO was starting, I found that remote was near impossible to negotiate for “normal” devs even with such extreme concessions.
I personally love the craft, but battle the entrepreneur in my brain telling me not to waste time learning things that won't bring tangible value.
Most of my curiosity is tempered by how it can make me money.
I do appreciate that 50-70% of the boring work can be done with AI agents now. As long as you know enough to have opinions and guide the process along, it can be helpful.
Expertise and learning don't seem to be AS important with the upcoming gen of developers. However, there is also so much out there now that it would be much harder to start from zero as opposed to being there from the beginning.
But I think Gen X and Millennials were probably peak interest and curiosity; now it's just a job for the later generations.
The curiosity hasn't disappeared from the culture, but it might not be brought in to a workplace anymore.
I think a lot of us have stopped bringing the tinkerer itch to work.
Outside of the workplace, there's an entire parade of tinkering by folks who at best post it on Youtube, not here (I watch "Stuff Made Here" for the code).
Of all the events of the past decade, the worst hit to the tinkering visibility has been Github making personal repos private by default.
Mostly the folks who were like me still have pet projects, most of them will share their code but only if you ask because it is "Well, not as nice as it should be".
I've got hundreds of repos in my github, but there's a sharp fall-off in what's public (there's ~113 public and 180 private) right when that happened and I'm sure I'm not the only one.
The tinkering is more active than ever now with vibe coding tools, where I can draw out an SVG and then ask, "I want to use manim.py to animate this", and get something which is a neat presentation tool for describing how data moves between systems.
But is it worth showing you, when all the fun was in the making?
What if all I am likely to get is "So what?" as the only response? Wouldn't that make it less fun?
Since then devs got squeezed more and more so that nobody has any time for trying out stuff. Tech debt accumulates and nothing improves. When you have an idea, you have to submit a proposal to a review board which approves requests from politically connected people and rejects other requests because of other deadlines.
This development has taken out everything that I enjoyed about the job and am good at. Thankfully I am reaching retirement, so I am happy to leave.
Later it was just all scrum wannabe agile and task after task, while experimentation was phased out and engineers were even forbidden to explore other topics than the team lead wanted, because according to him those topics were not close enough to the job. Guess what, all good engineers left. Now that team lead left as well, and if they didn't learn anything from that experience, I am sure they will wreak havoc somewhere else.
The industry has been flooded with money-motivated people, raising the income of the curious (but not exactly marketable) engineer. Yes, those people who flooded in might be uninspired, loathsome buffoons (in the eyes of elite nerds). But it's also the opportunity for your hobby to be mainstream, to be encountered by those who likely never would have encountered it, and to not be denigrated for your skill with technology, etc.
I'm grateful for how software has progressed from IQ 160, to 140, to 100, to 95 segments of the populace. It means we're winning culture over. It means we're solving problems (including how difficult it used to be to engage with). We've made previously wildly difficult things be table stakes for todays app. (one trite example: long polling became websocket pushes)
We should be celebrating how mainstream we've become.
In 2000, at my very first job, was when I first met a developer who got into it for the money and not for the love. When he told us he picked computer science in college because it seemed like it was a good way to make a living, and a lot easier than law or medicine, the rest of us programmers looked at him like he had sprouted a second head. By 2010, people like him were the norm.
Who cares, though?
If you're a "curious" developer, the existence of a massive preponderance of incurious engineers who are in it for the money doesn't change who you are. It doesn't have to change how you see yourself.
Socially, there are more "curious" developers to connect with and learn from than ever before.
The downside is that people outside of the industry will draw conclusions about you based on their perception of engineers as a whole. Boring and mercenary.
But let's face it, in the eyes of most of the population, boring and mercenary is a step up from how we were perceived when it was just us nerds who were weird enough to enjoy it.
I could've been annoyed that everyone was doing it for the money while I really cared, but you know what, most people cannot afford to go to college for a low-paying career. And I took a high-paying job in the end when I could've easily done research or tinkering instead, so I can't complain.
Now if anything, I'm pissed off at the few coworkers who also care a lot and act superior doing stuff the "real coder" ways, actually don't get very much done, and hold others back.
Millions of people entered the field, many of them explicitly because they saw it as a good job opportunity. The average software developer is now a completely different person than they were ten or twenty years ago. Importantly, there has been a major shift towards people in India and other Asian countries, where development has been outsourced or where developers are hired from, as well as differences in who graduates from college. This is clearly reflected in the job market, which is getting more competitive.
It’s fundamental, if you want to find that pioneering spirit again you have to leave your comfort zone and go exploring somewhere off the map.
It's gotten significantly harder now that I have a toddler and my S.O. works, but I can't help myself from stealing time for it.
Imposter roles are jobs that are created working backwards from "job at company" to something that an individual can realistically claim they do at the company. They became prominent in the last tech bubble when there was a lot of wealth being created and people wanted to go work at places like software companies, where they could not realistically contribute.
"Product Manager" and "SCRUM Master" are just some of the imposter roles that you've probably encountered. When you scrutinize the existence of these roles, there is a swift and immediate backlash from people who's lifestyle and livelihood is at stake. Product managers will point to famous people at Apple called "product managers" to distract from the fact that the median product manager does not add value.
When an organization creates a role that subsumes all of the creative control, and fills it from a pool of entirely unqualified people, the product gets worse, and the industry gets less innovative. You're either an avid user of the software, or an avid builder of it, and if you don't fit into one of those groups, it's unlikely that you can make a software product better.
But then we get into another problem: just because somebody has a job title doesn't mean they fill the role that title implies. And if they do, it doesn't mean they do it competently.
> just because somebody has a job title doesn't mean they fill the role that title implies.
This is the exact problem. When more of the people holding the title "product manager" are incompetent than competent, it's not really a legitimate role.
It might be true that there are talented people out there who, when given creative control over a product, add value that pays for their salary many times over. The grift is the widespread belief that they can be found and hired with a job listing. It's difficult to tell who these people are; unless you know one through your network, you are more likely than not to be conned when hiring a product manager off the market.
A good comparison is to hedge fund managers. There are some people who can consistently perform better than the market, they exist. But it costs a lot to invest with them, and unless you know of one specifically, just searching for people willing to invest your money for you will not turn out well.
Most companies are better off just listening to their engineers and users; some of them will have a knack for product direction, and leadership can weight their input accordingly.
But somewhere along the way it became just a job title, and companies somehow manage to hire people fresh out of college to do this thing, which makes no sense and leaves them totally out of their depth, but forces them to do something to demonstrate to higher management that there's a point to keeping them around so they can still have jobs. I don't blame the people in these positions at all. The blame lies with the blind leadership that has no understanding of what the role is supposed to be, and doesn't see that you realistically can't just put out a job req for this: it has to be someone who has already worked as part of these projects.
At least I think this is what you're saying: that ideally product managers should be people who were engineers at some point and understand how the product they're managing works, as well as the customer's needs, not because they majored in business but because they've actually worked with that customer and have some experience meeting those needs.
It takes knowledge of the problem and solution domain to maintain a vision and lead a team. But in this case, the scarcity is with the problem domain. People who really understand what problem the product solves, and the best way to think about it, especially when their view leads the industry, are very hard to find. Too hard to build an organization around the implicit assumption of finding them.
My advice for organization building is not to have product managers, or even allow the term into your company. There are people with good product taste, they occur at some rate, and when you find them doing other things like writing software or supporting users you nudge them in the direction of curating a product vision and controlling resources to execute on it. Framed that way, the importance and hardness of the problem are very clear.
Oh, yeah.
The real grift management sold all over the world is the idea that your people should just leave after 3 or 4 years, because you can hire new people for the same roles and they will be much cheaper.
You can't hire people for roles. If you change the people, the roles will change, and all the bets are off on whether they will be functional roles or not.
Note that the author of the article is doing webdev, which by now ought to be as routine as using PowerPoint. It's rather embarrassing that it's not.
The maturation & industrialization of development means that developers no longer are puzzling out the world from first principles. We aren't evaluating each library that comes along to figure out how it might fit into our bespoke apps.
> You become a Next.js developer, a React developer, a Rust developer etc
React in particular is such a distinct strain of development. It is its own terra firma, solid ground upon which developers stand, only barely coupled to the underlying platform. Knowing the web itself is still enormously helpful, but there is such a huge engine at your back, doing so much work, that it is extremely hard to treat it as anything more than a black box. Even if you know the web very well, there is still a huge opaque engine between the code you write and the actual web being targeted.
We don't have the raw experience anymore to be broader developers. Compared with React, I think it's more reasonable to consider oneself a Rust developer, where one is still quite close to the metal, and where the only focus-narrowing is to a general-purpose language that's good for anything (especially with Rust being such a forerunner in WebAssembly!).
It seems like big companies are doing some great things with Web Components, but there's still so little broader attention, and little cultural energy for them. The lack is cyclical: there's scant developer culture around Web Components, and so scant Web Component acceptance and knowledge. It feels like such an opportunity for shared knowledge, for excitement, for figuring things out and making patterns, for a potential that is more grounded and closer to the firm earth, and one that obviously needs that curiosity and excitement. But it's React über alles, React on and on.
For those with ears, the old men always say the world is getting worse.
Who has time to work on free open source projects when your bills and groceries cost far more than they did 10 years ago? The kind of money that being a developer made me in 2015 was enough to pay all my bills, save money and buy cool stuff to experiment on for my hobbies. Now I make twice that much but I’m delaying paying my medical bills until they start leaving voicemails.
My spare time that I used to spend hacking and making things that other people could use is now spent trying to earn a little more, or away from work of any kind dealing with the stress.
I do agree that there has been a significant shift in developer culture because of business bros and hustle culture, but it's not nearly gone.
On the other hand, if that can cheer someone up...
* https://ladybird.org/
* https://github.com/gorhill/uBlock
* Zig rebuilt a C compiler from scratch: https://ziggit.dev/t/zig-as-a-c-and-c-compiler/10963
* Rust rebuilt the core *nix utilities from scratch: https://github.com/uutils/coreutils
I've also heard of a bunch of people making their own OS from scratch just to see how it works; heck, there are guides online (https://github.com/cfenollosa/os-tutorial ... 30k stars...).
So... cheer up! If people are building all that, I'm sure innovation and creativity are at least not dead everywhere.
I understand the author, but if these days developers are in it for the money, and they deliver and don't bring up stupid shit like this, I don't care much. Not everybody is in it for any kind of creative relief. Spending hours on Lottie animations or shaving milliseconds off a request is not for everybody either. We are lucky when we cross paths with people who care.
Mark Zuckerberg is a metric, given human form. He smokes meats because statistics show human males engage in meat-smoking behavior.
Ultimately I think it boils down to normie influx, people with no love for the craft getting in it for the money. That said I've seen grown "normies" approaching middle age catch "the bug" and get really into programming for its own sake. They rapidly develop tastes that align with my own.
And computing has always had its share of people whose interest in the technology was entirely in using it to meet business objectives, nothing more. Here's a video from 1975 that shows exactly what I mean. It's about the time-sharing facility recently added to IBM mainframes. Note the utter lack of imagination. Nothing is mentioned about the new things one could do with their mainframe with time-sharing. It's strictly about how much faster you can do the things you're already doing. Time and money saved. Numbers on a balance sheet. Seems stodgy to us, but that's how IT professionals thought in 1975.
That was not sustainable; the industry needs predictable employees, even if that means many more of them. "Industrialization" of a process consists of nothing more than splitting that process into smaller, simpler tasks: front end / back end / system ops / architecture, each split again by technologies, frameworks, languages, etc. Gone are the days when you could, and needed to, know everything.
The workforce increases massively, but since workers no longer need 10 years of intensive practice before being useful, they are also cheaper and, most importantly, again: the whole process becomes predictable. You can replace a worker without jeopardizing your business.
The same process happened to many crafts during the industrial revolution, and it spawned similar culture wars, with the old guard of craftsmen lamenting the poor quality of the industrial output. Maybe Stallman will be remembered as the Proudhon of our times?
Our times may be a bit more epic, because we were not only craftsmen, we were building a new society, or so we thought. Computers, being machines of logic, would help us become more rational; being accessible, universal means of production, they would blur the distinction between consumers and producers by making everyone a producer; and worldwide networks would turn our divided societies into a global village. Well, in just a few years the opposite happened: the machines locked consumers into walled gardens, greatly reinforced the power structures in place, and made us more divided than ever.
Twenty years from now, very few people will remember how free and powerful we have been.
Like my algorithm for drawing lines as fast as possible in 68000 assembly on my ST. Then a few years later, I learned that someone else had invented it almost 30 years before me, and I was able to put a name to my algorithm: Bresenham. Dammit! The Amiga had it natively too. Dammit again!
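(For anyone who never ran into it: here's a rough, minimal sketch of the idea in Python, nothing like the 68000 original, but it shows the trick of integer-only error accumulation, with no multiplies or divides in the inner loop.)

    def bresenham(x0, y0, x1, y1):
        """Yield the integer pixels of a line from (x0, y0) to (x1, y1)."""
        dx = abs(x1 - x0)
        dy = -abs(y1 - y0)
        sx = 1 if x0 < x1 else -1
        sy = 1 if y0 < y1 else -1
        err = dx + dy                 # running error term, integers only
        while True:
            yield x0, y0              # plot the current pixel
            if x0 == x1 and y0 == y1:
                return
            e2 = 2 * err
            if e2 >= dy:              # step in x
                err += dy
                x0 += sx
            if e2 <= dx:              # step in y
                err += dx
                y0 += sy

Calling list(bresenham(0, 0, 3, 1)) gives (0,0), (1,0), (2,1), (3,1), which is exactly the staircase of pixels you'd rasterize for that line.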
Again, I invented preemptive multitasking on my 68000. Interrupt, little beast. Vector branch to my task switcher, thanks... save registers, including SP... restore registers from another previously interrupted routine, including SP... write the PC to switch and off we go. A fucking idea, a simple idea. That was cool to see it work, to see the 0 index color changed by each task every 10 scanlines or so. Made me smile. And then a few years later, I learned the word preemptive, and the concept of multitasking that goes with it. And I learned that the Amiga OS (that bastard rival again!) already did it natively. And other machines long before it. DAM-DAM-DAMMIT! I was born too late!
And my email reader on PC, running DOS, in 1991 or 1992, I can't quite remember. It was my first relatively big project in C, because for more than 10 years before that, I swore only by assembler, and I wasn't about to do that in x86, yuck. I didn't know curses or ncurses, but I still made a small TUI with windows and buttons. I was the only one using it for months, for email, mailing lists, newsgroups maybe too, I don't remember... Then one day, a conscientious sysop sent me an email asking me the name of the email reader I was using, because the machine he was administering had flagged a header that wasn't quite right, and he wanted to let the author know :-) My first bug report... under those circumstances, it's not something you forget. “Thanks for the feedback, buddy, but it doesn't have a name, I'm the author, and I'll fix that crap!”
I'm sure you have some fond memories like that too.
These are just examples. There have been others. Maybe even things that no one has ever done before, but that doesn't matter. Because it's still fun, whether we're reinventing the wheel or not. Especially when we don't know anything. It's rewarding when it works. Lots of little moments of pride that we keep to ourselves. Pride in having invented something without anyone's help, back when all we had were "XXX Bible" books to glean technical information from, or a BBS.
So don't listen to them. They have nothing interesting to say. They never loved programming, they always pretended. And today they tell you that finally, we no longer need to code, that we are finally relieved of this thankless task, that we can finally focus on what really matters.
Bullshit. Either die with your mouth open or let me die in peace! What matters is what we love. The rest is just survival. So if I have to die from my obsession, so be it.
By all means, OP, don't implore them. They've chosen their path, and we've chosen ours. Whatever you say about it won't change anything.
I did not share anything. Am I selfish? Not sure. I did not think it would be fun for others, or worth it. Especially considering that the stuff had already existed for ages, and was undoubtedly much more elaborate elsewhere. "All those moments will be lost in time, like tears in rain" :-)
This parallels another creative domain: cooking.
If you've ever wondered what's worth making yourself, it really comes down to the goal of building skills, or obtaining the unobtainable.
The latter is all over cuisine. There are lots of dishes and ingredients that are not economically viable as products. Factors like shelf-life, seasonal availability, cost of production, complexity of preparation - all that stuff is absolutely worth taking on yourself. We never see a huge portion of the world's cuisine at the supermarket for those reasons. Restaurants are better since they cook meals, but they're limited by similar economics and making money at volume & scale. The only way to go deeper is to DIY.
Just like with cooking, there is a huge range of possibilities outside economic viability that applies to any technology. Build a kit car that can't be manufactured on an assembly line. Make stuff out of wood that you'll never see at a craft fair. Build electronics and software with insane BoMs that no entrepreneur would touch with a 10-foot pole. Renovate a room in your house to taste in a way that no contractor would dare take on. If this stuff scratches an itch or enriches your life in some way, learn, explore, and go do it.
Lack of profit motive (mostly). Sadly gone are the days of making a killer app and getting fame and fortune. So it is harder to justify spending years of all your free time to build something new.
Through the early 2000s, most people entering the IT fields did so for paychecks. There are far fewer pure geeks (as a percentage) than there used to be. My first job out of college as a programmer paid about a dollar an hour over minimum wage. I did not go into this field to compete with the finance bros financially; I went for the love of technology. That changed, so more people started doing this as a job, not a lifestyle. These people are not nearly as interested in the experimentation that leads to new innovations.
I'm far more interested in the "many others" this guy had in mind.
The distracting outer point is "Build What You Can't Ship", which has limits, and is just another iteration of the old "don't sell out" idea about art. It's like saying "be an outsider artist", and that's not a good goal. It's all very well spending years writing a book in secret, supposedly without a thought for the audience, but if you accidentally wrote it in a private language that nobody can ever read, or made it unrelatable or incoherent, then that's not creative brilliance; it just isn't any good. Without pandering to popularity, you should still write for some audience, and if you're making art at all, imagining an audience is inherent to that. Similarly, if your software only runs on your own machine, or (referencing a recent HN post) you spent years writing an adventure game in QBasic for the love of it but accidentally made it 64-bit so nobody can run it in DosBox, well, that's suboptimal.
So another point is that writing for a niche market is a fine attitude. On the other hand, all this "you only rent it" stuff makes the niches smaller and more obscure, and that's a bad thing. You want a comfortably sized niche, where there's a medium-sized audience, so you get some attention in return for being subject to only some pressure to perform and conform.
That is a choice you make. Software development doesn't have to be that way.
In fact, it's saner for your productivity, ease of maintenance and onboarding new team members, and ultimately for your users, that you choose boring technology over the shiny new thing that just launched, or that is currently trending. Sure, you won't be able to add yet another buzzword to your CV, but you will deepen your knowledge of the stable technology, won't have to keep track of a constant stream of updates that may or may not break your application, and will have a much better chance of finding developers familiar with the boring tech. Most importantly, you will subject your users to fewer risks, since the product will be built on stable ground.
Developers often forget that the reason we write software is to solve a problem, and not to serve our own nerdy desire to play with tech. Enjoying your work is important, and the choice of tech plays a role in that, but the main motivation should come from solving a problem first, not from the tech itself.
This reminds me of a heated debate I had recently[1] about a popular project that was rewritten multiple times for little reason beyond the developer "felt like it". This is insane to me, yet for some reason, many people don't mind.
The criteria I prefer to use when choosing a tech stack are:
1. Pick the right tool for the job. Depending on your requirements, narrow down your options to those that would be most helpful for building the product, whatever "helpful" means in your context. E.g. if you need it done fast, a tool you'd first have to learn wouldn't be a good choice.
2. Pick the boring, battle-tested, and proven tool. Discriminate. Do you really need all the features of tool A? Err on the side of simplicity.
3. And other things like: what does the company/environment already use, what is the team most comfortable with, and so on. Consistency and familiarity are important.
But I agree with others here. As much as I lament the current state of the software industry, a lot of it has to do with the sheer explosion in popularity of the field, not with us losing anything. The same people who were building awesome stuff decades ago are still doing it today. They just have to deal with a lot more bullshit now.
But it is simply untrue.
Just the other day a single engineer completed a 25-year project to emulate VideoDisc games. There's a new JS framework or a new static site generator every day of the week. And with LLMs it's never been easier to be curious about something and go tinker.
E.g. there would be enormous difficulty in replacing the Dewey Decimal System with something else, if only due to its physical inertia, but with a computer system a curious clerk can invent an alternative categorization and retrieval system which inevitably touches on mathematical topics.