As a teacher or many other professions? Forget about it. You need to either marry someone with a more lucrative career, or move somewhere more affordable.
Working on fixing our housing shortage has felt extremely meaningful to me.
I'd like to find some of that idealism in software again.
Disclosure: I work at govstream.ai - we work in this space [we're hiring!]
They have a business tax rate above 106% of profit[1]. That is, it is effectively illegal for a business to make a profit.
Yet there is apparently a video out there of a black market cart seller selling wares right in front of the Argentine tax office, totally unbothered.
It made me wonder if this was just an allegory of what's in store for us.
[1] https://archive.doingbusiness.org/content/dam/doingBusiness/...
Under that model, 10% of your time is completely up to you (within reason) to work on things that aren't your main project or scope.
Works out well for R&D or more open ended positions, since you can have the flexibility to explore hunches without having to justify a whole project around it.
Unfortunately this is the consequence.
Sure, I was a hobbyist in the 80s, programming in assembly language on four separate architectures by the time I graduated college with a CS degree in 1996. But absolutely no one in my graduating class did it for the “passion”; we all did it because we needed to exchange money for food and shelter.
The people jumping into tech during the first tech boom were definitely doing it for the money, and the old heads I met back then programming on DEC VAX and Stratus VOS mainframes clocked in, did their job, and clocked out.
When I went into comp-sci, it wasn't the cool path to riches it's been marketed as for the past 15 years.
EDIT: you graduated around the same time as me. Sure, everyone wanted jobs. But there were easier paths to getting a good job in '95/96 than cramming late-night comp-sci work. Almost everyone in my class had grown up in the age of hacking around on Apple IIs, their first PCs, etc. No one randomly ended up in the comp-sci part of the university just because they wanted a job to make money.
And most of the people in my class at my state school in south GA had their first exposure to computer programming in college.
Took me half a year to get them to value Sentry, lol.
I’ll just collect my check and go do something else.
Fighting the Product Manager, fighting the Designer, sometimes even fighting some micromanaging stakeholder that won't leave you alone.
These are definitely fights I can win, but do I even have the energy anymore? Development work involves more than meets the eye. While there are some folks who understand the technical intricacies, it's tiring to join discussions you know won't go anywhere.
I wish it were 2000-2010 again, when my biggest problem was Sales promising features we didn't have, and then having fun coding them with the other devs.
I used to have a lot more mental bandwidth and energy to be "curious" and to tinker once upon a time. But now the world is so literally and figuratively on fire and every executive is so rabidly frothing at the mouth over AI that now I just want things to "just work" with the least amount of bullshit so I can just go home on time and forget about work until the next business day.
I just want this fucked decade to be over already.
The world will not end if a project is delayed by a few weeks. You get time for your own tinkering (never tinker on company stuff, even if that would improve things, unless you are a shareholder).
Apart from maybe a few core infrastructure primitives at the public cloud providers, most IT work today is an API calling an API calling an API, and so on.
That will remain the case until humans are out of the loop for most IT work.
for me it feels like I have to spend all day fighting with folks who are constantly “holding it wrong” and using libraries and frameworks in really weird/buggy ways who then seem completely uninterested in learning.
In my free time I love working on my own projects because I don’t have to deal with this bullshit and I can just craft things finely without any external pressure.
Honestly some of my best jobs were at places that had a nicely balanced practice in place and the backbone to remind execs that if they interrupt makers daily with new shiny asks they will in effect get nothing (because nothing would ever be completed)...
But obviously we can both have worked at places with those labels with vastly different implementations and thus experiences :)
Any non-small company has plenty of people that need to justify their salaries.
Meetings are one of the most effective ways to pretend to be working.
Subsequent governments turned the profession into a captive market, where you can realistically only work for corporations that fix wages by following so-called "market rates", and you cannot create your own job if you disagree with those rates.
Jobs in London pay less than peanuts and if you earn six figures in the UK, income tax takes half of it anyway even if you go to FAANG.
The average university grad would be better off, by income, in law/finance/medicine in London. This isn't to say the top software devs don't get paid a lot, but they're a minority compared to the legions of highly paid people in finance in London and the surrounding industries.
Many consultant friends I know and business owners have moved away from the UK to low tax areas in Portugal or UAE.
I got started in the 1980s, and super-curious and technical people were the norm. We were incredibly strongly attracted to computers.
The first real growth in computers in that kind of era was Wall Street and banks. Wall Street in particular started paying huge bonuses to developers because it was clear that software could make huge piles of money. Then we started seeing more people joining in for the money who were not necessarily passionate about technology.
Then came the dot com era and bust, and then the rise of social media, FAANG, and absurd corporate valuations allowing ridiculous total comp to developers, and the needle moved even more towards money.
The net result is the curious and the passionate are still here, but our numbers are massively diluted.
I come places like here to find that passionate niche.
This has been happening since the 2008 financial crash, when a lot of people who would normally have gone into careers on Wall Street found a shrinking job market there, which led them into tech as a high-performing, decent-paying career... (U.S.-biased opinion, of course)
That's not entirely true. We (society, definitely the US) pushed going to college HARD for the last 3-4 decades, glamorizing how much money you'd make. Now we have an overabundance of people with college degrees and thousands of dollars in debt for those degrees.
There's plenty of career paths where you could make decent money that don't require a college degree.
We should have been pushing people to figure out what they wanted to do, not "Make lots of money", and figure out the path that gets them there.
The sad reality is that "everyone learn to code" was by and large a marketing distraction from the severe structural unemployment of our fast-and-loose economy. No, a coal miner can't just learn to code and get a job in WV, certainly not thousands of other miners in the same position, nor can the millions of people that corporations laid off over those same decades.
Coding was a way out of poverty, but for most people it was just a distraction to keep them from seeing how bad the economy is.
Americans are poor: PNC Bank's annual Financial Wellness in the Workplace Report shows that 67 percent of workers now say they are living paycheck to paycheck, up from 63 percent in 2024. https://www.newsweek.com/2025-rise-americans-living-paycheck...
I interviewed many people from top universities and they absolutely scream "I couldn't care less about the field, I'm just here to maximize the compensation".
At the same time, I get 19-year-old self-taught kids who are miles better at programming and learning, and who are genuinely passionate.
My first trip through college I studied business, and then the economy collapsed. Most people my age eked their way through menial jobs (like me) and survived, found a way to break through, or (like me) went back to school years later when the economy improved to try to find another opportunity. For me the choices at that time were CS or nursing, and I have always been good at math and with computers, so I chose CS.
I wouldn't say I ever "loved" development, especially not the current corporate flavor of it. I've had some side projects when I get time and energy. But there's never really been a point in my life where I could ever have afforded getting the level of expertise I possess now just for the "curiosity" of it. Not everyone has a trust fund or safety nets.
Just the bar is so high now, so much competition, so many cargo culting startups that only do bad leetcode interviewing.
It's very hard to both find and get hired at places that want more than a coding monkey to just blindly move Jira tickets.
Here's an example from my perspective.
Recently, while developing a way to output a PAL video signal from two digital lines (an endeavour obviously driven by curiosity more than utility), I learned a great deal more than I would have without AI. I wasn't blind to what the AI was emitting. It helped me decide on shifting one output to 0.285V and the other to 0.715V, and helped me write a program that uses PyTorch to learn a few resistor/capacitor values to smooth out the signal, plus a sample of 2-bit data that, when emitted through the filters, produced a close-to-sine wave for the color burst. AI also enabled me to automatically emit SPICE code to test the waveform. I had never used SPICE before; now I know the pains of creating a piecewise linear voltage source in it.
Yesterday I used an AI to make a JavaScript function that takes a Float32Array of voltage level samples and emits a list of timings for all parts of a scanline along with the min and max voltage levels, calculates the position and frequency of the color burst and uses the measured color burst to perform the quadrature decoding to produce a list of YUV values at a lower sample rate.
This should let me verify the simulated waveform so that I can compare the difference between what I intend to emit, and what should be a correct signal. If there turns out to be a discrepancy between what I am emitting and what I think I am emitting, this will come in quite handy.
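The quadrature-decoding step described above can be sketched in JavaScript. This is a minimal illustration, not the commenter's actual function: it assumes the subcarrier frequency and burst phase have already been measured from the color burst, and it ignores PAL's alternating V phase for brevity.

```javascript
// Hedged sketch: quadrature-decode a chroma segment into (U, V).
// Assumes `samples` covers an integer number of subcarrier cycles,
// so averaging acts as the low-pass filter that removes the 2f term.
function quadratureDecode(samples, sampleRate, subcarrierHz, phase = 0) {
  const w = (2 * Math.PI * subcarrierHz) / sampleRate; // radians per sample
  let u = 0;
  let v = 0;
  for (let n = 0; n < samples.length; n++) {
    // Multiply by the two quadrature carriers and accumulate.
    u += samples[n] * Math.sin(w * n + phase);
    v += samples[n] * Math.cos(w * n + phase);
  }
  // Factor of 2 restores the original modulation amplitudes.
  return { u: (2 * u) / samples.length, v: (2 * v) / samples.length };
}
```

For example, synthesizing `0.3*sin + 0.5*cos` of the PAL subcarrier (4.43361875 MHz) at 8 samples per cycle and decoding it recovers U ≈ 0.3 and V ≈ 0.5. A real decoder would run this per scanline at a phase locked to the measured burst.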
Perhaps I might learn more if I did all of this myself unaided by an AI, but it would also take much longer, and likely not get done at all. The time I save writing tests, wrangling libraries and software is not stored in the bank. I used that time too, doing other things, learning about those as I did so.
They want to produce something without having the skills to produce it. Which, you know, probably isn't uncommon. I'd love to be able to rock out the guitar solo in Avenged Sevenfold's "Bat Country" [0] or "Afterlife" [1] or the first solo in Judas Priest's "Painkiller" [2], but to get to that skill level takes years of practice, which I'm quite frankly not willing to put in.
The difference is the honesty. A vibe coder produces something barely more than "Hello world" and brags about being able to produce software without learning to code. Nobody grabs a guitar, learns two chords, then claims to be a guitarist.
[0] (mildly nsfw) https://youtu.be/IHS3qJdxefY?t=137
That's off by a large factor.
What I could find quickly was an estimate that the top 400 own a little over 4%, not 50%.
What dampens the spirit is the same as for everyone: a treadmill you cannot get off, and punishment for independent thinking.
Dev culture is not one thing that is found in dozens of companies - dozens of companies have their own culture - and if that is a curious and empowering culture you have curious and empowered devs, and salespeople and operations and chemists and …
Culture is what we make it
You won't tour for long as a one-hit wonder, and I think what the OP is saying is quite similar.
I think it depends on the circles you're in. For example, I see a lot of interest in the "Handmade" way of doing things, largely inspired by Handmade Hero. Almost feels like a comeback of what you consider to be dying. There are people who are interested, but one needs to look for them. I recommend it.
That wave feels definitively over now, making mobile apps in 2025 is much like doing WinForms in 2003. Hopefully something new will come along that shakes things up. In theory that's AI but as a developer I find AI tremendously unsatisfying. It can do interesting things but it's a black box.
For me personally... I'm older and married with kids. My free time is so much more valuable than it was back in the day. I still try to be a curious developer but at the end of the day I need to get my work done and spend time with my family. There's enough of a financial squeeze that if I did find myself with an excess of free time I'd probably try to spend it doing freelance work. So whenever this next wave does arrive I might not be catching it.
SWE culture was very different in a low interest rate environment. Teams were over staffed. No new tech came around for a long time so everyone was focused inward on how to improve the quality of the craft. At my big tech company some teams had 2-3 people dedicated to purely writing tests, maintainability, documentation, and this was for a <1m MAU product.
Then boom free money gone. Endless layoffs year over year. Companies pushing “AI” to try and get SWEs to deprecate themselves. It’s basically just trying to survive now.
That wizard that used to nag everyone about obscure C++ semantics or extremely rare race conditions at distributed scale has disappeared. There’s no time for any of that stuff.
Like all cultures, this was all performative. People astutely observed what the people above them said and cared about, and mimicked it into promotions. That doesn’t work anymore, so that wizard culture is long gone.
There are still people taking on new frontiers... even if you don't love crypto (and I don't!), a lot of very curious developers found a home there. AI is tougher (due to the upfront costs of building a model), but discovery is still happening there.
I don't think curious developers are gone... there's just an increase of un-curious developers looking for a paycheck. You just have to look harder now (although I think it only seems like we had a cohort of curious devs because we're looking at it in hindsight, where the outcomes are obvious).
TFS was introduced in 2005 for Microsoft shops for instance.
I'm rather envious of kids today who have access to Google, Wikipedia, YouTube, and (with caveats) ChatGPT when they're truly interested in a topic. They can dive a lot deeper than I had the opportunity to without bringing in adult assistance.
"Can" is a critical word in your comment.
MySQL was available for free in 2000 and anyone could download any number of language runtimes for free like Perl and Java. If your corporate overlords weren’t cheap (or you were in college) an MSDN subscription was amazing.
JavaScript has been around for decades. But jQuery made it so much easier, and then React built on top of that even more. And jQuery wasn't the first DOM library, nor was React the first framework – but both were where it seemingly clicked between ideas, usability and whatever else made them successful.
(I will agree that Microsoft had a run of things where anyone who bought in to their ecosystem had a lot of things that worked well together.)
I was young and didn't have many responsibilities then, and lots of free time. Now I'm a dad with a mortgage and an interest in local politics because I want to 'leave it better than I found it'.
All that said... I do think there have been some shifts over time. I grew up in the era of open source taking off, and it was pretty great in a lot of ways. We changed the world! It felt like over time, software became mainstream, and well-intentioned ideas like PG's writing about startups also signaled a shift towards money. In theory, having F U money is great for a hacker in that they don't have to worry about doing corporate work, but can really dig into satisfying their curiosity. But the reality is that most of us never achieve that kind of wealth.
Now we find ourselves in a time with too much concentrated corporate power, and the possibility that that gets even worse if LLM's become an integral part of developer productivity, as there are only a handful of big ones.
Perhaps it's time for a new direction. At my age I'm not sure I'll be leading that charge, but I'll be cheering on those who are.
It's certainly true that IT has grown vastly since those good old days¹, but there has always been a proportion of people who're just... not that interested in what they're doing. For example, I remember being mildly horrified around 1998 that a colleague didn't know how to run his compiler from the command line; without an IDE he was lost. But I doubt he was the only one.
Meanwhile the idea that there's a dearth of cool new stuff seems quite quaint to me. There's a whole bunch of cool things that pop up almost daily right here on Hacker News². Just because they haven't spread to ubiquity doesn't mean they're not going to. Linux was not mainstream right out of Linus's Usenet announcement - that took time.
As to corporate power? They ebb and flow and eat each other (Data General, Compaq, DEC ... remember them? Remember when Microsoft was the major enemy? Or IBM?)
¹ https://en.wikipedia.org/wiki/Good_old_days
² Edit: Not to mention, there's also a whole bunch of crap that's not very interesting. But survivor bias means we'll have forgotten those in 20 years time when we're surveying this time period; as Sturgeon's law reminds us, "90 percent of everything is crap."
It just feels like "it's a job" is more of the zeitgeist these days.
And yes, I'm also well aware of what came before 'my time' - mainframes and such were definitely an era where the power was more with the large companies. One of the reasons Linux (and *BSD) was so cool is that finally regular people could get their hands on this powerful OS that previously was the exclusive purview of corporations or, at best, universities.
As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But if you're looking for that spark and excitement again, you need to get back out to the frontier. One frontier that is particularly exciting to me is using AI to speed up the tedious parts of the development process, and to tackle areas where I don't have specialist knowledge. Similarly to how Linux opened up a powerful OS to individuals, AI is enabling individuals to create things that would have previously required large teams.
Perhaps over time it'll get efficient enough to run outside of huge companies; that could be an interesting aspect to keep an eye on.
Though certain novel uses could lead to new individuals or entities gaining power.
I'd like to be hopeful and would like to hear good arguments for how this could happen - but it seems to me improved technology on the whole leads to increased concentration of power - with exceptions and anomalies but that being the dominant trend.
Or, you know, if AI is the mainstream hotness or just doesn't float your boat, look for what the iconoclasts are up to and go dive into that, not whatever the VCs are flinging their gold at today.
But... they're still there. They're a little diluted, but I've not yet worked somewhere where I had no like-minded tinkerers amongst my colleagues. I don't think I'd want to, but it just hasn't come up.
> As to cool projects, sure. They're fun, interesting and creative, but perhaps not part of (a very vague, admittedly) "something bigger", like "the open source movement" was back in the day.
But the free software movement dates back to the early 80s, not the 2000s that we're talking about. Open source itself was being seen as a dilution of the principles of free software in the late 90s/early 2000s. More to the point, free and open source software is still very much here - we're absolutely surrounded by it.
> mainframes and such were definitely an era where the power was more with the large companies
It's oscillated. DEC used to be the zippy young upstart snapping at IBM's heels you know. Microsoft didn't start out big and evil; nor did Google if it comes to that. Put not thy faith in shiny new companies for they shall surely betray thee once they devour the opposition... :D
Old man yells at cloud services
But most of the people I went to uni to study computer science with at the end of the nineties were there for the money. Even back then it was all about money for most programmers.
I lamented when my career first started (2000 or so) that there were devs I worked with who didn't even own computers at home. While my bookshelves were full of books I aspired to learn and my hard drive was full of half-baked projects, they clocked out and their thinking was done.
I still know a few of those now 25 years after the fact. Some of them have made a career out of software. But they never got curious. It was a means to an end. I don't begrudge them that. But as someone who is internally driven to learn and improve and produce, I can't relate.
My primary frustration today is how many of my software peers are satisfied with updating a Jira status rather than seeking to build excellent software. I've seen it at all levels: engineers, managers, and executives. I'm actualized by shipping good, useful software. They seem to be actualized by appearing busy. They don't appear to deliver much value, but their calendars are full. It has me at my professional wits' end.
Truth be told, the phenomenon of appearing productive without being productive is an epidemic across multiple industries. I've had conversations with people in manufacturing and agriculture and academia and they all echo something similar. Eventually, Stein's law indicates that the productivity charade will end. And I fear it will be ugly.
I have never in 30 years written a single line of code that I didn’t get paid for except a little work I did for charity.
And plenty who are not, it takes all kinds.
It's a matter of taste and still all tastes may not be satisfied anyway :)
For years I was a part time fitness instructor and runner. I loved hanging out with friends, being in front of people, meeting them at races and us training together. It’s completely different than being at a computer at home - after working all day on one.
This is hardly a new phenomenon. Dilbert and its ilk have been lampooning this since the 80s.
In another case, I had recently moved to a new city and we were targeting an internal proprietary platform (again with Windows NT) and also targeting Solaris.
There was a time when you would go to work and you would be working with header files and libraries that were proprietary and for which your company was paying an exorbitant per-head license fee.
IMO this is the part that the author is missing. Back in the 2000's, software development was a much smaller field and your main focus was the "curiosity pond" where all the developers went to tinker.
Now software dev has expanded into an ocean. That pond is still there but the author missed the pond for the ocean.
Where is this ocean? Is it that I have all these big pre-cooked components I can use to make SaaS spaghetti?
Couldn't agree more. Like many, I've had my honeymoon phase with AI and have now learned what it is good for and what it is not. What it has truly been good for is satisfying the nauseating number of topics I want to learn about. I can spend $20 a month and drill down into any topic I like for as long as I like in an incredibly efficient way. What a time to be alive.
I think a lot of people have lost faith that technology can improve the things that they care about. Even open source doesn’t seem to have made much of a difference in preventing, well anything bad in the last few years.
If we want to have a better dev culture there has to be a reason for people to believe that the software they make is actually going to improve people’s lives and not just accelerate the profits of multi billion dollar corporations.
Web and the whole cloud/backend scene has become toxic because of the work culture around them. I know of a therapist on the west coast that has become completely snowed under by a surge of software developers claiming mental problems on account of their working environments, and she was in such disbelief that she was asking around if what she was hearing was possibly real. Other professionals simply would not accept what has been going on.
Omarchy, Bitchat, Ghostty, Crush.
None of those are chasing metrics. And that’s just off the top of my head.
I’ve been there, looking for pennies in the couch to be able to afford a burger while I waited for my next contract gig deposit. Even if your project doesn’t become the next big thing, you’ll end up with something to show in your resume. That will open tons of doors.
I am doing more side projects, and finishing more projects, and feel a much greater level of confidence in starting new projects since I feel more confident that I will get at least an MVP working. These are not commercial efforts, I am just tinkering and scratching my own itches.
I attribute this change to three reasons:
- Vibe coding helps me do parts of the tech stack that I used to procrastinate on (UI, css)
- Gemini helps me solve all the inscrutable devops issues that used to block me in the past.
- A great open source tech stack that just works (Postgres, docker, node, ollama....)
AI helping me with the above has allowed me to focus on the "fun" parts of the side projects that I do. And the UIs also end up being much prettier than what I could create myself, which gives me the confidence to share my creations with friends and family.
-- a 28 year old
There was time when being a software developer was not a particularly prestigious or well-paying job in corporations, or maybe a weird hobby of developing games for the toy 8-bit entertainment computers of the day. It was mostly attracting people who enjoyed interacting with computers, were highly curious, etc.
Then there was a glorious time when the profession of software engineering was growing in importance by the day, hackers became heroes, some made fortunes (see e.g. Carmack or, well, Zuckerberg). But this very wave was the harbinger of the demise: the field became a magnet for people who primarily wanted money. These people definitely can be competent engineers! But the structure of their motivation is different, so the culture was shifting, too. Now programming is a well-paid skilled trade, like being a carpenter or a nurse.
If you want hacker ethos again, look for an obscure field which is considered weird, is not particularly well-paid, but attracts you.
In the past I made many mistakes, like pulling all-nighters because I found a way to make the checkout experience more pleasant. That resulted in a massive increase in revenue, and none of it benefitted me. Or the time I unblocked another team that couldn't figure out why their app would randomly crash. The board was panicking because the client was going to pull out. I saved the day, and the multi-million contract went through. "Thank yous" didn't help me pay off debts.
Only be curious for your own stuff. For corporations? Do bare minimum.
Luckily though, none of those places would ever even look at my resume.
That isn't reality, however, and so most of that energy is consumed by my day job, and it feels wasteful to put what little remains into projects that have little chance of any practical return. Any time I start settling into work taken up out of pure personal interest, the "responsible adult" part of my personality starts scratching at the back of my mind, pushing me to go do something more productive.
Such is life.
Arguably there might be more curious tinkerers nowadays, but they might represent a smaller slice of the pie.
Maybe only possible once you could finally own a whole "system" single-handedly and do whatever you wanted, for the first time ever.
Perhaps the fundamental concept of owning your own system and doing whatever you want with it has been allowed to dwindle so badly that there seems to be no comparison.
There's a layer of pessimism among engineers and on Hacker News that has been steadily growing (as, I assume, the average age increases).
To me it's hit a critical level, and I have to disregard most negative comments here by default because I can no longer trust the average commenter to have a productive balance of optimism and curiosity.
----
On a different note, the point the author is trying to make is massively undercut by the ad spam all over their page.
It was so grotesque (and out of character for a dev blog) that my first assumption was that I had a malicious extension somewhere.
Ownership, royalties, and voting would be embedded in a blockchain. Proof of work would be by vote, and votes given for proof of work. Or something like that. In music they have "royalties", and it seems like that could be used for contributors.
If you would like to be part of a discussion send an email to my firefox relay 3tdm026f9@mozmail.com
Feel free to use a relay.
I've already seen how people scratch other's backs in peer feedback during performance reviews, and I've heard plenty of description of negative aspects about promotion-oriented behaviors driving what people work on at companies notorious for that kind of stuff. Not to mention all the actual biases pervasive to the "meritocracy" crowd.
In modern society, if you're not trying to monetize all of your hobbies and every little thing you do you are seen as doing something wrong. Everything has to be a hustle these days. You're not allowed to do things simply because you enjoy it.
Now it's on github, and if you don't get enough followers or forks or it's not in a popular language or framework or you haven't updated it recently enough it's seen as a "dead project" or a failure. A project can never be "done" because then it's dead. That's demotivating.
Social media damages everything it touches.
The most social coding I've ever experienced was Bukkit, the old Minecraft server thing. I was noob in high school, made plugins for little things I wanted, people installed them, they gave feedback like "this is genius" or "pls add MySQL support cause your flat files suck," I learned, it was great.
Personally, I am excited that AI is steering people who aren't actually interested in tech away from it. Reverting to the mean a bit. And like the downvoted comment below, I actually think a swath of "vibe coders" are much more in line with the hacker mindset than most developers. A lot of them are the "make a quick buck" types, but there is also a ton of insane tinkering going on, which is awesome.
But maybe we are talking about two different things. There is a distinction between "I want to hack on this to see how it works" and "I want to hack on this to see if this IDEA works". So product hackers are ascending while engineering hackers are starting to dwindle.
It reminds me of the shift in car culture when car computers meant you couldn't just rebuild a rusty car over a summer but a new culture of car hackers bubbled up mostly around modding cars for drifting or whatever. The people were different, the work was different, but the curiosity, excitement and subculture grew into something very similar.
This may be hitting developer culture hard but it's much broader than that.
We used to have to hack things together because nothing worked. There was no consistency, standards were all over the map, software solutions for most things didn't exist, and running software on the major vendor ecosystems was heavily silo'd.
Dozens and dozens of technologies changed that. Web protocols and virtual machines broke siloing. Search engines and community forums made discoverability much, much easier. We passed the tipping point where hardware was only valuable if it could be connected to an ecosystem, so engineers started building standards like USB, wifi, bluetooth, and a TCP-accessible interface into everything. And an army of open-source hobbyists wrote hundreds of thousands of "X but in Y" libraries.
So hacking itself has moved away from problems like "get a telephone multiplexer to translate a bitstream to colors on an analog TV" and towards "What nine libraries and five gadgets will you glue together to" (for example) "let your refrigerator track your shopping list," or "How can you make setting up email not feel like hacking your left arm off for the average non-computer person?" Because those are the kinds of problems that are still unsolved.
It's a different kind of hacking requiring curiosity at a different level and sometimes a different problem-solving skillset (less experimentation, more curation and cataloguing).
I think there needs to be a distinction between artist and artisan. Art exists for its own sake; code exists because it's useful. I don't want code that reads like poetry, I want code that works so I can read actual poetry later.
> Have a project in mind that you’ve always wanted to tackle but it never made sense to you to do it because it would never be used by anyone else or it would never make you any money?
I appreciate the tinkerers and hobbyists, software is endlessly interesting as a career, and I'm thankful to be here. But I only want to build code that is useful.
And anyway, how useful is your code, really? I will not generalize or make assumptions, but you’re also not going to tell me what it is, right? So scrutiny for thee, but not for me?
And if it’s like, “I make Dagger wrapped implementations 17 layers deep in a Google product you’ve heard of”: by now you should know that the thing sincere people say about insincere people, “We watch what Hollywood says is good,” applies to shit that Google, Apple, Amazon and all these super high paying job companies do too. If you are conflating many users with useful, that’s the problem. Facebook, TikTok and Instagram could vanish tomorrow, and literally nothing meaningful would be lost.
Is “useful” to you, “everything that I do is useful, and everything I don’t do, maybe”? You don’t get to decide if your POVs are reductive. They just are.
I appreciate you putting yourself out there with a contrarian point of view, noble if fatally flawed.
It is very impressive (in a disheartening way) how easy it was for The System to convince us to constantly spy on each other “for our own good”.
People play and tinker when they feel that they are in a secure enough environment to fritter away time without feeling like they've incurred risk by doing so.
Given the state of the climate, economy and politics today, I think a whole lot of people feel a whole lot less secure. When I look back at recent US history when there seemed to be the most innovation going on, it was the 90s after the fall of the Berlin Wall and before 9/11. That was probably the "OK-est" a lot of folks in the US felt in their lives.
You might rightly point out that people are wasting lots of time these days, staring at screens, binging TV shows, re-reading giant sci-fi and fantasy series. That's true. But there's a big difference between wasting time escaping the world versus "wasting" time creatively engaging with it.
Are you still trying for remote? A few years ago when I was down bad and RTO was starting, I found that remote was near impossible to negotiate for “normal” devs even with such extreme concessions.
I personally love the craft, but battle the entrepreneur in my brain telling me not to waste time learning things that won't bring tangible value.
Most of my curiosity is tempered by how it can make me money.
I do appreciate that 50-70% of the boring work can be done with AI agents now. As long as you know enough to have opinions and guide the process along, it can be helpful.
Expertise and learning don't seem to be AS important with the upcoming gen of developers. However, there is also so much out there now that it would be much harder to start from zero as opposed to being there from the beginning.
But I think gen X and Millennials were probably peak interest and curiosity; now it's just a job for later generations.
The curiosity hasn't disappeared from the culture, but it might not be brought in to a workplace anymore.
I think a lot of us have stopped bringing the tinkerer itch to work.
Outside of the workplace, there's an entire parade of tinkering by folks who at best post it on Youtube, not here (I watch "Stuff Made Here" for the code).
Of all the events of the past decade, the worst hit to tinkering visibility has been Github making personal repos private by default.
Mostly, the folks who were like me still have pet projects, and most of them will share their code, but only if you ask, because it is "well, not as nice as it should be."
I've got hundreds of repos in my github, but there's a sharp fall-off in what's public (there are ~113 public and 180 private) right when that happened, and I'm sure I'm not the only one.
The tinkering is more active than ever now with vibe coding tools, where I can draw out an SVG and then ask, "I want to use manim.py to animate this," to get something which is a neat presentation tool to describe how data moves between systems.
But is it worth showing you, when all the fun was in the making?
What if all I'm likely to get is "So what?" as the only response? Wouldn't that make it less fun?
Since then, devs got squeezed more and more, so that nobody has any time for trying out stuff. Tech debt accumulates and nothing improves. When you have an idea, you have to submit a proposal to a review board, which approves requests from politically connected people and rejects the others because of other deadlines.
This development has taken away everything I enjoyed about the job and was good at. Thankfully I am reaching retirement, so I am happy to leave.
The industry has been flooded with money-motivated people, raising the income of the curious (but not exactly marketable) engineer. Yes, those people who flooded in might be uninspired, loathsome buffoons (in the eyes of elite nerds). But it's also the opportunity for your hobby to be mainstream, to be encountered by those who likely never would have, to not be denigrated for your skill with technology, etc.
I'm grateful for how software has progressed from the IQ 160 segment of the populace, to 140, to 100, to 95. It means we're winning culture over. It means we're solving problems (including how difficult it used to be to engage with). We've made previously wildly difficult things table stakes for today's apps. (One trite example: long polling became websocket pushes.)
We should be celebrating how mainstream we've become.
In 2000, at my very first job, was when I first met a developer who got into it for the money and not for the love. When he told us he picked computer science in college because it seemed like it was a good way to make a living, and a lot easier than law or medicine, the rest of us programmers looked at him like he had sprouted a second head. By 2010, people like him were the norm.
Who cares, though?
If you're a "curious" developer, the existence of a massive preponderance of incurious engineers who are in it for the money doesn't change who you are. It doesn't have to change how you see yourself.
Socially, there are more "curious" developers to connect with and learn from than ever before.
The downside is that people outside of the industry will draw conclusions about you based on their perception of engineers as a whole. Boring and mercenary.
But let's face it, in the eyes of most of the population, boring and mercenary is a step up from how we were perceived when it was just us nerds who were weird enough to enjoy it.
Millions of people entered the field, many of them explicitly because they saw it as a good job opportunity. The average software developer is now a completely different person than they were ten or twenty years ago. Importantly, there has been a major shift towards people in India and other Asian countries, where development has been outsourced or where developers are hired from, as well as changes in the mix of college graduates. This is clearly reflected in the job market, which is getting more competitive.
It’s fundamental, if you want to find that pioneering spirit again you have to leave your comfort zone and go exploring somewhere off the map.
It's gotten significantly harder now that I have a toddler and my S.O. works, but I can't help myself from stealing time for it.
Imposter roles are jobs that are created working backwards from "job at company" to something that an individual can realistically claim they do at the company. They became prominent in the last tech bubble when there was a lot of wealth being created and people wanted to go work at places like software companies, where they could not realistically contribute.
"Product Manager" and "SCRUM Master" are just some of the imposter roles that you've probably encountered. When you scrutinize the existence of these roles, there is a swift and immediate backlash from people whose lifestyle and livelihood are at stake. Product managers will point to famous people at Apple called "product managers" to distract from the fact that the median product manager does not add value.
When an organization creates a role that subsumes all of the creative control, and fills it from a pool of entirely unqualified people, the product gets worse, and the industry gets less innovative. You're either an avid user of the software, or an avid builder of it, and if you don't fit into one of those groups, it's unlikely that you can make a software product better.
Note that the author of the article is doing webdev, which by now ought to be as routine as using PowerPoint. It's rather embarrassing that it's not.
E.g. there would be enormous difficulty in replacing the Dewey Decimal System with something else, if only due to its physical inertia, but with a computer system a curious clerk can invent an alternative categorization and retrieval system which inevitably touches on mathematical topics.
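To make the comment above concrete: the "alternative categorization and retrieval system" a curious clerk might invent could be as simple as a tag-based inverted index, where an item can live in many categories at once instead of on one physical shelf. This is a minimal hypothetical sketch (the `Book` type and tag scheme are made up for illustration), not any particular library system's design:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Book:
    title: str
    tags: frozenset  # unlike a shelf position, a book can carry many tags

class TagIndex:
    """An inverted index mapping tag -> set of titles.

    The digital analogue of re-shelving: adding a tag scheme costs one
    dictionary, not a physical reorganization of the stacks.
    """
    def __init__(self):
        self._index = {}

    def add(self, book):
        for tag in book.tags:
            self._index.setdefault(tag, set()).add(book.title)

    def find(self, *tags):
        """Titles carrying every requested tag (a set intersection)."""
        sets = [self._index.get(t, set()) for t in tags]
        return set.intersection(*sets) if sets else set()

index = TagIndex()
index.add(Book("Godel, Escher, Bach", frozenset({"math", "music", "philosophy"})))
index.add(Book("The Art of Computer Programming", frozenset({"math", "programming"})))
print(sorted(index.find("math")))          # both titles share the "math" tag
print(sorted(index.find("math", "music"))) # only the first matches both
```

The mathematical topics the comment alludes to show up immediately: retrieval here is set intersection, and richer schemes (partial orders of categories, lattices of tags) follow the same pattern.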