Do anything and everything to remain in the "growth stock" category. Spend money on useless features, and on engineers working on those useless features, as long as it makes your company look like it has a bright future and room to grow.
https://locusmag.com/feature/commentary-cory-doctorow-revers...
> There’s a bit of automation theory jargon that I absolutely adore: “centaurs” and “reverse-centaurs.” A centaur is a human being who is assisted by a machine that does some onerous task (like transcribing 40 hours of podcasts). A reverse-centaur is a machine that is assisted by a human being, who is expected to work at the machine’s pace.
Amazon is fundamentally a logistics + robotics company and is one of the worst companies to join for 'stability', as it has razor-thin margins.
With almost 1.6M workers, the layoffs there are at least in the five figures, and they will not hesitate to do the easiest possible thing to increase profit margins: take jobs away from warehouse workers (using robots) and corporate workers (using AI agents).
> Most engineers (including me) spent months grinding LeetCode at least twice in their career, studying system design, and passing grueling 6-round interviews to prove they are the “top 1%.”
LeetCode can be easily gamed and cheated on, and is a waste of time.
Now you need to make money for yourself instead of dancing around performative interviews, since an AI agent + human outperforms over 90% of workers doing the mundane work at Amazon anyway.
You are being scammed without knowing it.
You seem to forget that AWS exists.
https://s2.q4cdn.com/299287126/files/doc_financials/2025/ar/...
Amazon, the logistics company, is paid for by the $100+ per year that its customers just give to Amazon to get basically nothing in return.
Compared to the other FAANG companies, these margins are not only thin but terrible; Amazon has the worst margins in FAANG, with or without AWS.
There is remarkably little pushback from journalists on company narratives about layoffs or ailing economic fortunes, which is weird, because more often than not those narratives are not truthful.
The Brexit vote is nothing like this though. AI is probably the biggest corporate gaslighting exercise I've ever seen in my entire life.
Also very common to blame things on health and safety, GDPR, etc.
LLMs are another force multiplier. If your computerized process is a disaster … well, you said it and you’re right.
To err is human; to really foul things up requires a computer. But I literally mean that if you have a crappy business and put AI into it, you're just gonna make your business worse.
AI as a tool is not actually a solution for very much. AI can make a good process better, but it will also make bad processes way worse.
It’s a power drill upgrade when you were previously only using a screwdriver, but it’s still not a table saw.
Traditional factories do make wrong guesses about the future and overhire all the time. Example: https://www.google.com/search?q=Ford+f150+lightning+laid+off...
From a supply chain management perspective, this does not make sense. The directive should be something like "find the best suppliers on the planet."
Manufacturers can't hire beyond the places in production where someone can stand and do something. There needs to be some kind of equipment or process for a worker to contribute in some meaningful way, even if it is merely for a projection of increased production (e.g., hiring a second shift for a facility currently running one shift).
What I wonder is if in tech, the "equipment" is a computer that supports everything a developer needs. From there, new things can be added to the existing product.
Manufacturing equipment is generally substantially more expensive than a computer and supporting software, though not always. Might this contribute to the differences, especially for manufacturing that normally runs 24-hour shifts?
Cheap money amplified this cycle, but this isn't a tech-specific "failure", it's just how forecasting under uncertainty works.
It’s incredible how some engineers assume they understand economics, then proceed to fail on some of its most basic premises. This tends to happen when engineering-style certainty is applied to systems that are driven by incentives and uncertainty.
But factory workers usually require specialized machinery, tooling, and physical capacity, which makes overhiring slower, harder and more constrained. Those investments force more deliberate planning.
By contrast, engineers mostly require a laptop and company hoodie... That low marginal cost makes it far easier to hire aggressively on expectations and unwind just as aggressively when those expectations change.
And (assumption again) the factory boss doesn't have an incentive to increase idle worker numbers, whereas a dev manager often benefits from being in charge of a larger number of hardly-working people.
All of a sudden a few shell scripts are no longer enough. No, we need a full Kubernetes cluster to run a service used by 10 secretaries.
No, we can't just use PostgreSQL as a queue, we definitely need Apache Kafka for 1 msg per second.
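For what it's worth, "just use PostgreSQL as a queue" really is tiny at that volume. A minimal sketch, assuming a hypothetical jobs table and connection string (not anyone's actual setup), using FOR UPDATE SKIP LOCKED so concurrent workers never claim the same row:

    # Hypothetical sketch of a Postgres-backed job queue; assumes a table like
    #   jobs(id bigserial primary key, payload text, status text default 'pending')
    import psycopg2

    conn = psycopg2.connect("dbname=app")  # assumed connection string

    def enqueue(payload: str) -> None:
        # Insert a pending job; 'with conn' commits on success, rolls back on error.
        with conn, conn.cursor() as cur:
            cur.execute(
                "INSERT INTO jobs (payload, status) VALUES (%s, 'pending')",
                (payload,),
            )

    def claim_next():
        # Atomically claim one pending job; SKIP LOCKED makes concurrent
        # workers pass over rows another transaction has already grabbed.
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                UPDATE jobs SET status = 'running'
                WHERE id = (
                    SELECT id FROM jobs
                    WHERE status = 'pending'
                    ORDER BY id
                    FOR UPDATE SKIP LOCKED
                    LIMIT 1
                )
                RETURNING id, payload
                """
            )
            return cur.fetchone()  # None when the queue is empty

At one message per second this has enormous headroom, which is the point: the Kafka cluster buys nothing at that scale.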
Alas, gone are the days when engineers too required specialized equipment, like a desktop computer on the desk that you couldn't take with you. Every evening you left it at the office and went home to live a 100% home life. Gone are those days.
Dunning-Kruger effect, am I right?
We think we are smart because we can make computers go beep boop, so we assume we know about the economy too. I mean, I am part of this too, even though I (or we all) know about the effect, but I guess my point is that there should be some humility when we are wrong.
I can be wrong, I usually am. If someone corrects me, I would try to hopefully learn from that. But let's see how the author of this post responds to the GP's (valid, from what I can tell) critique.
Edit: Looks like they had already responded before I wrote this, but I missed their comment, where they said it's not at the scale or frequency we see in tech.
Funny, the person you are agreeing with made the strongest "I know more than you" flex of anyone.
Stay away from Europe while you can.
Yes it's kind of obvious to anyone who's looking at the actual work being done: the constant churn of OS updates, the JS-framework-du-jour, apps being updated constantly...
It seems to me like a lot of this is just busywork, as if engineers need to justify having a job by releasing inconsequential updates all the time. Bullshit jobs, anyone?
I for one would really like things to slow down, we all deserve it!
Most [sane] software out there, though not all, has a main development period that is ridiculously short compared to its life cycle (you could code it in binary machine code, for several ISAs, and it would not even matter).
Then it is extremely hard to _HONESTLY_ justify a permanent income in software development. Really, really hard.
I truly believe that these new tools will actually hurt the bigger companies and conversely help smaller ones. I'm in healthcare. The big players in the EMR space are Epic and Cerner. They are one-size-fits-all behemoths that hospitals have to work against rather than with. What if, instead of having to reach out to the big players, the economics of having a software developer or two on staff made it such that you could build custom-tailored, bespoke software that works "with" your company and not against it?
The behemoths exist especially, but not exclusively, in that space because the regulations are (correctly) steep. In the case of hospital systems, you're talking about the management and protection of both employee and patient data. That's not to say, of course, that the behemoths are particularly good at that; it's merely that if the hospital rolls its own solution, as you suggest, it then takes on the liability should that system go wrong. On the other side, if Epic has a data breach, every hospital shrugs its shoulders. It isn't their problem. And, even more fundamentally, if Epic as a product sucks ass... well. The employees didn't choose it, neither did the patients; leadership did.
You see these relationships (or lack thereof) all over the place in our modern world, where the people doing the work with these absurdly terrible tools are not given any decision-making power with regard to which tools to use. Hell, at my workplace, we actually have some in that leadership asks if we're happy with our various HR softwares and things, but fundamentally, they all pretty much suck and we're currently sitting at the least shitty one we could find, which is far from a solid fit for our smaller company. But it's the best we can do because none of these suites are designed to be good for people to use, they're designed to check a set of legal and feature checkboxes for the companies they sell to.
Honestly I don't know how you fix this, short of barring B2B SaaS as an entire industry. Time was, when you wanted to run a sales company, you had to run your own solution for keeping track of internal data. Salesforce didn't exist. You had rows upon rows of file cabinets; if there was a fire, the data was lost, and if a disgruntled worker stole your sales list and sold it to a competitor, that was its own issue to deal with. Now crooks can crack the locks off of NetSuite and steal your whole fucking business without even knowing or caring where the hell your HQ is, and our business universe, if you will, is bifurcated all to hell as a result. Companies are engaged in constant games of "pin the legal responsibility on someone else" because, to compete, they need internet- and software-based sales and data management systems, but building those systems is a pain in the ass, and then you're responsible if they go wrong.
It's probably risk and liability and not development costs that keep things from moving in house. Not things AI is great at mitigating.
Or they're actively enshittified, aiming to extract more short-term revenue at the cost of a long-term future...
Here is a Google complaint about not serving lobster bisque:
https://x.com/Andercot/status/1768346257486184566?s=20
Zero interest rate phenomenon.
We could get another 2008-like market crash before 2030, once the major AI companies begin to IPO onto the public markets.
If you want stability, go write Java for an insurance company for $85,000/year in Hartford, CT.
OP is horrified to discover risky gambling happening in Las Vegas.
.. and terrible for inflation, but that can be blamed on other people.
The point is mass-media communication, frictionless money movement across the world, and market access that is so freely available to the small retail investor.
It's a recipe for disaster because an extraordinary claim can attract billions of dollars with nothing but hope and dreams to back it up.
Imagine if the Wright Brothers had had today's markets and mass media at their disposal: they'd have been showered with billions or even trillions, but the actual model didn't make any money because it was R&D.
Really grateful that the opportunities I've been given weren't predicated on knowing things completely irrelevant to my job. I have spent exactly zero time solving LeetCode problems in my career (beyond algorithm stuff in college).
The LLM proponents are trying the same naive move with intangible assets, but dismiss the finite limits of externalized costs on surrounding community infrastructure. "AI" puts it in direct competition with the foundational economic resources of modern civilization. The technical side just added a facade of legitimacy to an economic fiction.
https://en.wikipedia.org/wiki/Competitive_exclusion_principl...
Thus, as energy costs must go up, the living standards of Americans are bid down. Individuals can't fix irrational movements, but one may profit from their predictable outcomes. We look forward to stripping data centers for discounted GPUs. =3
"Memoirs of extraordinary popular delusions and the madness of crowds" (Charles Mackay, 1852)
We can of course discuss how many people got into industry during COVID heyday and whether they should have, but mostly I think it's about those behemoths having disproportionately high impact on the entire labour market.
Firing people in most of Europe is still not as easy as it is in the US.
The opposite is also true: it's not that easy to leave your employer, and you have to give 1/3/6 months' notice before leaving, depending on your role/seniority/contract.
Sometimes companies even make you sign a 12-month-notice clause, where they pay you a fixed monthly bonus but you can't leave without giving 12 months' notice; my SO has signed one.
Downsizing is a perfectly legal reason to fire people in Europe, and it happens all the time when big companies do mass firings. The difficult part is getting to choose which individuals to fire.
But I'm curious what people think the equilibrium looks like. If the "two-tier system" (core revenue teams + disposable experimental teams) becomes the norm, what does that mean for the future of SWE as a career?
A few scenarios I keep turning over:
1. Bifurcation - A small elite of "10x engineers" command premium comp while the majority compete for increasingly commoditized roles
2. Craftsmanship revival - Companies learn that the "disposable workforce" model ships garbage, and there's renewed appreciation for experienced engineers who stick around
3. Consulting/contractor becomes default - Full-time employment becomes rare; most devs work project-to-project like other creative industries
The article argues AI isn't the cause, but it seems like it could accelerate whatever trend is already in motion. If companies are already treating engineers as interchangeable inventory, AI tooling gives them cover to reduce headcount further.

For those of you 10+ years into your careers: are you optimistic about staying in IC roles long-term, or does management/entrepreneurship feel like the only sustainable path?
Low-wage employees without agency.
I'm in the tech industry and have been doing this for 12+ years now. In the beginning, it was because I wanted to live overseas for a few years, without a break in my career.
Now, it's about survival. I buy my own health insurance (me and my family) in the marketplace every year (so I'm not tied to an employer), work with multiple clients (I never really have to worry about getting laid off), and make much more than a FTE.
While all my friends in tech are getting laid off or constantly in fear of getting laid off, I don't have to worry.
I also find that because I touch so many different technologies, I have to turn down work. I turned down a company last year that wanted me in-house, and one this year that would have been too demanding on my schedule.
It's also flexible and always remote.
The Pragmatic Engineer argues it's actually trimodal (at least in Europe): https://blog.pragmaticengineer.com/software-engineering-sala...
As far as I'm concerned, the main purpose of IT is to automate work. Tech companies make these systems of automation and provide them to other industries, so they can automate.
Making a program or an IT system is something you only do once. So once it is completed, it is expected that a lot of people who helped make it have to go. It's like building a skyscraper. Massive amounts of work to build it, and when it's finished, most workers have to move on.
Of course an IT company can continue to expand in perpetuity, but what if they don't have the leadership talent or resources to create a new giant project after one has been finished? Then the sensible thing is to downsize.
"Well don't hire too many people in the first place to rush your project into completion" - Then you get left behind.
Some software is finished, like Microsoft Office 2003, and requires no additional work except to force ads on people. Those jobs may end.
Those were also iterative processes: first tires and mud houses, then horse carriages and brick houses, and eventually cars and buildings.
In that sense, it’s not fundamentally different from engineering today. Working on core engineering functionality of a company is essentially the same kind of process.
The difference lies in whether you’re working on core functionality, or on some iterative experiment that nobody knows will succeed.
“Efficiency” is a damned lie. Enterprise IT is one of the most inefficient spaces out there, full of decades of band-aids layered atop one another in the form of fad products, fancy buzzwords, and “defining leadership” projects. The reason you cannot get shit done at work quickly isn’t because of bureaucracy or management layers standing in the way so much as it’s the vested interest in weakening IT and Ops teams so that those higher-ups can retain more of the profit pie for themselves.
My entire job is to make technology become so efficient that it fades into the background as a force amplifier for your actual work, and I’ve only spent ~1/3rd of my 15+ year career actually doing that in some form. I should be the one making sure documentation is maintained so that new hires onboard in days, not months. I should be the owner of the network and/or compute infrastructure the business needs to operate, not some MSP in another country whose contract you’ll replace with a lower bidder next year. I should be the one driving improvements to the enterprise technology stack in areas we could benefit from, not some overpriced consultant justifying whatever the CIO has a hard-on for from his recent country club outing.
Consultants, outsourcing, and fad-chasing aren’t efficient. They do not better the business, overwhelmingly. AI won’t magically fix broken pipelines, bad datasets, or undocumented processes, because it is only ever aware of what it is told to be aware of, and none of those groups have any interest or incentive in actually fixing broken things.
The tech industry is woefully and powerfully inefficient. It hoards engineers and then blocks them from solving actual problems in favor of prestige projects. It squanders entire datacenters on prompt ingestion and token prediction instead of paying a handful of basically competent engineers a livable salary to buy a home near the office and fucking fix shit. Its leaders demand awards and recognition for existing, not for actually contributing positively back to society - which leads to stupid and short-sighted decision-making processes and outcomes.
And all of this, as OP points out, is built on a history of government bailouts for failures and cheap debt for rampant speculation. There’s no incentive to actually be efficient or run efficient businesses, and this is the resultant mess.
Software companies have much higher profit margins than companies that ship physical products. There really aren’t many industries that do better margins than software.
To sell software, you don't need a production facility or a warehouse, nor do you even need an office building if you don't want one.
https://pages.stern.nyu.edu/~adamodar/New_Home_Page/datafile...
> I will add some commentary from my subjective POV in IT:
Subjective is doing the carrying, there. I am admitting up front that this is specific to me, my career, and the specific life experiences I’ve had with it thus far.
Like…I won’t even entertain the rest of your comment if you’re not even going to read the entirety of mine before vomiting out an “UhM aHkShUaLlY” retort.
The entire point of my comment is that a subjective anecdote from a very limited perspective isn’t telling a good story about why the industry is the way it is. You said these businesses aren’t efficient but they have extremely high profit margins, which objectively paints a different picture: they are sufficiently efficient.
All this financial risk in the industry where employees are over-hired and low interest rate VC funds are thrown around haphazardly is accepted because even a modestly successful software business has high profit margins.
The investors that put less than $10 million into Airbnb's Series A ended up significant shareholders of a company that reached the same market cap as PNC Financial Services in less than 20 years. PNC spent 180 years getting to the size it is today.
Software-driven products have huge profit potential and they can grow fast.
If you think I’m being some kind of pedantic loser, fine, so be it. You can put the blindfolds on and ignore a data-supported viewpoint.
If you are starting a dry cleaning business, you have a cost of the equipment, rent and other well known factors. Starting a tech company in a new and unproven area has different expenses and a different risk/reward profile.
Malinvestment can come in a lot of flavors. Cheap capital will result in too many dry cleaners and also too many startups that probably shouldn't have gotten funding.
The downside comes in various forms. 1) existing dry cleaning businesses are less profitable because of increased competition, and 2) startups hire scarce engineers and drive up wages, which drive up costs for everyone.
Cheap capital is justified because the goal is growth, but it is a blunt instrument that creates hot spots and neglected areas simultaneously. Compare the approach used in the US with the approach taken by China. Chinese firms face significantly more competition than firms in the capitalist US, but overall China's policies are crafted with a more deliberate eye toward their distributional consequences, and notions of the greater good are much more subject to sharp critique and pressure across social and industrial strata.
What we are seeing in the US is that policymakers have come to believe that the growth-focused approach is an escape hatch that can be used to reduce the effects of other bad decisions, but at some point the size of the inflated economy gets big enough that it takes on a political life of its own. Post-9/11, defense contractors have dramatically more lobbying and policy-influencing power than they had prior. Today, systemically risky financial industry participants have significantly more political clout than they had before the 2008 correction.
In other words, the fabric of (political) reality shifts and it becomes hard to identify what normal would look like or feel like. In my view, AI adds fuel to the existing fire -- it rapidly shifts demand away from software engineers and onto strategists -- give the team a strategy and now with AI the team will have it done in a few weeks. If not, a competitor will do it without poaching anyone from your team.
And market forces include both creative and destructive forces. Firm failure is a feature, not a bug.
This then causes the market to dry up again, and if the interest rate hasn't dropped even further, then a lot of companies that need follow-up investment will now get killed off. It's a very Darwinian landscape that results from this, and I've been wondering for years if there isn't a better way to do this.
ZIRP (especially the "double tap" ZIRP in 2021/2022) created this monster (bootcamp devs getting hired, big tech devs making "day in the life of" tiktok vids).
contractors give:

- instant scale up/down without layoff optics
- no benefits overhead
- no severance obligations
- easy performance management (just don't renew)
this mirrors what other industries typically do after large restructuring waves ... manufacturing got temp agencies and staffing firms as permanent fixtures post-rust belt collapse. tech is just catching up to the same playbook.
- A senior engineer is now often sufficient for most tasks; junior engineers seem like a burden rather than a boost during the development process
- Companies feel comfortable with hiring fast and firing fast
- The tech market is now flooded with not-so-good engineers who have solid experience with good AI coding assistants - which are already capable of solving 80% of the problems - and who are ready to work for much less than really experienced engineers
In general, yes, companies overhired many software developers, hoping they would keep hyper-growing, but then reality kicked in: this was not sustainable for most businesses.
So in the end you lose both your ability to acquire new knowledge and to keep existing knowledge.
There has to be a better way surely?
The actual reason tech companies overhire is because people get promoted based on the number of people that are "under" them. All leaders are incentivized to fight for headcount.
Tech changed a lot from 2010 to 2020. Prior to 2010 almost everything built required a huge amount of development effort, and in 2010 there was still a huge amount of useful stuff to be built.
Remember – prior to 2010 a lot of major companies didn't even have basic e-commerce stores because the internet was still a desktop thing, and because of this it really only appealed to a subsection of the population who were computer literate.
Post 2010 and post iPhone the internet broadened massively. Suddenly everyone was online and companies now had to have an e-commerce store just to survive. Only problem was that there wasn't a Shopify or even npm to build from... So these companies had to hire armies of engineers.
Similarly, there was no Uber, online banking was barely a thing, there were no real online streaming services, etc, etc, etc...
During this time almost everything had to be built by hand, and almost everything being built was a good investment because it was so obviously useful.
Around 2015 I realised that e-commerce was close to being a solved problem. Both in how most major companies had built out fairly good e-commerce stores, and also in how it was becoming relatively easy for someone to create an e-commerce store with almost no tech skills with solutions like Shopify.
I'd argue somewhere between 2010 and 2020 the tech industry fundamentally changed. It became less about building useful stuff like search engines, social media sites, booking systems, e-commerce stores, etc – these were the obvious use cases for tech. Instead the tech industry started to transition to building what can only be described as "hype products", in which CEOs would promise similar profits and societal disruption as the stuff built before, except this time the market demand was much less clear.
Around this time I noticed both I and people I knew in tech stopped building useful stuff and were building increasingly abstract stuff that was difficult to communicate to non-technical folks. If you asked someone what they did in tech around this time, they might tell you that their company is disrupting some industry with the blockchain, or that they're using machine learning to pick birthday cards using data sourced from Twitter.
I used to bring this up to people in tech, but so many people in tech at this time had convinced themselves that the money was rolling in because they were just so intelligent and solving really hard problems.
In reality the money was rolling in because of two back-to-back revolutions – the internet and the smartphone. These demanded that almost all industries make a significant investment in technology, and for a decade or so those investments were extremely profitable. Anyone working in tech profited from those no-brainer technical investments.
Post-2015 the huge amount of capital in tech and the cheap money allowed people to spend recklessly on the "next big thing" for many years. 2015 to 2020 was such an amazing time to be in tech because people were basically throwing money at you to build literally anything.
But time's up now. Companies are realising that a lot of the money they invested in tech in recent years isn't profitable and isn't even that useful. So now they're focusing in on delivering value and building up profit margins.
The tech market isn't broken, it's coming back down to reality. Like railway workers after the boom, we must face that most of the core infrastructure has now been built. A few of us will stick around making the odd improvement and maintaining what's already there, but that boom isn't coming back. Many of us will need to seek new professions.
Also, if you are using AI to even write such a simple blog post, then perhaps corporations are indeed using it for all kinds of purposes too, and that undoubtedly reflects on their hiring.
There is a huge part of the tech market that is just humdrum blue-collar software: keep the ERP system running, build a new efficiency report, troubleshoot why payroll missed Bob last week because of an unvalidated text entry field.
Or, because of the years of zero interest, tons more people went into software, so now it is over-populated, which puts pressure on regular humdrum software jobs.
This is interesting; in my experience it's seemed to be the opposite.
In manufacturing, it's much easier to come up with a specific number of employees you need for a given project or contract.
If the contract is expected to be signed, you may hire early to get ahead of it.
If the contract falls through, or an existing contract is cancelled, you know exactly how many people you need to cut to balance labor with your current commitments.
ZIRP, AI, over hiring, and a wave of boot camp labour supply I suspect all contribute.
Plus we're also likely approaching saturation on a lot of fronts, with attention and ad density. Things like YouTube seem to be on the edge of how many ads they can force-feed without people just not using YT because it's unusable. Enshittification isn't a cycle, it's a one-way trip. Combine that with overhiring and it's bound to hit a wall.
austin-cheney•1h ago
Then came COVID and the economy contracted. As a result the stock market changed to reward profitability. So, excess developers had to go. We are still feeling this.
I do agree that AI is not to blame for this. In fact I will go further and claim that AI is a net negative that makes this worse for the employer, by ultimately requiring more people who average lower confidence and lower capabilities than without it, but I say that with a huge caveat.
The deeper problem is not market effects or panaceas like AI. The deeper problem is poorly qualified workers and hard-to-identify talent. It's easy to over-hire, and then fire, when everyone generally sucks and doesn't matter. If the average employed developer were excellent at what they deliver, these people would be easy to identify and tough to fire, like engineers, doctors, and lawyers. If the typical developer were excellent at what they do, AI would be a complete net negative.
AI and these market shifts thus hide a lower level problem nobody wants to solve: qualification.
hrimfaxi•1h ago
If people can't identify qualified professionals without relying on credentials, they probably aren't qualified to be hiring managers.
lm28469•1h ago
Simple: the 80% of code monkeys who are not good at what they do will cause way more damage than the "professionals who are excellent at what they do". And outside of tech, I can guarantee you the vast majority of people use LLMs to do less, not to do more or do better.
It's also easily verifiable: supposedly AI makes everyone a 10x developer/worker, and it's been ~3 years now, so where are the benefits? Which company/industry made 10x progress or 10x revenue or cut 90% of their workforce?
How many man-hours are lost on AI slop PRs? AI-written tickets which seem to make sense at first but fall apart once you dig deep? AI reports from McKinsey & Co that use fake sources?
linuxftw•46m ago
LLMs are at least a 10x speed-up on the 'producing code' portion of the job, but that's often only a fraction of the overall job. There's lots of time spent planning and in meetings, doing research, dealing with corporate red tape, etc.
For writing code and unit tests, generating deployment yaml and terraform, for me it's easily a 30x speed up. I can do in 1 or 2 hours what would have previously taken a week.
servo_sausage•28m ago
We have been able to move our "low cost" work out of India to Eastern Europe, Vietnam, and the Philippines; we pay more per worker, but we need half as many (and can actually train them).
Although our business process was already tolerant of low-cost regions producing a large amount of crap; separate teams doing testing and documentation...
It's been more of a mixed bag in the "high skill" regions, we have been getting more pushback against training, people wanting to be on senior+ teams only, due to the llm code produced by juniors. This is completely new, as it's coming from people who used to see mentoring and teaching as a solid positive in their job.
austin-cheney•20m ago
Consider what qualified professionals in other fields can point to:
* Peer reviews in the industry
* Publications in peer reviewed journals
* Owner/partner of a firm of licensed professionals
* Quantity of surgeries, clients, products, and so forth
* Transparency around lawsuits, license violations, ethics violations, and so forth
* Multiple licenses. Not one, but multiple stacked on top of a base qualification license. For example an environmental lawyer will clearly have a law license, but will also have various environmental or chemistry certifications as well. Another example is that a cardiologist is not the same as a nurse practitioner or general physician.
Compare all of that against what the typical developer has:
* I have been employed for a long time
More elite developers might have these:
* Author of a published book
* Open source software author of application downloaded more than a million times
Those elite items aren't taken very seriously for employment consideration despite their effort and weight compared to what their peers offer.
sznio•31m ago
if not for covid, the zirp era would have ended more gently. covid overhiring was the last hurrah for companies to use the low interest rates. without covid, there wouldn't have been the overhiring and subsequent firing
the market would be as bad as now (or dare i say, *normal*), but it would be stable bad, not whiplash.
pjc50•1h ago
There's also the thought nobody wants to examine: what if the consumer market total spend is kind of tapped out?
linuxftw•51m ago
The F-150 Lightning had the same problem.