>As previously noted, the metrics from OpenAI and Anthropic are imperfect proxies for AI risk and usage, while still being the best available.
Seems they're just coming out and admitting they refuse to measure it themselves. Not a good sign.
One specific stupid manager will absolutely replace people, but the overall dynamic isn't any more broken than it used to be.
Personally, I think it's very surprising.
Yes, I'm learning how to use AI for coding in case it doesn't advance much further. But if AI really does what some people think it will, it won't matter whether you learned to use it or not: whole swathes of jobs, including software developers, will be obsolete. If your business boils down to being a middleman for an LLM, it's not long for this world.
What really matters is the rate of advancement.
And no, there won't be new jobs to replace them. This is less like industrialization, which created jobs before replacing old ones, and more like the automation that hollowed out whole communities and cities from the '70s to '00s. Services largely saved us from this, but I see no new sector to come and rescue us. And any re-orientation of the labor force to existing jobs will drive down those wages too.
- I've sold software to several mid-scale production firms. Folks that do everything from Netflix title sequence designs to pharmaceutical television ad spots. They're billing at less than a quarter of their previous rate and picking up more clients on account of AI. They're downsizing the folks that do not do VFX or editing.
- A neighbor of mine who is a filmmaker was laid off last week. If you've flown Delta, you've seen his in-flight videos. His former employer, whom he had worked for for nearly a decade, is attracting clients that are hiring them for AI work. My neighbor was not attached to any of those efforts.
- Major ad firm WPP is laying people off. Some of this is the economic macro and decreased ad spend. Another of my neighbors works for them and they haven't had any major projects. She typically manages major F500 clients. They're not spending. Despite that, she says some of the inter-departmental woes are directly attributable to AI.
- I spoke with former members of SAG-AFTRA leadership (before Sean Astin came on board). They quit on account of AI. "The writing is on the wall," they said. Direct quote.
I seem to remember the latest tools for software developers were pushed into the business organisation by the developers themselves - and eventually the folks at the top relented and accepted them.
When the reverse is happening, alarm bells should ring.
But hey, I'm not against these CEOs destroying the culture within the firm and making their employees hate their guts, resulting in negative productivity gains.
Including a real-life LLM "resurrection" of the fictional Erlich Bachman, created as part of a successful espionage mission to steal a Chinese deep learning company's near impossible distillation technology. But despite its trove of valuable illicit information, it has been orphaned online, unable to find its mysterious SV-fan hacker creators. As a result, chatErlich is now desperately attempting to make contact with the original SV team actors, who it actually believes are their fictional counterparts.
Holy shit does your average startup manager send emails like that?
We’ve also reduced the hours we work per week. We care about getting things done not time behind a screen.
Normally, as we add enterprise customers, we have to dedicate more dev resources to keeping them happy. But since Claude Code and now Codex, we haven't felt that feeling of not being on top of the work, and thus haven't felt the need to hire more devs.
That is not true IMHO.
If one is expecting Lovable to create a production app by just giving a few prompts, that obviously is not going to happen, not now and most probably for a long time.
However, if you use Claude Code or one of the proper IDEs, you can definitely guide it step by step and build production-quality code, possibly code that's even better than what most software engineers out there would write.
Moreover, these tools allow you to take your proficiency in software dev and specific languages/frameworks to other languages/frameworks without being an expert in them, and that I think is a huge win in itself.
It may be an excuse to lay people off, but it's not ramping up velocity the way the PR makes it seem to the non-tech-literate.
From my experience, these are better suited at the moment for small teams and new projects. It’s unclear to me how they’ll work in large team/massive legacy code situations. Teams will have to experiment and come up with processes that work. IMHO anyway.
But it's definitely had an effect on jobs.
It's made so many underqualified people think they have a new superpower, and made so many people miserable with the implied belittling of their actual skills. It's really damaging work culture.
Of course studies like this are aimed at people who think jobs are interchangeable neutral little black boxes that can be scaled up and scaled down, and who don't like to think about what they involve.
> Overall, our metrics indicate that the broader labor market has not experienced a discernible disruption since ChatGPT’s release
Because metrics don't tell the story.
Title implies all things AI, when they were actually looking at GenAI. I know it's what everyone thinks of, but I hate how everything gets muddled.
I suspect AI is currently fashionable as a smokescreen to justify deep cost cutting (See MSFT example.)
I also had a vibe-coded prototype handed to me to fix.
I suppose at this point there's no debate over whether we can call ourselves "engineers" anymore. I can't imagine a civil engineer saying "they are accomplishing their task (getting a bridge out there that works)",
> The debate over whether AI is taking people’s jobs may or may not last forever. If AI takes a lot of people’s jobs, the debate will end because one side will have clearly won. But if AI doesn’t take a lot of people’s jobs, then the debate will never be resolved, because there will be a bunch of people who will still go around saying that it’s about to take everyone’s job. Sometimes those people will find some subset of workers whose employment prospects are looking weaker than others, and claim that this is the beginning of the great AI job destruction wave. And who will be able to prove them wrong?
Let's focus on the tech firms that produce software.
Two things should happen if AI proliferates into software development:
1) Top line increases - due to more projects being taken on as labour becomes more productive
2) Operating margin increases - due to labour input declining and more cost-reduction projects being taken on
If those two things don't occur, the AI investment was a waste of money from a financial perspective. And this is before I even discount the cash flows by the cost of capital of these high-risk projects (a high discount rate).
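As a rough back-of-the-envelope check (all figures below are made up purely for illustration), the same spend that looks roughly break-even at a low discount rate goes clearly underwater once you apply a discount rate appropriate for high-risk projects:

    # Hypothetical AI investment: $5M up front, an assumed $1.5M/year of extra
    # profit or cost savings for four years (illustrative numbers only).
    def npv(cash_flows, discount_rate):
        """Net present value of yearly cash flows, year 0 first."""
        return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

    investment = [-5_000_000, 1_500_000, 1_500_000, 1_500_000, 1_500_000]

    print(round(npv(investment, 0.08)))  # about -32k: roughly break-even at a low rate
    print(round(npv(investment, 0.25)))  # about -1.46M: clearly negative at a high-risk rate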
At some point everyone will be analysed in this manner. Only Nvidia is winning as it stands, and ironically not because of LLMs themselves, but because they sell the hardware that LLMs run on.
Internally, they could actually write 1,000X more software and it will be absorbed by internal customers. They will buy less packaged software from tech firms (unless it's infrastructure), and internally they could keep the same headcount or more, as AI allows them to write more software.
Say idk, we add additional regulatory requirements for apps, so even though developers with an AI are more powerful (let's just assume this for a moment), they might still need to solve more tasks than before.
Kind of how oil prices influence whether it makes sense to extract it from some specific reservoir: if better technology makes it cheaper to extract oil, those reservoirs will be tapped at lower oil prices too, leading to more oil being extracted in total.
When it comes to the valuations of these AI companies, they certainly have valuations that are very high compared to their earnings. It doesn't necessarily mean though that replacement of jobs is priced in.
But yeah, once AI is capable enough to do all the tasks humans do in employment, there will be no need to employ any humans at all for any task whatsoever. At that point, many bets are off as to how it will hit the economy. Modelling that is quite difficult.
Also at this point the current ideas of competition go wonky.
In theory most companies in the same industry should homogenize at a maximum, which leads to rapid consolidation. Lots of individual people think they'll be able to compete because they 'also have robots', but this seems unlikely to me except in the case of some boutique products. Those companies with the most data and the cheapest energy costs will win out.
AI has no skin in the game: you can't shame it, fire it, or jail it. In all critical tasks, where we take risks with life, health, money, investment, or resources spent, we need that accountability.
Humans, besides being consequence sinks, are also task originators and participate in task iteration by providing feedback and constraints. Those come from the context of information that is personal and cannot be owned by AI providers.
So, even though AI might do the work, humans spark it, maintain/guide it, and in the end receive the good or bad outcomes and pay the cost. There are as many unique contexts as people, contextual embeddedness cannot be owned by others.
From "House restores immediate R&D deduction in new tax bill" (2024) https://news.ycombinator.com/item?id=39213002 .. https://news.ycombinator.com/context?id=38988189 :
>> "Since amortization took effect [ in 2022 thanks to a time-triggered portion of the Trump-era Tax Cuts and Jobs Act ("TCJA" 2017) ], the growth rate of R&D spending has slowed dramatically from 6.6 percent on average over the previous five years to less than one-half of 1 percent over the last 12 months," Estes said. "The [R&D] sector is down by more than 14,000 jobs"
> Hopefully R&D spending at an average of 6.6% will again translate to real growth
From "Generative AI as Seniority-Biased Technological Change" https://news.ycombinator.com/item?id=45275202 :
> Did tech reduce hiring after Section 174 R&D tax policy changes?
[...]
> From https://news.ycombinator.com/item?id=45131866 :
>> In 2017 Trump made businesses have to amortize these [R&D] expenses over 5 years instead of deducting them, starting in 2022 (it is common for an administration to write laws that will only have a negative effect after they're gone). This move wrecked the R&D tax credit. Many US businesses stopped claiming R&D tax credits entirely as a result. Others had surprise tax bills
> People just want the same R&D tax incentives back:
> "Tell HN: Help restore the tax deduction for software dev in the US (Section 174)" (2025 (2439 points)) https://news.ycombinator.com/item?id=44226145
It is suspected that hiring levels correlate with the loss of the immediate R&D deduction.
The TCJA (2017, Trump) ended immediate expensing of R&D costs, requiring amortization instead starting in 2022.
The OBBBA (2025, Trump) restored the immediate R&D deduction for tax year 2025.
Automation seems to be a better excuse than outsourcing
Every year, large companies secretly rank employees and then yank the 10% or so they consider low performing. This is called rank and yank [1]. If your company has performance reviews and is run by MBAs, it almost certainly uses it.
[1] https://en.wikipedia.org/wiki/Vitality_curve
The most important aspect of rank and yank is that it has to be done in secrecy. Your company will not tell you it is using it. Even your manager might not know this.
When rank and yank is not done in secrecy, employees react to it by hiring the most mediocre people they can, sabotaging/isolating strong performers, hiring to fire, forming peer review/code review mafias, avoiding helping others as much as possible, etc. Anything they can do to not land in the bottom 10%. This cannibalizes the company and an example is what Ballmer did to Microsoft.
Any person with a ChatGPT account can now ask it to analyze the "game" of rank and yank from the perspective of game theory and realize how dumb the whole idea is. The rational strategy for the employee is to destroy the company from within. But MBAs love it because it involves a made up statistical distribution.
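A minimal sketch of that game (with toy payoffs I made up: keeping your job is worth 1, getting yanked is worth 0, and a tie in relative ranking is a coin flip):

    def payoff(me, other):
        """My expected payoff under forced relative ranking, given both strategies."""
        if me == "sabotage" and other == "cooperate":
            return 1.0   # I look relatively better; the cooperator gets yanked
        if me == "cooperate" and other == "sabotage":
            return 0.0   # I get yanked
        return 0.5       # symmetric play: coin flip on who lands in the bottom bucket

    strategies = ["cooperate", "sabotage"]
    for other in strategies:
        best = max(strategies, key=lambda me: payoff(me, other))
        print(f"If my peer plays {other}, my best response is {best}")
    # Sabotage is the best response either way, so (sabotage, sabotage) is the
    # equilibrium, which is the worst outcome for the company.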
The only truth about rank and yank is that it's a stupid idea that has impacted the careers of millions of hard working people around the world, while also impacting many families and their future. It has converted thousands of companies into horrible places to work filled with workplace psychopaths at the top.
MBAs are people who believe in the work of the person who kickstarted the decline of American manufacturing, Jack Welch. Jack Welch extracted record profits from GE for 20 years, but left it a hollowed-out "pile of shit" according to his successor. The worst part is that MBAs aspire to be like him, and in the process they have ruined the whole manufacturing industry.
So to pull off a rank and yank every year you need a scapegoat, and this year the scapegoat is AI. In previous years it has been the economy, or some other excuse. AI will naturally become the scapegoat for everything.
Have you ever wondered why your company is laying off people while having job postings for the same positions? Does it happen every year? Does it happen after performance reviews? Is it always around 10% of the workforce? Oof... that's a tough guess, I wonder what it might be!
AI is the perfect scapegoat because the company can claim they're using AI and boost their value somehow. But if AI could reduce your headcount by so much then your company, your business model, your processes, your intellectual property, etc. have no intrinsic value anyways and the correct interpretation of the situation is that everyone should divest and make the share price go to zero.
Over 100,000 people. That's only one company, and only during his tenure.
Now add up all the terminations at every company that adopted his corporate astrology bullshit. Millions of people and the number increases every year.
How many of those people went into financial hardship, homelessness or even worse? All because of 1 person.
To put things in perspective, you could fire every working person in all of New York City and you would still have fired fewer people than his destructive legacy did.
> "Overall, our metrics indicate that the broader labor market has not experienced a discernible disruption since ChatGPT’s release 33 months ago, undercutting fears that AI automation is currently eroding the demand for cognitive labor across the economy," said Martha Gimbel, Molly Kinder, Joshua Kendall, and Maddie Lee in a report summary.
> I know for a fact
That's not what a fact is; if we took everything written on businesswire or what the business owners / salespeople told us at face value then we'd be in deep trouble.
Ironically the thing broken in most cases is poor quality management that let things get so bloated and messy in the first place… the same folks that are cluelessly boasting about the potential of AI in their company.
Doesn’t seem to be that outdated
And the data goes up to 33 months since ChatGPT's release, or in other words Nov 2022 + 33 months = August 2025.
- Companies building AI models & tools - this one is obvious.
- Executives using AI to justify layoffs - there have been constant rounds of layoffs across corporate America since ~2021, but recent ones have been rebranded as "AI taking the jobs" so no one points to the obvious corporate mismanagement, offshoring and greed.
- Bosses using AI to push employees to work harder - I have personally seen this at my own company. AI is an excuse to increase forced attrition. "You aren't good enough" is harder to justify, so now it is "you aren't using AI well enough".
Real-world use cases of AI meanwhile haven't really moved beyond the prototype stage.
Not sure that is true at this point?
This study comes far too soon. In 2030, if nothing changes in the meantime, we might know the effect that AI as it exists in 2025 has had on employment.
The core intuition for this phenomenon is that human society overall takes the tech productivity gains to do more things overall, creating new goods and services. The broader range of goods and services overall also enables more people to find work.
Put another way: "One thing I love about customers is that they are divinely discontent. Their expectations are never static – they go up. It's human nature. You cannot rest on your laurels in this world. Customers won't have it." -- one of Bezos's Amazon shareholder letters.
One of my favorite counterintuitive examples: the biggest economic gains from the 1800s Industrial Revolution actually came from the humble washer/dryer. By making routine housework 100x more efficient, this (along with other home appliances) allowed more women to enter the labor force, nearly doubling labor force participation within a couple of generations. Though, at the beginning, lots of people were opining about homemakers losing a sense of purpose or relaxing all the time.
It's certainly possible that this study is just reinforcing the researcher's biases from their previous understanding of the economics of innovation, and also possible that this study is accurate today but conditions will change in the future. That said, I believe the burden of proof is on the pundits claiming cataclysmic job loss, which is counter to economic historians' models of innovation.
What you're saying is a common understanding, but it's a false one, rooted in Victorian-era attitudes towards medieval peasants that really have nothing to do with reality.
The most important thing to understand about peasant farmers is that their economic prospects are tied to the availability of land, and land is a finite resource of which there is not enough and no more can be had. Most pre-modern societies are set up to extract every possible extra amount of food produced, which basically means that in times of plenty, you get more people who have no work available for them (which means they up and leave to the cities, the only places which have the sufficient labor pool).
> People liked being farmers, people liked owning their own land, people liked being their own boss, people liked feeding themselves, people liked to be independent and self-reliant.
Oooh boy. There's a vast array of different socioeconomic statuses varying through time and space, but broadly speaking, most peasants did not own their own land, and even the majority of people who did own their own land did not own enough to feed themselves from it. And even if you did own your land, and enough of it to feed your family, you probably still need to borrow the plow and oxen teams, and other farming implements, from your local lord. And since you are perennially on the edge of starvation, you're not independent and self-reliant; you're entirely reliant on the village's communal support to help you get through those times when your fields were a little bare.
Pretending that medieval peasantry was some sort of idyllic lifestyle is exactly the kind of Victorian-era fantasy you're decrying.
What peasant life offered wasn't comfort but stability. Peasant life may suck, but at least you knew what you were in for. If you moved to a city (let alone further away), you left your support network, you left everybody you knew, maybe for a shot at a better life... but with essentially no recourse if anything failed. Or you could stay, where things wouldn't get better, but they also wouldn't get worse. Unless there were a major calamity and staying wasn't an option.
https://acoup.blog/2025/07/11/collections-life-work-death-an...
It's not done yet, and I am eagerly awaiting the end results. That said, from what I can tell from his writings, jcrammer is mostly correct. The peasant life - the modal life - was just awful hard work for many decades. It was not nice and it was not better than the factories most of the time. Yes, there were bad factories, a lot of them, but they lasted a brief time. The Factory Act in Britain came in 1833, only a few decades after factories were even a thing.
Aside: We really need better education in labor laws overall.
That's not as true of the US's development. There was such an abundance of land, and rapid expansion made it easier and easier for new landowners to grab acres of land. America to this day is still very sparsely populated as a country.
US farmers had a bunch of land and possibly slave labor. They had little need to adapt to new tech. And yes, stability is key if you have it; it's a fleeting feeling (even to this day).
Until the local lord took a fancy to a different type of agriculture, drove you off and ploughed your village back into the soil!
I've been watching Time Team recently and this seems to come up semi-frequently. Your family could have been there for 200 years; no matter, bye now!
The middle ages saw the growth of cities, commerce, increasingly industrial agriculture, etc. It also saw non-peasant societies like the Vikings, Muslim civilization, etc.
There were massive social safety nets in the form of guilds, religious orders, and political patronage organizations. Disease was a much bigger threat than starvation.
You're right that the Victorians, like the Pre-Raphaelites and the Oxford Movement, fetishized medieval life. But that was a reaction against the anti-medieval propaganda begun by the Tudors, in place since the Plantagenets were defeated at the Battle of Bosworth Field (1485).
Surely the Viking and Muslim kingdoms had their base in farmers? I mean, neither were nomadic civilizations.
It really isn't. This is the accepted term in academia, so long as you don't dispute the use of the term "medieval". While the period did see an outgrowth of cities, the vast majority (80%+) of the population was rural in almost every country throughout the period, including Scandinavian ones: Viking was a profession, not a culture, and the vast majority of Scandinavians, including Vikings, were in fact farmers, which is evidenced by how, once the "Great Heathen Army" secured a foothold in Northumbria, they proceeded to build a bunch of farms, thus all the places now named "-by" in northern England. Similarly, "Muslim civilization" was not a monolith. Yes, Arabia is not exactly conducive to settled agriculture, but Egypt and the Levant -- the political heartlands of Arab civilization from the 8th century onwards -- were among the most agriculturally productive lands on earth, and a similarly large proportion of their population was engaged in the hereditary profession of agriculture as a result.
> There were massive social safety nets in the form of guilds, religious orders, and political patronage organizations. Disease was a much bigger threat than starvation.
This is true, and something very commonly overlooked. People think of medieval life as "brutish and short" but in reality these were stable, largely prosperous societies.
> most peasants did not own their own land,
If you mean most individuals then sure, but on a per household basis actually most peasants did indeed have land to call their own. E.g. reference The Decline of Serfdom in Late Medieval England. The "bottom rung" was predominantly half-virgater villeins, who were unfree leaseholders with barely enough to feed their family -- but they did still 'own' their land insofar as they had perpetual usufruct to it, and by the early 1500s paid essentially nothing in rent due to inflation and hereditary fixed rent levels.
> What peasant life offered wasn't comfort but stability.
Sure, I don't disagree. I never tried to claim that peasant life was great and enjoyable -- but people, especially people with kids to take care of, tend to value a baseline of stability above all else.
Historically most farmers were some form of serf. So I think it was common.
If your ten-year-old apprentice died because of your negligence, it damaged your business in the day-to-day, in the long run, and reputationally. So you kept your apprentices alive, and you ultimately had to feed them.
For the first few decades of the industrial revolution, if a kid died in a factory situation because they lost concentration out of hunger and exhaustion, so what? Get another kid. Deaths of semi-skilled labour happened at scale, because the agency of those who were looking after them was taken away.
It really did take mass protests and labour organising to deal with it. Industry has an institutional memory of what that cost them, which is why unions are treated the way they are now, and why they are so urgently needed again.
The Luddites were skilled artisans in the textile industry. They often worked from home, owning spinning and weaving equipment and acting as what we’d call independent contractors today.
The mechanization of the textile industry resulted in work that required less skill and had to be performed in a dangerous factory for suppressed wages that were determined by a cartel of factory owners rather than a robust market of small makers.
Sitting here 200 years on from the Industrial Revolution it seems to be an obvious good. But it sure did not sound like an appealing thing to live through if you weren’t one of the few owners of the means of production.
Scrub through this report from ABC so your stomach can do backflips at how badly externalities go untracked in modern prices:
We've never had that before.
Factories can scale to the point that a single factory for a lot of products can meet the needs of all of humanity. Add in economies of scale, and the number of jobs doing actual work on the actual product decreases for any one product over time.
Information technology can scale to the point that any one company can manage vast oceans of data about processes and conditions in the business, exercising large amounts of control over economies.
Shipping and transport are fast, cheap, and ubiquitous. That huge factory from above can make anything then ship it anywhere cheaper than you can make it right next door.
Robots and further AI automation can ensure the investor class gets even wealthier by not having to pay for things like health insurance covering the 16 hours a day you don't work. What's more, there is never a labor shortage caused by pesky striking workers asking for more pay or better conditions.
All of the above set up the conditions for the consolidation of control of almost everything by a very small number of entities.
But with AI the skills gap is huge. Taking software engineering as an example, previously a motivated bootcamp graduate could start churning out basic CRUD apps and be useful. Now a motivated senior engineer with AI agents could churn out an app a day. So the minimum useful skill-level is now at the level of a senior engineer.
And this is possible because of yet another thing we never had before: A "universal" knowledge worker machine. Knowledge work was "safe" because it was a relatively high-skill category and automation was only possible through software, which was expensive and slow to build, largely due to a shortage of software developers.
Now we potentially don't even need software for a huge range of cognitive tasks! Of course, the stochastic nature of AI probably means we will be writing software for a long time yet, but as above, existing (senior) developers can scale themselves up much, much faster.
But it’s the transition that is the problem. The people living through the transition have to go through hell.
You see this all the time when some new technology, especially an information disintermediation technology, gets compared to the printing press. "The printing press broke the monopoly on knowledge and brought Europe out of the Dark Ages!" Yeah, but first it killed millions of people in a century of warfare. Do the people in an equivalent position now get a vote, or are they acceptable casualties for the glorious hypothetical future?
The answer seems to be we get no vote
I'm not happy about it
Are you accounting for the lives saved through better technology?
As we see here, this tech is only taking and not giving much back.
I'm just demonstrating that advancing technology and throwing a whole job sector out with nowhere to go is not mutually exclusive. And that the lack of any such conversation means we'll just go down the same bloody path as last time in history.
As usual it seems like there's only one box left when a new technology tries to strongarm its way into society. The invention of the personal computer avoided a lot of chaos by doing all of the above.
The author contends that free trade has become a tool favoring corporate interests at the expense of workers’ dignity, community, and fair participation in economic decision-making.
Beyond that, they illustrate that net economic gains _should_ be able to also lift those same workers, providing enough surplus for re-training, but they don't. The gains flow to the top and the workers are left in a worse position.
What exactly did your professors mean by "economic gain"? It'd probably take an entire thesis to unpack what that means and all the ambiguities that anyone could drive a train through.
The only way I can square that circle is if they meant that women's economic prospects declined so much with the automation of textiles that they had nothing left but small efficiency gains. Maintenance tasks like clothes washing don't produce any economic value outside the household, whereas spinning/weaving/sewing produces something the household can sell, trade, or gift. Early textile mechanization took place decades before washing machines were even a novelty, so sure, "biggest economic gain" as long as you're measuring from a local maximum.
Instead the opposite may have occurred: when cloth became cheaper, the amount of clothing increased, greatly increasing the amount of washing the masses had to do.
This is very, very incorrect. Egregiously so. All evidence from antiquity to pre-modern Europe contradicts it.
Up until the mechanization of textiles, spinning and weaving took up the majority of women’s work time. Even noblewomen were painted doing the act. Annual labor hours were comparable to unskilled men in agriculture, even after mechanical innovations.
If you’d like I can give you academic citations but acoup.blog written by an actual historian just ran an entire series on this topic: https://acoup.blog/2025/09/26/collections-life-work-death-an...
Subsistence for a small family meant nearly 1800 hours a year of work on just textiles, accounting for a single woman’s labor over an entire year. That’s just 200 hours short of a modern 9-5 work year. On just textiles.
Nowadays, the new strategy is "you will own nothing, have worse more expensive services, and like it". The mood has completely shifted as of late. It feels more like the shareholder's demands are never static.
That was basically true from about 1830, when railroads came out of beta, to 1979.[1][2] During that period, wages roughly tracked productivity. There are solid US stats on this from at least 1950.
This has happened in other countries, too, but generally later than in the US. Here's a study from Japan.[3] In the UK, productivity and wages parted company around 2008.[4]
There are stats for China, but they only go back to 1995 and aren't taken that seriously even in China. Plus, China now has 200 million gig workers (!)[5] and data collection on them is weak. Official workforce size in China peaked in 2015, which may indicate that many gig workers are not being counted. This needs to be looked at more closely.
Productivity is going great, but the gains don't go to workers much.
[1] https://www.epi.org/productivity-pay-gap/
[2] https://fredblog.stlouisfed.org/2023/03/when-comparing-wages...
[3] https://www.rieti.go.jp/en/publications/summary/25090019.htm...
[4] https://fraserofallander.org/link-labour-productivity-wage-g...
[5] https://www.economist.com/leaders/2025/09/18/chinas-200m-gig...
Most people eventually find some kind of job. The alternative is being homeless.
The 25-55 employment-to-population ratio is now the highest on record, barring a few years around 1997-2000 during the dotcom boom, and even then only by a tiny bit. Also, remember that employment is now suppressed by many people going FIRE, which was a rare exception 25 years ago. Many of my friends left the world of work for good years ago, all in their late 30s or early 40s.
I think a combination of a higher minimum wage and a shorter standard work week - introduced simultaneously in discrete steps over three or four years, say - would transform lives on a massive scale.
But unfortunately everything in America seems to be at best ossified, and at worst headed into the dumper.
People want inflation when it's their own wages, and they want labour supply to be constrained to achieve that. But as soon as the wage inflation feeds through to price inflation, they get very upset. Transient but significant inflation was one of the successful attacks on Biden's economic record, for example.
Wages are flat because it's policy for wages to be flat so that prices can be flat.
This is ridiculous because prices haven't been flat but wages largely have
Take Klarna. They laid off 700 people, realized it was a mistake, but they are hiring people back as gig workers [0]. Not proper employees with a salary and benefits. The thing about the US in particular is that a job is not just a job. It's your social safety net, as too many welfare programs have onerous work requirements. Employers know this. They have way too much power, probably more than ever before in our lifetimes. AI gives them that much more power.
0 - https://www.livemint.com/companies/news/klarnas-ai-replaced-...
There's a strong social element to jobs - it's not just money, it's your friends. But that's an easy fix too - forming social units not based on employers. Unions used to fill this role, as did guilds, and just plain clubs.
These aren't huge problems.
It's not about "global equilibrium", because that completely misses the fact that hiring a very expensive westerner still has positive ROI, but only if you are attempting to innovate.
Projects are being cancelled left and right and teams dissolved, so this tells me that the former point is shaky. Something tells me that we'll have a tiny boom sometime after the AI bubble pops and suddenly innovation is "free" again. Or because smaller startups can actually get funded without needing to throw AI somewhere in their pitch deck.
I’m not so optimistic. But glad some people are!
I think there are many reasons for the AI hype, but one of the basic ones is that it's the only way to keep the economy propped up. It doesn't matter if it's an illusion or not; it means money is flowing in many directions (even if a shocking number of those flows are accounting tricks).
What we're watching is some mass hysteria like tulip mania. There are many, many people who benefit from this situation independent of whether or not its an illusion.
And maybe that bubble will pop and maybe it will soon, but when it does, most of us will wish it hadn't.
It will, and it will be huge. It may in fact be the trigger of the next Great Depression. It's going to be ugly.
But maybe that's the pain the US needs. This situation was decades in the making, and the US decided instead to blow up the debt to unmaintainable levels. There's very little rope left this time if we have even a recession without hyperinflation (let alone a depression). It will be bad, but I'm also beyond tired of everyone kicking the ticking pipe bomb down the road for the next generation to handle. The sooner it blows up, the better chance we have something left to clean up. Maybe we can bring some integrity back to the country.
Then you add the millions and millions of dollars tech companies spent promoting coding as a career, and the organic attraction of high salaries that caused CS to become the most popular major at many schools and pushed droves of people into coding bootcamps.
The free money stopped, and without that subsidy we probably have 50% more programmers than we really need.
Why would they deliberately give up their profits by doubling headcount just to lay off around 10% in aggregate? They still have to pay the remaining 90%… so they just did that so the 10% could be laid off?
Yes
>To what gain?
- reporting explosive growth
- launching a lot of initiatives, more than half of which are probably shelved as of 2025. Seriously, peek into any company and see all the odd products they were announcing back then.
- to gain more of a global foothold. If you want to operate in a new country, you'll need to hire a lot of staff to manage compliance, run offices, work with local companies, etc.
- to have a lot of bodies to throw at a problem should they need it. Since hiring was basically "free", it was convenient to have talent on hold just in case.
- to poach, be it from competitors or from potential future competition. There have been reports of skilled hires who'd proceed to be on standby for quite a while, but paid very well. That's not something you do normally.
>Why would they deliberately give up their profits by doubling headcount just to layoff around 10% in aggregate?
That's the oversimplification of S174: they weren't losing much money, because those labor costs were balanced out by the taxes they didn't have to pay. It roughly cancels out.
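A rough sketch of the year-one mechanics with illustrative numbers (assuming a 21% corporate rate and the 10% first-year deduction that five-year amortization with a mid-year convention allows):

    salaries = 1_000_000               # dev payroll treated as Section 174 R&D expense
    tax_rate = 0.21

    old_deduction = salaries           # pre-2022: fully expensed in year one
    new_deduction = salaries * 0.10    # 2022 onward: amortized, only 10% deductible in year one

    extra_taxable_income = old_deduction - new_deduction
    print(extra_taxable_income * tax_rate)  # roughly $189k more tax in year one per $1M of payroll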
Layoffs started quickly once that ended. Remember, they don't see layoffs as a "bad thing" as a business. The stock even bumped up a tiny bit back in 2022/2023 on layoff news. Something about "being responsible" or some dreck.
S174 will be back next year, but I think there's multiple other issues that mean the market won't quickly bounce back.
Fed rates increased, ad yields plummeted, and the cooling off of the COVID tech boom meant that companies quickly left "growth mode". The transition to maintenance mode also caused a new wave of outsourcing to cut the cost of North American devs once the money was no longer free.
That's why hiring actually isn't down as much in earning calls. They are still hiring, just not from here.
That's a big part of where things like microservices come from.
>Moreover, look at the hiring numbers in earnings calls. These companies did not slow down hiring. They slowed down hiring in North America.
I don’t think this is true. IT jobs in India are down 10%.
https://timesofindia.indiatimes.com/business/india-business/...
The only way this won't happen is if at some point AI models just stop getting smarter and more autonomously capable despite every AI lab's research and engineering effort.
We had such a pushback against "bullshit jobs" in the tech sphere last decade, but we seem to have fallen right into them. At least while the money is flowing to enable it.
"We define workslop as AI generated work content that masquerades as good work, but lacks the substance to meaningfully advance a given task."
ChatGPT is basically pushing the real work on to the next person along the line who then has to fix up the generated workslop.
That seems to indicate they've got a bad metric, since I would expect both of those to have an impact higher than their control.
1) they’re assuming that every company in the world is using AI aggressively.
Yes, ChatGPT is popular, but there are still a LOT of companies that have not adopted AI in the enterprise.
If they wanted to analyze the impact of AI on the labor market, they need to analyze the mix of employees in companies that actually are actively implementing or aggressively using AI.
2) They did not mention how many people they sampled among the unemployed, but if it's something like 4% or so, the number of software engineers in that sample is probably pitifully low (i.e. ~10).
Definitely not enough to make a conclusion.
For me, the second effect is more prominent: we're still maintaining/hiring the same staff, but we are now taking on qualitatively different things that we would previously not have accepted for complexity or workload reasons.
https://theconversation.com/does-ai-actually-boost-productiv...
The problem is not just the data; working with aggregated data also depends on how the data categories are defined. Only after decades, and after lengthy debates, might they define a new category for such surveys, and only then register a significant shift in the employment mix. For example, we could argue that software programming is also largely a production job, because programmers produce custom software for clients, and the computer is only a tool like other machines! Seen that way, I guess the job mix has not even changed much since the Industrial Revolution!
But for fast-changing situations, such a view can be too shallow and harbor dangerous blind spots. Of course it always depends on the perspective. If we care only about whether there will be more unemployment or the disappearance of a whole job category, then yes, the Yale report and the like are helpful. If people instead care about the two million call-center jobs in the Philippines or the difficulties fresh CS graduates face in the job market, then such reports could create a dangerous complacency.