In RTP, the office space built out during the boom has never been fully utilized, even to this day.
These tech bubbles come and go, but leave enormous craters of destruction in their wake.
They also leave a select few propped up to move on to their next entitled adventure [0].
[0] https://web.archive.org/web/20100326134422/http://www.phonep...
You can already see some of the bigger players trying to squeeze more money out of their victims, pardon, users, in order to maintain at least the illusion of a future where they are profitable. We also see some players, e.g. Apple, give up on the race, and other players, e.g. Microsoft, Google, and OpenAI, swap personnel and positions in the race, which should be a good signal that it's a mined-out field and they are scrapping to be the one that grabs the few remaining spoils before it all crumbles down. Very bearish on LLMs.
Along comes a snake oil salesman who says that LLMs are the bees' knees, and you take it as a given without any critical thought.
But when people actually have evidence that LLMs are nothing special and can even reduce productivity (by 19% for experienced developers): https://secondthoughts.ai/p/ai-coding-slowdown you scream to the heavens for proof.
I am the last person to stop the AI fanbase from outsourcing their critical thinking, and some of their wallet, to a fancy autocomplete; that means more work for those of us who actually do the thinking.
LLM vibe-"engineers" will be fleeced for $$$ when Anthropic actually starts wanting to be profitable: https://news.ycombinator.com/item?id=44598254#44602695
Their skills deteriorate, and so does their critical thinking: https://www.mdpi.com/2075-4698/15/1/6
And I just can't stop giggling imagining their workday when Claude is down.
LGTM is not a sustainable career choice, people.
Hard pass on spending time debating with the fanbase of a tool whose claim to fame is the dubious benefit of allowing its users to avoid thinking critically for themselves.
As soon as somebody actually tried to measure LLM productivity for devs, it became evident that it reduces productivity by about 19% for experienced engineers:
https://hackaday.com/2025/07/11/measuring-the-impact-of-llms...
But sure, you can believe any x-AI/e-acc bro (with 0 years of dev experience but dog-like obedience to their mind-influencers Musk or Sama) who suddenly became a self-proclaimed expert and claims a 319,491% increase in productivity. It's Schrödinger's LLM gains: 1337% on X, -19% in a study.
Those are the same people who back in the day proclaimed that NFTs, web3, and crypto were the next coming of Jesus. These are the types of people who hang on every statement of Andrej Karpathy and take it as gospel, while failing to realize that's the same guy who told you self-driving cars would be a reality by 2019. I sometimes wonder how they are not ashamed of themselves, but then I realized shame requires critical thinking and context/memory, something severely lacking in LLMs, but also, seemingly, in the LLM fanbase.
Funnily enough, these e/acc bros are the ones who benefit the most from LLMs. If you are used to offloading your critical thinking to accounts on X, offloading it to a fancy autocomplete instead doesn't seem like such a big step down.
But in reality, the cold fact is that if LLMs actually helped with productivity, we'd see a noticeable impact on open source. And what do we have instead? Either crickets or slop, an insane amount of slop.
https://www.theregister.com/2025/02/16/oss_llm_slop/
https://www.infoq.com/articles/llm-surging-pr-noise/
And AGI is juuust around the corner, trust me.
"Being a bubble" and "companies are using it in production" are not mutually exclusive.
It’s not incremental, it’s revolutionary. Nothing that has come before has such power and capability.
At the same time, when studies are coming out showing that experienced developers lose 19% of their productivity using AI tools, it makes me question whether it's not a devolution. Especially considering how wildly unprofitable Claude is to run at the scale where it's at best a net neutral for the average dev, where is that revolution you are talking about?
Is it the same revolution as NFTs or blockchain or whatever web3 was? Because I am still waiting for those.
If it's as revolutionary as you say, why are companies laying off people, when higher productivity per employee should mean that more employees multiply the advantage from AI? Why aren't early adopters running circles around competitors and producing larger, more frequent, and/or higher-quality updates and products in a measurable way?
That being said, everything is overvalued and a lot of this is ridiculous.
Extrapolation reasonably shows that they're approaching an asymptote: graph cost vs. improvement on a chart and you'll see that the two are not proportional.
- The energy efficiency and cost improvements of LLMs have plateaued as of late. https://arxiv.org/html/2507.11417v1
- The improvements from each subsequent model have also plateaued, with even some noticeable regressions
- The biggest players are so wildly unprofitable that they are already changing their plans, squeezing their current fanbase, and raising their rates
https://news.ycombinator.com/item?id=44598254
https://www.wheresyoured.at/anthropic-is-bleeding-out/
- And, as it turns out, experienced developers are 19% less productive using LLMs: https://www.theregister.com/2025/07/11/ai_code_tools_slow_do...
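The "asymptote" claim above can be made concrete with a quick back-of-the-envelope check. The numbers below are purely hypothetical, not drawn from any of the cited sources; the point is only what diminishing returns look like arithmetically: if each order of magnitude of extra cost buys a smaller score gain, cost and improvement are not proportional.

```python
import math

# Hypothetical (cost, benchmark score) points for illustration only:
# each 10x increase in cost buys a smaller absolute score gain.
points = [(1, 50.0), (10, 70.0), (100, 82.0), (1000, 88.0)]

def gain_per_cost_decade(points):
    """Score gained per order of magnitude of extra cost, per interval."""
    gains = []
    for (c0, s0), (c1, s1) in zip(points, points[1:]):
        decades = math.log10(c1 / c0)  # how many 10x steps in cost
        gains.append((s1 - s0) / decades)
    return gains

gains = gain_per_cost_decade(points)
print(gains)  # → [20.0, 12.0, 6.0], strictly shrinking
```

A strictly shrinking gain per cost decade is exactly the "not proportional" pattern: a linear extrapolation of the early points wildly overpredicts the later ones, while a saturating curve fits.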
> I.e the rate of improvement of these tools.
Their rate of improvement has stopped keeping pace with the rate at which their costs grow. It's simple mathematics: gains in efficiency don't match the increases in costs, the companies are extremely unprofitable, and all of that data points to a bubble.
It's one of the most obvious bubbles I have ever seen, propped up only by vibes, X posts, and Sama's promise that AGI is just around the corner, just inject a couple trillion more, trust me bro. All that for a fancy autocomplete.
And as someone who has seen the PC, Internet, and smartphone cycles, I will say the ChatGPT (or AI) adoption cycle is way faster than anything I have seen.
I agree completely from a developer point of view; as for it being a bubble, I'm not sure. It seems that a couple of companies are enticing people to integrate things into their systems so deeply that later they can name their price and dictate their terms, so that all technology falls in line with how they want things to play out.
We can see it already happening with VC investment not touching anything that doesn't have AI integrated.
Less innovation and creativity from smaller startups means less competition, which is great for business.
Great example, though: so many companies grew from the IaC movement. Those were startups, though, doing what Amazon probably didn't have the resources to do itself and also didn't need to.
This AI craze feels like it hits at a bit of a lower level; I feel awful for any junior or new devs entering the market.
Edit for clarity: HashiCorp's Terraform wouldn't be here without the need to make AWS infra easier.
A lot of AGI prediction is the fairly mundane business of extrapolating Moore's-law-like growth in computing and comparing it to the human brain. I don't think calling it mythology from high priests is a very accurate appraisal.