What's next, the great CPU shortage of 2026?
The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.
If it’s temporary I can live with it.
I guess this was inevitable with the absolutely insane money being poured into AI.
Given this has been going on for years at this point (high graphics card prices through crypto, and now AI), it feels like this is the new normal, forever propped up by the next grift.
No. Prices will just go up, and there will be less innovation in general.
Work is fucked. 23TB of RAM online. Microservices FTW. Not. Each node has OS overhead. Each pod has language VM overhead. And the architecture can only cost more over time. On top of that "storage is cheap so we won't bother to delete anything". Stupid mentality across the board.
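To put rough numbers on how that overhead stacks, here's a quick sketch; every figure is a made-up placeholder, not a measurement from my setup:

    # Sketch of how per-node and per-pod overhead stacks up.
    # All figures are hypothetical placeholders, not measurements.
    nodes = 200
    pods_per_node = 30
    os_overhead_gb = 2.0    # assumed OS + kubelet footprint per node
    vm_overhead_gb = 0.5    # assumed runtime (JVM/CLR/...) footprint per pod

    overhead_tb = (nodes * os_overhead_gb
                   + nodes * pods_per_node * vm_overhead_gb) / 1024
    print(f"Pure overhead: ~{overhead_tb:.1f} TB of RAM")  # ~3.3 TB

That's terabytes of RAM doing nothing but keeping the plumbing alive, before a single byte of actual application state.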
You have 31/999 credits remaining. What activity would you like to productively participate in today?
Which goes into why I think this might be good. Developers have kind of treated disks as "oh well", with binaries ballooning in size even when it could easily be solved, and there is little care to make things lightweight. Just as I'm now figuring out a different solution to recover space, I'm hoping that with a shortage this kind of thing will be more widespread, and we'll end up with smaller things until the shortage is over. "Necessity is the mother of invention", or however it goes.
Then they came for the RAM, but I did not speak out, for I had already closed Firefox.
Then they came for the hard drives, but I did not speak out, for I had the cloud.
Then my NAS died, and there was no drive left to restore from backup.
That's when I sell off my current hardware and house, buy a cow and some land somewhere in the boondocks, and become a hermit.
"You will use AI, because that will be the only way you will have a relaxed life. You will pay for it, own nothing and be content. Nobody cares if you are happy or not."
You also have to look at the current state of the market. The level of investment in data centers spurred by AI is unlikely to last unless massive gains materialize. It's pretty clear some manufacturers are betting things will cool down and don't want to overcommit.
It is time to talk seriously about breaking up the hyperscalers. If we don't address the structural dominance of hyperscalers over the physical supply chain, "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.
Why is this allowed on HN?
1) The comment you replied to is 1 minute old; that's faster than any system can detect weird comments.
2) There's no easy and sure-fire way to detect LLM content. Here's Wikipedia's list of tells: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing
How do you know that? Genuine question.
The "isn't just .., but .." construction is so overused by LLMs.
In this case, the “it’s not x, it’s y” pattern and its placement are a dead giveaway.
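For fun, here's a toy sketch of matching that pattern. It only underlines the point above: a regex like this catches a tell, not a verdict, and the pattern and samples here are mine, not from any real detector:

    import re

    # Toy matcher for the "it's not X, it's/but Y" contrast pattern.
    # Illustrative only; real LLM detection is nowhere near this simple.
    PATTERN = re.compile(
        r"\b(?:it'?s|this is) not (?:just )?\w+[^.;,]*, (?:it'?s|but)\b",
        re.IGNORECASE,
    )

    samples = [
        "It's not just a shortage, it's a structural shift.",
        "This is not ironic, but bitterly funny.",
        "Prices will just go up.",
    ]
    for s in samples:
        print(bool(PATTERN.search(s)), "-", s)  # True, True, False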
It's not ironic, but bitterly funny, if you ask me.
Note: I'm not an AI, I'm an actual human without a Claude account.
ChatGPT does this just as much, maybe even more, across every model they've ever released to the public.
How did both Claude and GPT end up with such a similar stylistic quirk?
I'd add that Kimi does it sometimes, but much less frequently. (Kimi, in general, is a better writer with a more neutral voice.) I don't have enough experience with Gemini or Deepseek to say.
This is the game plan, of course: why have customers pay once for hardware when they can have you constantly feed them money over the long term? Shareholders want this model.
It started with planned obsolescence; this new model is the natural progression. Obsolescence isn't even part of the discussion when your only option is to rent a service that the provider has no incentive to make competitive.
I really feel this will be China's moment to flood the market with hardware and improve their quality over time.
Yep. My take is that, ironically, it's going to be because of governments funding the circular tech economy, pushing consumers out of the tech space.
It's no coincidence that Microsoft decided to take such a massive stake in OpenAI. Leveraging the opportunity to open a new front for vendor lock-in, force-multiplying their own market share by inserting it into everything they provide, is an obvious choice. But leveraging the insane amount of capital being thrown into the cesspit that is AI to make consumer hardware unaffordable (and eventually unusable, due to remote attestation schemes) further entrenches their position. The end goal is for OEM computers that meet the hardware requirements of their locked OS and software suite to be the only computers that are a) affordable and b) "trusted".
I don't want to throw around buzzwords or be doomeristic, but this is digital corporatism in its endgame. Playing markets to price every consumer globally out of essential hardware is evil, and something a just world would punish relentlessly and swiftly, yet there's no pushback at all, not even crickets. This is happening unopposed.
China is now the only solution to fix broken western controlled markets.
There is appetite in some circles for a consumer boycott but not much coordination on targets.
And for the same reason: to keep the dominant players from going "oh, shiny" on short-term lucrative adventures, or outright trying to manipulate the market, and in doing so causing people to starve and society to grind to a halt.
> They largely come from hyperscalers who want hard drives for their AI data centers, for example to store training data on them.
What type of training data? LLMs need relatively little of that. For example, DeepSeek-V3 [1], still a relatively large model:
> We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens
At 2 bytes per token, that's 29.6 terabytes. That's basically nothing compared to the amount of 4K content that is uploaded to YouTube every day.
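Back-of-the-envelope (the token count is from the DeepSeek-V3 report; 2 bytes/token is my rough assumption for a typical tokenizer on mixed text):

    # Pre-training corpus size from the DeepSeek-V3 figure above.
    tokens = 14.8e12          # 14.8 trillion tokens (from the paper)
    bytes_per_token = 2       # assumption; varies by tokenizer and data
    corpus_tb = tokens * bytes_per_token / 1e12
    print(f"~{corpus_tb:.1f} TB")  # ~29.6 TB, i.e. a handful of hard drives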
Also, the Return of PKZIP.