What's next, the great CPU shortage of 2026?
The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.
If it’s temporary I can live with it.
I guess this was inevitable with the absolute insane money being poured into AI.
Given this has been going on for years at this point (first high graphics card prices through crypto, now AI), it feels like this is the new normal, forever propped up by the next grift.
Sure, you have to isolate certain rogue states: North Korea, Russia, the USA.
Big tech will be deemed "too big to fail" and will get a bailout. The taxpayers will suffer.
That's how inflation works. In this case it seems narrower, though; there's hope the prices will go down, especially if the AI hype finds a reason to flounder.
Just like the price of labour. Your salary went up and doesn't come down.
In the UK weekly earnings increased 34% from December 2019 to December 2025.
CPI went up 30% in the same period.
Obviously that CPI covers things which went up more and things which went up less, and your personal inflation will be different to everyone else's. Petrol prices at the end of Jan 2020 were 128p a litre; at the end of Jan 2025 they are 132p a litre [0]. Indeed petrol prices were 132p in January 2013. If you drive 40,000 miles a year you will thus see far lower inflation than someone who doesn't drive.
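To make the "personal inflation" point concrete, here's a rough sketch of the weighted-basket idea; every weight and rate below is a made-up illustration value, not ONS data:

    # Personal inflation as a weighted average of price changes in the
    # categories you actually spend on. All numbers are illustrative.
    price_change = {"fuel": 0.03, "food": 0.35, "housing": 0.32}  # assumed 5-year changes

    basket_heavy_driver = {"fuel": 0.30, "food": 0.30, "housing": 0.40}
    basket_non_driver   = {"fuel": 0.00, "food": 0.45, "housing": 0.55}

    def personal_inflation(basket):
        return sum(weight * price_change[cat] for cat, weight in basket.items())

    print(f"heavy driver: {personal_inflation(basket_heavy_driver):.1%}")  # ~24%
    print(f"non-driver:   {personal_inflation(basket_non_driver):.1%}")    # ~33%

Same economy, two very different experienced inflation rates, purely because of the fuel weighting.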
Call me cynical if you like, but I don't share this optimism that rests on the banal idea that good always wins. That's simply not how it works; bad guys have won many times before. It's just that “dead men tell no tales”, and the winners control what you think is reality.
Same way the price of groceries going up means people buy only what they need and ditch the superfluous.
No. Prices will just go up, and there'll be less innovation in general.
Work is fucked. 23TB of RAM online. Microservices FTW. Not. Each node has OS overhead. Each pod has language VM overhead. And the architecture can only cost more over time. On top of that "storage is cheap so we won't bother to delete anything". Stupid mentality across the board.
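To put a rough number on that overhead, a back-of-envelope sketch; every figure here is an assumption for illustration, not a measurement of the system above:

    # Fixed memory overhead in a microservice fleet: per-node OS overhead
    # plus per-pod language-runtime overhead. All figures are assumptions.
    nodes = 200                # assumed node count
    pods_per_node = 20         # assumed pods per node
    os_overhead_gb = 2.0       # assumed OS/agent overhead per node
    runtime_overhead_gb = 0.3  # assumed runtime baseline per pod (e.g. a JVM)

    total_gb = nodes * os_overhead_gb + nodes * pods_per_node * runtime_overhead_gb
    print(f"{total_gb:.0f} GB (~{total_gb / 1024:.1f} TB) of RAM doing no application work")
    # 200*2 + 200*20*0.3 = 1600 GB, ~1.6 TB before any business logic runs

Scale the assumptions up or down and the shape stays the same: the overhead grows linearly with pod count, independent of useful work.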
You have 31/999 credits remaining. What activity would you like to productively participate in today?
Which goes into why I think this might be good. Developers have kind of treated disks as "oh well", with binaries ballooning in size even when it can easily be solved, and there is little care to make things lightweight. Just like I now figure out a different solution to recover space, I'm hoping that with a shortage this kind of thing will be more widespread, and we'll end up with smaller things until the shortage is over. "Necessity is the mother of all invention", or however it goes.
Then they came for the RAM, but I did not speak out, for I had already closed Firefox.
Then they came for the hard drives, but I did not speak out, for I had the cloud.
Then my NAS died, and there was no drive left to restore from backup.
On the other hand, the total storage capacity shipped each year has risen, as a combination of HDDs getting larger and larger and demand shifting from smaller consumer HDDs to larger data center, enterprise and NAS HDDs. I'm not sure how flexible those production lines are, but maybe the reaction will be shifting even more capacity to higher-capacity drives with cutting-edge technology.
That's when I sell off my current hardware and house, buy a cow and some land somewhere in the boondocks, and become a hermit.
"You will use AI, because that will be the only way you will have a relaxed life. You will pay for it, own nothing and be content. Nobody cares if you are happy or not."
You also have to look at the current status of the market. The level of investment in data centers spurred by AI is unlikely to last unless massive gains materialize. It's pretty clear some manufacturers are betting things will cool down and don't want to overcommit.
It is time to talk seriously about breaking up the hyperscalers. If we don't address the structural dominance of hyperscalers over the physical supply chain, "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.
Why is this allowed on HN?
1) The comment you replied to is 1 minute old; that is fast for any system to detect weird comments.
2) There's no easy and sure-fire way to detect LLM content. Here's Wikipedia's list of tells: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing
How do you know that? Genuine question.
The "isn't just .., but .." construction is so overused by LLMs.
In this case the “it’s not x, it’s y” pattern and its placement is a dead giveaway.
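As an illustration of how mechanical these tells are, here's a naive checker; the regexes are my own rough approximations and nowhere near a reliable detector:

    import re

    # Naive flags for the "it's not X, it's Y" / "isn't just X, but Y"
    # constructions discussed above. Illustrative only.
    TELLS = [
        re.compile(r"\bit'?s not \w+[^.,;]*, (?:it'?s|but)\b", re.IGNORECASE),
        re.compile(r"\bisn'?t just \w+[^.,;]*, but\b", re.IGNORECASE),
    ]

    def count_tells(text):
        return sum(len(p.findall(text)) for p in TELLS)

    print(count_tells("It's not a bug, it's a feature."))         # 1
    print(count_tells("This isn't just hype, but a land grab."))  # 1

Of course humans write these constructions too, which is exactly why nothing like this can be sure-fire.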
It's not ironic, but bitterly funny, if you ask me.
Note: I'm not an AI, I'm an actual human without a Claude account.
ChatGPT does this just as much, maybe even more, across every model they've ever released to the public.
How did both Claude and GPT end up with such a similar stylistic quirk?
I'd add that Kimi does it sometimes, but much less frequently. (Kimi, in general, is a better writer with a more neutral voice.) I don't have enough experience with Gemini or Deepseek to say.
LLMs do these things because they are in the training data, which means that people do these things too.
It is sometimes difficult to not sound like an LLM-written or LLM-reworded comment… I've been called a bot a few times despite never using LLMs for writing English⁴.
--------
[1] particularly vapid space-filler articles/comments or those using whataboutism style redirection, which might be a significant chunk of model training data because of how many of them are out there.
[2] I overuse footnotes as well, which is apparently a smell in the output of some generative tools.
[3] A lot of pre-LLM style-checking tools would recommend this in place of hyphens, and some automated reformatters would make the change without asking, so there are going to be many examples in training data.
[4] I think there is one at work in VS, which I use in DayJob: when it suggests code completion options to save typing (literally Glorified Predictive Text) I sometimes accept its suggestion, and some of the tools I use to check my Spanish⁵ may be LLM-based, so I can't claim that I don't use them at all.
[5] I'm just learning, so automatic translators are useful to check that what I've written isn't gibberish. For anyone else doing the same: make sure you research any suggested changes, preferably using pre-2023 sources, because the output of these tools can be quite wrong, as you can see when translating into a language you are fluent in.
This is the game plan, of course: why have customers pay once for hardware when they can have you constantly feed them money over the long term? Shareholders want this model.
It started with planned obsolescence; this new model is the natural progression. There is no obsolescence even in discussion when your only option is to rent a service that the provider has no incentive to even make competitive.
I really feel this will be China's moment to flood the market with hardware and improve their quality over time.
Yep. My take is that, ironically, it's going to be because of government funding the circular tech economy, pushing consumers out of the tech space.
It's no coincidence that Microsoft decided to take such a massive stake in OpenAI. Leveraging the opportunity to open a new front for vendor lock-in, force-multiplying their own market share by inserting it into everything they provide, is an obvious choice. But leveraging the insane amount of capital being thrown into the cesspit that is AI to make consumer hardware unaffordable (and eventually unusable, due to remote attestation schemes) further entrenches their position. The end goal: OEM computers that meet the hardware requirements of their locked OS and software suite being the only computers that are a) affordable and b) "trusted".
I don't want to throw around buzzwords or be doomeristic, but this is digital corporatism in its endgame. Playing markets to price out every consumer globally for essential hardware is evil and something that a just world would punish relentlessly and swiftly, yet there aren't even crickets. This is happening unopposed.
It's so hard to grasp as a problem for the lay person until it's too late.
China is now the only solution to fix broken western controlled markets.
There is appetite in some circles for a consumer boycott but not much coordination on targets.
And for the same reason: to avoid the dominant players going "oh shiny" on short-term lucrative adventures, or outright trying to manipulate the market, causing people to starve and making society grind to a halt.
> They largely come from hyperscalers who want hard drives for their AI data centers, for example to store training data on them.
What type of training data? LLMs need relatively little of that. For example, DeepSeek-V3 [1], still a relatively large model:
> We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens
At 2 bytes per token, that's 29.6 terabytes. That's basically nothing compared to the amount of 4K content that is uploaded to YouTube every day.
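For reference, the arithmetic, using the 2-bytes-per-token figure assumed above (real tokenizers often average more bytes of raw text per token, so treat it as a ballpark):

    # Storage for the DeepSeek-V3 pre-training corpus at an assumed
    # 2 bytes per token.
    tokens = 14.8e12      # 14.8 trillion tokens, from the paper quote above
    bytes_per_token = 2   # assumption from the comment above

    terabytes = tokens * bytes_per_token / 1e12
    print(f"{terabytes:.1f} TB")  # 29.6 TB: a handful of hard drives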
This is assuming most of what gets stored is either images or video.
Also, the Return of PKZIP.