You can go 16GB if you go native and throw some assembly in the mix. Use old school scripting languages. Debloat browsers.
It has been long delayed.
But it doesn't matter either way, because both 16 and 32GB have what, doubled, tripled? It's nuts. Even if you say "just buy less memory", now is a horrible time to be building a system.
[0]: For me this is really an important part of working with Claude: the model improves over time but stays consistent. Its "personality", or whatever you want to call it, has been remarkably stable across past versions, which allows a very smooth transition from version N to N+1.
One possibility I can think of is a push toward training exclusively on search-based rewards, so that the model isn't required to compress a large proportion of the internet into its weights. But that approach is likely to be much slower and to come with initial performance costs that frontier model developers won't want to incur.
DRAM manufacturers got burned multiple times in the past scaling up production during a price bubble, and it appears they've learned their lesson (to the detriment of the rest of us).
With a bit of luck OpenAI collapses under its own weight sooner rather than later; otherwise we're screwed for several years.
More rot economy. Customers are such a drag. Let's just sell to other companies in billion-dollar deals instead. These AI companies have bottomless wallets. No one has thought of this before; we will totally get rich.
Now you can't even fit a browser doing nothing into that memory...
I'll do the engineering so we're good on that front. Just need investors.
Of course, it takes quite some time for a fab to go from an idea to mass production. Even in China. Expect prices to drop 2-3 years from now when all the new capacity comes online?
I don't think there is a conspiracy or price fixing going on here. Demand for high-margin memory is insatiable (at least until 2027, maybe beyond), and by the time extra capacity comes online and the memory crunch eases, the minor memory players will have captured such a large share of the legacy/consumer market that it makes little sense for the big three to get involved anymore.
Add to that the scars from overbuilding capacity during previous memory super cycles and you end up with this perfect storm.
This https://cdna.pcpartpicker.com/static/forever/images/trends/2... will happen to every class of thing (especially once it hits energy, because everything is downstream of that).
Unbounded increases in complexity lead to diminishing returns on energy investment and increased system fragility. Both raise the likelihood of collapse: solutions to old problems generate new problems faster than new solutions can be created, because the energy that should go toward new solutions is consumed maintaining the layers of complexity left behind by all the previous solutions.
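A toy model makes the shape of that argument concrete (purely illustrative; the numbers below are made up, not derived from anything):

    # Toy model of the argument above. Each new layer of complexity yields
    # less gross benefit (harmonic decay) but adds the same fixed upkeep cost.
    def net_return(layers: int, upkeep: float = 1.0) -> float:
        gross = sum(10 / n for n in range(1, layers + 1))  # diminishing returns
        return gross - upkeep * layers                     # minus maintenance

    for n in (1, 5, 10, 20, 40):
        print(n, round(net_return(n), 1))
    # Net return peaks (around 10 layers here) then declines as maintenance
    # outruns new gains, which is the collapse dynamic described above.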
Not trying to argue, just curious.
If the argument is that prices will skyrocket simply because of long-term AI demand, I think that ignores the fact that manufacturing vastly more product tends to stabilize prices, at least until raw materials become significantly more expensive, and doing so is strongly incentivized for IC manufacturers over a ~10-year timeframe.
The value of AGI/ASI is not defined only by its practical use; it is also bounded by the purchasing power of potential consumers.
If humans aren’t worth paying, those humans won’t be paying anyone either. No business can function without customers, no matter how good the product.
Wonder what would happen if it really takes a dive. The impact on the SF tech scene will be brutal. Maybe I'll go escape on a sailboat for 3 years or something.
Anyway, tangential, but something I think about occasionally.
I've been a huge advocate for local, open, generative AI as the best resistance to massive take-over by large corporations controlling all of this content creation. But even as it is (or "was" I should say), running decent models at home is prohibitively expensive for most people.
Micron has already decided to just eliminate the Crucial brand (as mentioned in the post). It feels like if this continues, once our nice home PCs start to break, we won't be able to repair them.
The extreme version of this is that even dumb terminals (which still require some ram) will be as expensive as laptops today. In this world, our entire computing experience is connecting a dumb terminal to a ChatGPT interface where the only way we can interact with anything is through "agents" and prompts.
In this world, OpenAI is not overvalued, and there is no bubble because the large LLM companies become computing.
But again, I think this is mostly dystopian sci-fi... it just sits a bit too close to the realm of the possible for my tastes.
It's not a lot, but it's enough for a dumb terminal.
I remember when the crypto miners rented a plane to deliver their precious GPUs.
1 person has all the money and all the power and everyone else is bankrupt forever and sad.
If Crucial screws up by closing their consumer business, they won't feel any pain from it, because new competitors entering the space is basically impossible.
RAM being plentiful and cheap led to a lot of RAM-unaware software development, letting programs' inefficiencies stay mostly hidden from the user. If RAM prices continue rising, the semi-apocalyptic consumer fiction you've spun here would require that developers not change their behavior at all. There will be an equilibrium in the market that still allows entry-level consumer PCs; the devices people buy will just have less RAM than is typical today. Demand will eventually match the change in supply, as is typical of supply/demand issues, rather than rising continuously toward an infinite horizon.
https://news.ycombinator.com/item?id=46142100#46143535
Had Samsung known SK Hynix was about to commit a similar chunk of supply, or vice versa, the pricing and terms would likely have been different. It's entirely conceivable neither would have agreed to commit such a substantial part of global supply had they known more. But at the end of the day, OpenAI succeeded in keeping the circles tight, locking down the NDAs, and exploiting the fact that each company assumed the other wasn't giving up that much wafer volume simultaneously, in order to make a surgical strike on the global RAM supply chain.
What's the economic value per warehoused and insured cubic inch of 900,000 memory wafers? Grok's response:
> As of late 2025, 900,000 finished 300mm 3D NAND memory wafers (typical high-volume inventory for a major memory maker) are worth roughly $9 billion and occupy about 104–105 million cubic inches when properly warehoused in FOUPs. → Economic value ≈ $85–90 per warehoused cubic inch.
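For what it's worth, the division in that answer checks out; a one-liner sanity check using Grok's own (unverified) figures:

    # Grok's claimed figures, not verified:
    total_value_usd = 9e9        # ~$9B for 900,000 wafers
    total_volume_in3 = 104.5e6   # ~104-105 million cubic inches in FOUPs
    print(total_value_usd / total_volume_in3)  # ~86.1 USD per cubic inch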
Blaming OpenAI for this like it's some sort of targeted, evil, hostile act is like blaming China for industrial pollution: yeah, you outsourced all your industrial production to them.
So it's the Bitcoin craze all over again. Sigh. The bubble will eventually collapse, it has to - but the markets can stay irrational longer than you can stay solvent... or, to use a more appropriate comparison, have a working computer.
As for me? I hope once this bubble collapses, we see actual punishments again: too-large-to-fail companies broken up, people prosecuted for the wash trading masquerading as "legitimate investment" throughout this bubble (which looks more like the family tree of the infamously incestuous Habsburgs), greedy executives jailed or, at least where national security is impacted by chip shortages, permanently removed. I'm sick and tired of large companies getting away with gobbling up everything and killing off the economy at large. They are not just parasites; they are a cancer, killing its host society.
If RTX 5000 series prices had topped out at historical levels, no one would need hosted AI.
Then it came to be that models were on a path to run well enough loaded into RAM... uh oh
This is in line with ISPs banning the running of personal services long ago, and the long-held desire to sell dumb thin clients that must work with a central service.
Web developers fell hook, line, and sinker for the confidence games of the old guard. Nothing but the insane ego and vanity of some tech oligarchs is driving this. They cannot appear weak. Vain aura farming, projection of strength.
You don't need a new PC. Just use the old one.
At the time I was messing around with the "badram" patch for Linux.
Always good advice.
But what will happen when people are priced out from the circus?
I wouldn't ascribe that much intent. More simply, datacenter builders have bought up the entire supply (and likely future production for some time), hence the supply shortfall.
This is a very simple supply-and-demand situation, nothing nefarious about it.
Spoiler, but the answer is basically that old hardware rules the day because it lasts longer and is more reliable over timespans of decades.
DDR5 32GB is currently going for ~$330 on Amazon
DDR4 32GB is currently going for ~$130 on Amazon
DDR3 32GB is currently going for ~$50 on Amazon (4x8GB)
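Normalized to $/GB (a quick sketch from the prices quoted above, which will obviously drift):

    # $/GB from the quoted Amazon spot prices
    prices_per_32gb = {"DDR5": 330, "DDR4": 130, "DDR3": 50}  # USD per 32GB
    for gen, usd in prices_per_32gb.items():
        print(f"{gen}: ${usd / 32:.2f}/GB")
    # DDR5: $10.31/GB, DDR4: $4.06/GB, DDR3: $1.56/GB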
For anyone for whom cost is a concern, using older hardware seems like a particularly easy choice, especially if you're comfortable in a Linux environment, since the massive droves of recently retired hardware that can't upgrade beyond Windows 10 work great with your Linux distro of choice.
You can see the cost rise of DDR4 here.
Just like SSDs from 2010 endure ~100,000 writes per cell instead of under 10,000.
CPUs might even follow the same durability pattern but that remains to be seen.
Keep your old machines alive and backed up!
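Rough endurance arithmetic under those assumptions; the cycle counts below are ballpark generational figures I've picked, not datasheet values, and larger modern capacities partly offset the lower per-cell endurance:

    # Illustrative only: ballpark P/E cycle counts, write amplification ignored.
    def endurance_tb_written(capacity_gb: float, pe_cycles: int) -> float:
        return capacity_gb * pe_cycles / 1000  # total TB writable before wear-out

    print(endurance_tb_written(64, 100_000))  # 2010-era 64GB drive: 6400.0 TB
    print(endurance_tb_written(1000, 5_000))  # modern 1TB TLC drive: 5000.0 TB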
If you have a potentially multi-billion dollar contract, most businesses will do things outside of their standard product offerings to take in that revenue.
I look at MS Teams currently using 1.5GB of RAM doing nothing.
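If you want to check a number like that yourself, here's a minimal sketch using psutil. Teams spawns several processes, so you have to sum resident memory across all of them; the name match is an assumption, adjust it for your platform:

    # Requires psutil (pip install psutil).
    import psutil

    rss = sum(
        p.info["memory_info"].rss
        for p in psutil.process_iter(["name", "memory_info"])
        if p.info["name"] and "teams" in p.info["name"].lower()
        and p.info["memory_info"] is not None
    )
    print(f"{rss / 2**30:.2f} GiB resident across matching processes")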
If not, what would these AI companies do with the huge supply of hardware they're going to want to get rid of? I think a secondary market is sure to appear.
Eric_WVGG•21m ago
It’s been pointed out by others that price is part of Apple's marketing strategy. You can see that in the trash can Mac Pro, which logically should have gotten cheaper over the ridiculous six years it was on sale with near-unchanged specs. But the marketing message was, "we're selling a $3000 computer."
Those fat margins leave them with a nice buffer. Competing products will get more expensive; Apple's will sit still and look even better by comparison.
We are fortunate that Apple picked last year to make 16GB the new floor, though! And I don't think we're going to see base SSDs get any more generous for a very, very long time.
* Okay, I do remember that MacBook Airs could be had for $999 for a few years; that disappeared for a while, then came back.
Night_Thastus•42m ago
They don't care. They'll pass the cost on to the consumers and not give it a second thought.
throw0101d•23m ago
Perhaps I don't understand something so clarification would be helpful:
I was under the impression that Apple's RAM was on-die, and so baked in during chip manufacturing and not a 'stand alone' SKU that is grafted onto the die. So Apple does not go out to purchase third-party product, but rather self-makes it (via ASML) when the rest of the chip is made (CPU, GPU, I/O controller, etc).
Is this not the case?
jsheard•21m ago
https://upload.wikimedia.org/wikipedia/commons/d/df/Mac_Mini...
That square is the whole M1 package, Apple's custom die is under the heatspreader on the left, and the two blocks on the right are LPDDR packages stacked on top of the main package.
https://wccftech.com/apple-m2-ultra-soc-delidded-package-siz...
Scaled up, the M2 Ultra is the same deal, just with two compute dies and 8 separate memory packages.