That said... hopefully, at least on the Android side, you can get a free (as in unshackled) OS to run on it.
Until they come for the HW.
Is that likely? History says it's inevitable, but timeframe is an open question.
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
The capital from the Gulf is already causing the disruption. It's no longer a matter of if or when.
You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because it is the only practical way to run whatever AI-driven software exists three years from now.
Liberty goes beyond that.
https://www.reddit.com/r/LocalLLaMA/comments/1s0czc4/round_2...
uBlock Origin has prevented the following page from loading:
https://xn--gckvb8fzb.com/hold-on-to-your-hardware/
This happened because of the following filter:
||xn--$document
The filter has been found in: IDN Homograph Attack Protection - Complete Blockage
However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high-end desktop - 768 GB of RAM, 96 cores, a 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly, I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I've made in the last five years?
Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.
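(On the ramdisk question: on most Linux systems /dev/shm is already a RAM-backed tmpfs, so no setup is needed at all. A minimal sketch - the commented-out project path is hypothetical, substitute your own working set:)

```python
import os
import tempfile

# /dev/shm is a tmpfs (RAM-backed) mount on most Linux systems - no root
# needed. Files staged there live in RAM and vanish on reboot.
# Fall back to the regular temp dir on systems without it.
ram_root = "/dev/shm" if os.path.isdir("/dev/shm") else tempfile.gettempdir()
scratch = os.path.join(ram_root, "scratch")
os.makedirs(scratch, exist_ok=True)
print(scratch)
# shutil.copytree(os.path.expanduser("~/projects/hot-repo"), scratch)  # hypothetical working set
```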
The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.
That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.
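For a rough sense of what fits: weight memory is approximately parameter count times bytes per parameter. This sketch ignores KV cache, activations, and runtime overhead, and the 405B model size is purely illustrative:

```python
def weight_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough weight-memory estimate in GB: parameters x bytes per parameter.
    Ignores KV cache, activations, and runtime overhead."""
    return params_billions * bits_per_weight / 8

# Illustrative 405B-parameter model:
print(weight_ram_gb(405, 4))   # 4-bit quant: 202.5 GB - fits in 768 GB
print(weight_ram_gb(405, 16))  # fp16: 810.0 GB - does not fit
```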
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
It wasn't my primary motivator but it hasn't made me regret my decision.
I hummed and hawed on it for a good few months myself.
Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.
You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.
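Back of the envelope on the old price point (assuming the $1,500 midpoint of that range and 1 TB = 1024 GB):

```python
def dollars_per_gb(price_usd: float, capacity_gb: float) -> float:
    """Unit price of memory in USD per GB."""
    return price_usd / capacity_gb

# Second-hand 1 TB DDR4 server at the pre-spike price:
print(round(dollars_per_gb(1500, 1024), 2))  # ~1.46 USD/GB
```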
Apple hardware is incredibly overpriced.
I don't know your workloads, but for me personally 64 GB is the ceiling on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use-cases, and I'm always going to shell out for the API/frontier capabilities. I'm even considering the 48 GB config, because those are already on 8% discounts/shipped by Amazon, and I never hit that even on my workstation with 64 GB.
See a $1100 GPU on eBay, but it’s in the US? Actually a $1900 GPU.
A colleague and I were just talking about how well he timed the purchase of his $700 24 GB 3090.
My phone has 16 GB of RAM and a terabyte of storage; laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
Running Electron apps and browsing React-based websites, of course.
Open source efforts need to give up on local AI and embrace cloud compute.
We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.
When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.
If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.
If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.
Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.
An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.
That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.
People who are willing to drop $20k on a computer might not be affected much tho.
I don't share the article's opinion 1:1, but it is absolutely clear that RAM prices have gone up enormously. Just compare them. That is a fact.
It may be cheaper later on, but... when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something because prices are now much higher than before. Add to this the oil crisis Trump started, and we are suddenly having to pay more just so a few mafiosi can benefit. (See Krugman's analysis of the recent stock market flow of money/stocks.)
Does all this not apply to businesses buying computers for their employees?
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).
For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.
Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.
It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.
There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.
Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
Whatever happens, it's crazy, and I hope the AI madness is worth it.
For example, my current ThinkPad T14 Gen 5 was bought with 8 GB RAM and a 256 GB NVMe drive, then upgraded to 64 GB RAM and a 2 TB NVMe for the same price the 16 GB/512 GB configuration would have cost at Lenovo. And I still have the 8 GB/256 GB parts to re-use/re-sell.
Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.
If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.
Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intending to fill your browser history with inappropriate content - which didn't work on my browser because I opened the blog in a private session. The operator clearly doesn't run his blog in good faith.
You are also completely speculating on the intent. Less drama please.
I thought it was clever. But it also seems ham-fisted, and in poor taste.
I opened the tab on my work laptop, and having an NSFW title and icon in the office is unacceptable. I understand the intent, but the implementation and this way of forcing people to do something is ridiculous. I do not own or control this machine, and I trust the links on the front page of HN to be somewhat safe and not put me in an uncomfortable position. Yes, the site is not necessarily malware, but it is a dark pattern, and that's not how you teach your average day-to-day user.
Those who are best able to use a resource are willing to pay the most for it thus pricing out unproductive usages of it.
This is pure Capitalism.
If one is in general against Capitalism, yes, one can complain.
But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.
Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant, he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from that than you do. The markets will work. If this is unproductive use of Capital, OpenAI will go bankrupt.
Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.
They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.
Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.
At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of the value of speech, not strictly in terms of the value of lunch, is important to call out.
I'm glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized cost of roughly one nice steak dinner per year.