You can subscribe to our GeForce NOW service to rent a top-of-the-line card through our cloud service for the low, low price of 11 €/$/£ or 22 €/$/£ a month, with *almost no restrictions.
*Except for all the restrictions.
To put it in perspective: if you game via the cloud, a fast internet connection plus an office PC or laptop is enough. So in a fair comparison you save far more than just the cost of the GPU.
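Quick back-of-the-envelope sketch of that comparison (every price below is an assumed placeholder, not a real quote):

```python
# Break-even: renting a GPU via GeForce NOW vs buying into a gaming PC.
# All figures are assumptions for illustration only.
sub_per_month = 22        # top streaming tier, EUR/month (assumed)
gpu_price = 650           # a midrange-to-high current card (assumed)
gaming_pc_premium = 450   # extra cost of a gaming PC over an office PC (assumed)

months_to_break_even = (gpu_price + gaming_pc_premium) / sub_per_month
print(f"Buying pays off only after ~{months_to_break_even:.0f} months")  # ~50 months
```

Under those assumptions, buying only pays for itself after roughly four years of subscription fees.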
This is why I play consoles only. I can play games for years without ever changing HW and save tons of money compared to my PC gaming times.
The hardcore and frequent gamers won’t like it but it was never really for them.
And the competition in the GPU market is soft, to say the least.
Damn nvidia
Maybe consumer electronics will move backwards by a process node or two?
I wonder if a bunch of consumer electronics will move back to something like 12 nm for a while? Seems like there's a lot of capacity in that range. Zen 2 wasn't so bad, right?
During this time AMD was focused on CPUs. They've already said that they'll focus more on GPUs now (since their CPUs are way ahead and AI is a thing), so this should change things.
RAM is 4-5x the price it was a year ago.
Is AI going to kill the consumer computer industry?
Or, assuming the trend holds over the longer term, it could mean that consumers move downstream of datacenters: anyone who wants a GPU ends up rocking three-to-five-year-old recycled enterprise gear.
Even so, the death of AAA gaming is nothing I will cry about. Most games don't require anything remotely as performant as a 5070.
Just saying that your grudge against AAA games has a blast radius you might not be aware of.
This is a false statement. They're still producing consumer cards. You can go buy a 5070 FE in stock on their web store at MSRP right now. You can buy a discounted 5060 from Best Buy below MSRP.
They’re changing production priorities for a little while if the rumors are accurate.
RAM prices have always been cyclical and prone to highs and lows. This is an especially high peak but it will pass like everything else.
These predictions that the sky is falling are way too dramatic.
My only small regret is that I decided to build an SFF PC, otherwise I would've gone for 128 GB of RAM instead of just 64. Oh well, ~~640 KB~~ 64 GB should be enough for most purposes.
I don't necessarily think everything is going doomer "subscription-based cloud streaming": the economics of those services never made sense, especially for gaming, and there's little reason to believe the same incentives driving Nvidia, Crucial, etc. out of the consumer hardware business wouldn't also hit that business.
Instead, the future is tightly integrated single-board computers (e.g. Framework Desktop, the new HP keyboard, Mac Mini, RPi, etc). They're easier for consumers to buy. Integrated memory, GPU, and cooling means we can drive higher performance. Having all of the components sourced by one supplier makes the whole "X is leaving the consumer market" point moot, and allows better bulk deals to be negotiated. They're smaller. It lets one company (e.g. Framework) capture more margin, rather than sharing it with ten GPU or memory middlemen who just slap a sports-car-looking cooler on whatever they bought from Micron and call themselves a real business.
My lingering hope is that we do see some company succeed who can direct-sell these high-end SBCs to consumers, so if you want to go the route of a custom case and such, you still can. And that we don't lose modular storage. But I've lost all hope that DIY PCs will survive this decade; to be frank, they haven't made economic sense for a while.
I don't think that checks out. The fabs are booked out, AFAIU. This is going to hit SoCs (and anything else you can come up with) sooner rather than later, because it all depends on the same fabs producing the same silicon at the end of the day. It's just packaged differently.
They left the consumer market due to the price difference. It's not that there aren't middlemen willing to purchase in bulk right now. It's that the OEMs aren't willing to sell at any price because they've already sold their entire future inventory at absurd prices for the next however many months or years.
I assume there will still be at least a few SoCs to choose from but the prices will likely be completely absurd because they will have to match the enterprise price for the components that go into them.
The sub-argument to this is that graphics cards will drive up fab prices for other packaged silicon products. This has probably been true for the past two years, but it's very likely we'll see that change in 2026. Even if theoretical demand stays high (which is debatable, but not for today): every major AI lab is sitting on warehouses of gigawatts' worth of ready silicon with nowhere to power it, so real demand will drop as the bottlenecks of data center construction and power delivery are solved. Those problems will take another few years. Even if these companies have the money and want to spend it: it makes zero sense to buy cards today and have them sit in a warehouse for two years when you can sit on the cash and buy newer-generation cards in a year.
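As a toy model of that last trade-off (every number below is a made-up assumption, not a sourced figure):

```python
# Toy model: buy accelerators now and warehouse them, or wait a year
# for the next generation. All inputs are assumed placeholders.
lifetime_years = 4          # useful lifetime of a card once powered (assumed)
idle_if_bought_now = 2      # years in a warehouse waiting for power (assumed)
idle_if_delayed = 1         # buy next year, closer to rack-ready (assumed)
perf_gain_next_gen = 1.5    # perf-per-dollar gain of the next generation (assumed)

useful_now = lifetime_years - idle_if_bought_now
useful_later = (lifetime_years - idle_if_delayed) * perf_gain_next_gen
print(f"Waiting yields ~{useful_later / useful_now:.2f}x the useful compute per dollar")
# ~2.25x under these assumptions
```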
I would bet very real money that, sometime in 2026, we will see Nvidia reduce or cancel a committed fab order from TSMC.
There's a real difference in the weight that Nvidia brings to the table versus other customers. Not just the difference in capital, but the demand and diversity at play. There is Nvidia hardware in crop dusters and cruise missiles, datacenters and deep-sea SONAR. Apple kicked out their partners and doesn't even consider paying market price when TSMC asks them to. They have the money to buy iPhone silicon at cost, but the iPhone doesn't make enough money to compete with Raytheon's margins. Apple is smaller than you think, and all it takes is a team player to prove it.
The performance side still won't add up in favor of SoCs anyway. Distributed machines with high-speed interconnects run circles around the fastest Macs an equivalent budget can buy. Benchmarks all show Nvidia being more power-efficient at raster and compute despite having the more complicated architecture. Nobody is ripping up their dGPU racks to install an SoC cluster right now, and datacenters aren't putting their Nvidia cards on eBay to buy more Ryzen blades. The opposite is happening, really: SoCs are being fast-tracked into obsolescence to better handle heterogeneous workloads that homogeneous SoCs can't do efficiently.
Your initial claim ("they haven't made economic sense for a while") baffles me. SoCs would be cleaning up shop in the HPC niche if they were any good at it.
GP suggests that AI hardware acquisition has outstripped the capacity to rack and power said hardware for the foreseeable future. If that's true we would expect orders to start getting postponed or even cancelled soon.
It's an interesting hypothesis but I'm not sure I believe it.
It’s true that Nvidia has a more diverse set of end-customers than Apple’s silicon. But the customers you’re describing were the ones they had in 2020, when their revenue was $3B, versus the $57B they do today. The vast, vast majority of that revenue growth has come from fewer than ten customers; the usual suspects. Revenue growth is a proxy for their contracts with TSMC. If hyperscaler revenue growth takes a hit, the scale of that hit would outpace their ability to just shuffle fulfillment around (though I have no doubt they will try).
You’re also misunderstanding my initial claim, which might explain why you’re baffled. I did not claim that SBCs make better economic sense than modular computers. I claimed that DIY computers rarely make economic sense over non-DIY computers today. Modular computers can be non-DIY; a quarter of every Best Buy is full of them, and it’s what most businesses in the HPC, CAD, etc. spaces would buy their employees: a prebuilt from Dell. Dell’s (and peers’) relationships with suppliers grant them the same power laws that, as I explained, fuel the SBC segment’s growth; power laws that middlemen in the DIY space, like Corsair, don’t benefit from as strongly and easily overwhelm with their own markup for sports-car cooling fins and RGB.
So, if Nvidia ends up with extra fab space amid a RAM shortage, I think they'll pivot to RTX chips and Jetson boards that have already proven to retain their value. Nvidia can retain capacity and justify the cost because there are still buyers lined up out the door. The iPhone and iPad have no actual use for wafers this dense and Apple knows it; it would be idiotic to start a bidding war over 2nm capacity when your flagship product is a gambling tablet.
> I claimed that DIY computers rarely make economic sense over non-DIY computers today
Like the other commenters said - hasn't that always been the case? You could have told Linus Torvalds this in the 1990s and, strictly speaking, you'd be correct. That sentiment has had zero salient impact on the health of custom computers or the longevity of DIY machines built to task. China still ships Noctua fans and PC cases between flip-flops and sex toys because we will pay $100 for some plastic with ball bearings inside it.
DIY computers have never made sense, and it has never been a serious threat to their existence. Not today, and not 30 years ago either.
> Instead, the future is tightly integrated single-board computers
Well, all of that is true, but all of that has always been true, right?
AI is simultaneously a bubble and here to stay (a bit like the "Web 1.0" bubble, IMO).
Also, importantly, consumer GPUs are still an important on-ramp for developers getting into nVidia's ecosystem via CUDA. Software is their real moat.
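To make that on-ramp concrete, here's a minimal sketch of the kind of first kernel a hobbyist writes on a consumer GeForce card (assuming Python with numba installed and any CUDA-capable GPU; the example itself is mine, not from the thread):

```python
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)      # absolute index of this thread across the whole grid
    if i < out.size:      # guard: the grid may be larger than the array
        out[i] = a[i] + b[i]

n = 1 << 20
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block  # ceil-divide to cover n
vector_add[blocks, threads_per_block](a, b, out)  # numba copies the arrays to the device

assert np.allclose(out, a + b)
```

Trivial, but it runs the same on a 5060 as on a datacenter card, which is exactly why cheap consumer hardware feeds the CUDA developer pipeline.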
There are other ways to provide that on-ramp, and nVidia would rather rent you the hardware than sell it to you anyway, but.... I dunno. Part of me says the rumors are true, part of me says the rumors are not true...
So while the news is not great, I think it is far from any doom and gloom if we are in fact going to be getting more 5060 cards.
As it is, the value of the crazy higher-specced cards was questionable, with most developers targeting console specs anyway. But it does raise the question of how this might impact the next generation of consoles, and whether those will be scaled back.
We will likely be seeing some stagnation of capability for a couple years. Maybe once the bubble pops all the work that went into AI chips can come back to gaming chips and we can have a big leap in capability.
Maybe another way to look at it is: with hundreds of billions being tossed around, could there possibly not be second-order effects?
We'll see....
https://wccftech.com/nvidia-to-bring-back-geforce-rtx-3060-q...
Happy I just bought my 5080 before Christmas. They're all on borrowed time.