Liars or not, the performance has not been there for me in any of my use cases, from personal to professional.
A system from 2017/2018 with an 8700K and an 8GB 2080 performs so close to today's top-end, expensive systems that it makes almost no sense to upgrade at MSRP+markup unless your system is older than that.
Unless you need specific features only found on more recent cards, there are very few use cases I can think of that need more than a 30-series card right now.
Unless Nvidia's money-printing machine breaks soon, expect the same to continue for the next 3+ years: crappy, expensive cards with a premium on memory and almost no actual rendering performance increase.
This does not somehow give purchasers more budget room now, but as a little bonus they can buy 30-series cards in spades and not have to worry about the same heating and power-delivery problems.
This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6. The male connectors are identical between 12V-2x6 and 12VHPWR.
That said, the industry seems to be moving to adding detection into the PSU, given Seasonic's announcement:
https://www.tomshardware.com/pc-components/power-supplies/se...
Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.
If the gauge is large enough, the two wires would also carry heat away from the connector, keeping it cool, which would be an added bonus. Merely eliminating the balancing problem should prevent additional failures, since the connectors melt after the unbalanced wires cause some pins to become incredibly hot and dump heat into the connector. That is why some reports have the PSU side melting too.
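For a rough sense of scale (the numbers below are my own assumptions, not anything from the connector spec), heating at a contact scales with the square of the current through it, which is why imbalance, not total power, is what melts things:

    # Rough sketch: per-pin current and contact heating for a ~600 W, 12 V load.
    # Assumed values: six current-carrying pins, ~5 mOhm contact resistance each.
    power_w, volts = 600.0, 12.0
    total_a = power_w / volts          # ~50 A total
    pins, r_contact = 6, 0.005

    even_a = total_a / pins            # ~8.3 A per pin when balanced
    worst_a = total_a / 2              # ~25 A per pin if only two pins really connect

    print(f"balanced:   {even_a:.1f} A/pin -> {even_a**2 * r_contact:.2f} W of heat per contact")
    print(f"unbalanced: {worst_a:.1f} A/pin -> {worst_a**2 * r_contact:.2f} W of heat per contact")
    # Same 50 A, but roughly 9x more heat per contact when it crowds onto two pins.

With one fat conductor per polarity there is nothing left to balance, which is the whole appeal of the idea.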
I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so is FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.
We've seen this technique variably developed across the lifespan of realtime raster graphics; first with checkerboard rendering, then TAA, and now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.
Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. Most modern light rendering techniques cannot be done in time within a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA anywhere between "miserable" and "impossible", and you have the situation we're in today.
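For anyone who hasn't seen it spelled out, "temporal accumulation" at its core is just an exponential moving average over (reprojected) previous frames. A toy sketch in plain NumPy, not any engine's actual implementation:

    import numpy as np

    # Toy temporal accumulation: blend each noisy frame into a history buffer.
    # Real TAA/denoisers also do motion-vector reprojection, neighborhood
    # clamping, etc.; this only shows the core averaging idea.
    def accumulate(history, current, alpha=0.1):
        # alpha = weight of the new frame; smaller alpha = smoother, more ghosting
        return (1.0 - alpha) * history + alpha * current

    rng = np.random.default_rng(0)
    truth = np.full((4, 4), 0.5)           # the "converged" image we want
    history = rng.random((4, 4))           # start from pure noise

    for _ in range(60):                    # ~one second at 60 fps
        noisy_frame = truth + rng.normal(0.0, 0.2, truth.shape)
        history = accumulate(history, noisy_frame)

    print(np.abs(history - truth).mean())  # error shrinks as samples accumulate

The price of that averaging is exactly the ghosting and smearing people complain about when the history goes stale.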
MSAA only helps with geometric edges; shader aliasing can be combated with prefiltering, but even then it's difficult to get rid of it completely. MSAA also needs beefy multisample intermediate buffers, which makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that, the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS, but doing realtime path tracing at 4K is sort of nuts, really. I still consider it a bit of a miracle we can do it at all.
Personally, I wish there was more research by big players into things like texture space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending and would make MSAA viable again. The issue there is with shading only the stuff you see and not wasting texels.
SSAA is an even older technique than MSAA but the results are not visually the same as just having a really high-DPI screen with no AA.
Five or maybe ten years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG and Lossless Scaling. There's just no point competing at the $500 price point any more, so Nvidia has largely given up, ceding it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low end, mid range, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited for the next node refresh, or for the 5090. I don't think any of the current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
In their never ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good and that alone seems to justify the markup
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
> DLSS vs FSR, or DLSS FG and Lossless Scaling.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but claiming a robot is autonomous when it isn't just feels scammy to me.
It reminded me of when Tesla had remote controlled Optimus bots. I mean I think that’s awesome like super cool but clearly the users thought the robots were autonomous during that dinner party.
I have no idea why I seem to be the only person bothered by "stage lies" to this level. Tbh even the Tesla bots weren't claimed to be autonomous, so actually I should never have mentioned them, but it explains the "not real" vibe.
Not meaning to disparage, just explaining my perception as a European. Maybe it's just me though!
EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies; I am just offering POSITIVE feedback that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).
EDIT 2 > There actually is a good rebuttal in the replies. Although apparently I have "reading comprehension skill deficiencies", it's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.
So there’s at least a bit more “there” there than the Tesla bots.
See this snippet: "Operator Commands Are Merged: The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"
I will print a full retraction if someone can confirm my gut feeling is correct
It's easier to stabilise from an operator initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.
Please elaborate, unless I'm being thick.
EDIT > I upvoted your comment in any case, as I'm sure it's helping
At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.
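To make that concrete, here is a toy sketch of what "the operator command informs the balance system" could look like: a commanded gesture feeding the balance loop as a feedforward term. This is entirely my own illustration, not Disney's or Nvidia's actual controller:

    # Toy 1-D balance controller with a heads-up about an operator-triggered wave.
    # Hypothetical gains and values; not any real robot stack.
    def balance_torque(tilt, tilt_rate, expected_arm_torque, kp=80.0, kd=10.0):
        feedback = -kp * tilt - kd * tilt_rate   # ordinary PD stabilization
        feedforward = -expected_arm_torque       # cancel the known, commanded disturbance
        return feedback + feedforward

    # Because the wave is commanded rather than a surprise push, its disturbance
    # torque is known a step ahead and can be cancelled before the robot tips.
    print(balance_torque(tilt=0.02, tilt_rate=0.1, expected_arm_torque=1.5))

Useful engineering, but as said above, nothing novel.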
Your understanding of AI and robotics is more cucumber- than pear-shaped. You're making very little technical sense here. The challenges and progress in robotics aren't where you think they are. It's all propagandish content you're basing your understanding on.
If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.
I hate being lied to, especially if it's so the liar can reap some economic advantage from having the lie believed.
Do business with people that are known liars? And just get repeatedly deceived?
…Though upon reflection that would explain why the depression rate is so high.
Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.
Haters gonna hate (downvotes just prove it - ha!)
Yet he lists all the RL stuff that we know is used in the robot. He isn't staying silent, or saying "this robot is aided by AI", or better yet, not commenting on the specifics (which would have been totally OK); instead he is saying "This is real life simulation", which it isn't.
EDIT > apparently I am wrong - thank you for the correction everyone!
It is clearly - to me at least - doing both of those things.
I think you're reading things into what he said that aren't there.
If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).
Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.
I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.
Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.
It's complete cult crazy talk. Not even cargocult, it's proper cultism.
The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
https://www.tomshardware.com/news/radeon-catalyst-image-qual...
https://linustechtips.com/topic/1497989-amd-caught-cheating-...
I actually tend to trust companies who have had bad backlash for past behaviors more than those who haven't. Might be naive but once bitten...
The forum post you linked was an April Fools' joke.
"Kinda rather not do april 1st jokes like this as it does get cached and passed around after the fact without it being clear."
It became obvious when old e-waste Xeons were turned into viable, usable machines, years ago.
Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.
High-end GPUs are already useless for gaming (a low-end GPU is enough), their traditional source of demand. They're floating on artificial demand for a while now.
There are two markets that currently could use them: LLMs and Augmented Reality. Both of these are currently useless, and getting more useless by the day.
CPUs are just piggybacking on all of this.
So, lots of things hanging on unrealized promises. It will pop when there is no next use for super high-end GPUs.
War is a potential user of such devices, and I predict it could be the next thing after LLMs and AR. But then if war breaks out in such a scale to drive silicon prices up, lots of things are going to pop, and food and fuel will boom to such a magnitude that will make silicon look silly.
I think it will pop before it comes to the point of war driving it, and it will happen within our lifetimes (so, not a Nostradamus-style prediction that will only be realized long after I'm dead).
This is the exact model in which WWII operated. Car and plane supply chains were practically nationalized to support the military industry.
If drones, surveillance, satellites become the main war tech, they'll all use silicon, and things will be fully nationalized.
We already have all sorts of hints of this. Doesn't need a genius to predict that it could be what happens to these industries.
The balance with food and fuel is more delicate though. A war with drones, satellites and surveillance is not like WWII, there's a commercial aspect to it. If you put it on paper, food and fuel project more power and thus, can move more money. Any public crisis can make people forget about GPUs and jeopardize the process of nationalization that is currently being implemented, which still depends on relatively peaceful international trade.
Dude, you're describing the 80s. We're in 2025.
GPUs will be used for automated surveillance, espionage, brainwashing and market manipulation. At least that's what the current batch of technologies implies.
The only thing stopping this from becoming a full dystopia is that delicate balance with food and fuel I mentioned earlier.
It has become pretty obvious that entire wealthy nations can starve if they make the wrong move. Turns out GPUs cannot produce calories, and there's a limit to how much of a market you can manipulate to produce calories for you.
From a market perspective, LLMs sell GPUs. Doesn't even matter if they work or not.
From the geopolitical tensions perspective, they're the perfect excuse to create infrastructure for a global analogue of the Great Firewall (something that the Chinese are pioneers of, and catching up to the plan).
From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
Really? What about textures? Any ML that the new wave of games might use? For instance, while current LLMs powering NPC interactions would be pretty horrible, what about in 2 years time? You could have arbitrary dialogue trees AND dynamically voiced NPCs or PCs. This is categorically impossible without more VRAM.
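Rough back-of-the-envelope on why local, on-device NPC dialogue eats VRAM (all numbers here are my own assumptions, not anyone's shipped spec):

    # Assumed: a 7B-parameter model at 4-bit quantization running alongside a game.
    params = 7e9
    bytes_per_weight = 0.5                 # 4-bit weights
    weights_gb = params * bytes_per_weight / 1e9

    kv_cache_gb = 1.0                      # assumed; grows with context length
    game_assets_gb = 6.0                   # what the game itself already wants

    print(f"model weights: ~{weights_gb:.1f} GB")
    print(f"total:         ~{weights_gb + kv_cache_gb + game_assets_gb:.1f} GB of VRAM")
    # ~10.5 GB combined: uncomfortable on an 8-12 GB card, easy with more VRAM.

Add dynamic text-to-speech on top and the budget only grows.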
> the perfect excuse to create infrastructure for a global analogue of the Great Firewall
Yes, let's have more censorship and kill the dream of the Internet even deader than it already is.
> From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
You should be aware that reasonable minds can differ on this issue. I won't defend companies forcing the use of LLMs (it would be like forcing the use of vim or any other tech you dislike), but I disagree about them being a nuisance, a distraction, or a universal harm. It's all down to choices and fit for the use case.
> THIS will be the excitement everyone is looking for.
Or TSMC could become geopolitically jeopardized somehow, drastically increasing the secondhand value of modern GPUs even beyond what they're priced at now. It's all a system of scarcity, things could go either way.
If no good use is found for high-end GPUs, secondhand models will be like AOL CDs.
Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?
Not that AMD was anywhere near being in a good state 10 years ago. Nvidia still fucked you over.
Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason why Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing, tbh.
Maybe another regression in Blackwell.
I'm not saying they all got together and decided this together but their wonks are probably all saying the same thing. The market is shrinking and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar and build a wall around it and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, bitcoin/ai and buying out capacity to prevent competitors.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
The striking one for me is their Linux efforts; at least as far as I'm aware, they don't do a lot that isn't tied to the Steam Deck (or similar devices) or running games available on Steam through Linux. Even the Deck APU is derived from the semi-custom work AMD did for the consoles, so they're benefiting from a later, second harvest of the (hundreds of millions?) MS/Sony invested many years earlier. I suppose a lot of it comes down to what Valve needs to support their customers (developers/publishers); they don't see the point in pioneering and establishing some new branch of tech with developers.
From a supply/demand perspective, if all of your customers are still getting high on the 5 (or 20) year old supply, launching a new title in the same space isn't going to work. There are not an infinite # of gamers and the global dopamine budget is limited.
Launching a game like TF2 or Starcraft 2 in 2025 would be viewed as a business catastrophe by the metrics most AAA studios are currently operating under. Monthly ARPU for gamers years after purchasing the Orange Box was approximately $0.00. Giving gamers access to that strong of a drug would ruin the demand for other products.
What Microsoft is trying to do with Gamepass is a structural change. It may not work out the way that they plan but the truth is that sometimes these things do change the nature of the games you play.
I think Microsoft's strategy is going to come to the same result as Embracer Group. They've bought up lots of studios and they control a whole platform (by which I mean Xbox, not PC) but this doesn't give them that much power. Gaming does evolve and it often evolves to work around attempts like this, rather than in favor of them.
>> Microsoft's strategy is going to come to the same result as Embracer Group.
I hope you are right.
If I were trying to make a larger point, I guess it would be that big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem and tend to support initiatives that emphasize the platform.
Also mobile games that got priced at $0.99 meant that only the unicorn level games could actually make decent money so In-App Purchases were born.
But also I suspect it is just a problem where as consumers we spend a certain amount of money on certain kinds of entertainment and if as a content producer you can catch enough people’s attention you can get a slice of that pie. We saw this with streaming services where an average household spent about $100/month on cable so Netflix, Hulu, et al all decided to price themselves such that they could be a portion of that pie (and would have loved to be the whole pie but ironically studios not willing to license everything to everyone is what prevented that).
It also won’t work, and Microsoft has developed no way to compete on actual value. As much as I hate the acquisitions they’ve made, even if Microsoft as a whole were to croak tomorrow I think the game industry would be fine.
Nvidia isn't purposely killing anything; they are just following the pivot into the AI nonsense. They have no choice: if they are in a unique position to make 10x by pivoting, they will, even if it might be a dumpster fire of a house of cards. It's immoral to just abandon the industry that created you, but companies have always been immoral.
Valve has an opportunity to what? Take over the video card hardware market? No. AMD and Intel are already competitors in the market and can't get any foothold (until hopefully now, when consumers will have no choice but to shift to them).
Personally I'm happy with DLSS on balanced or quality, but the artifacts from framegen are really distracting. So I feel like it's fair to call their modern marketing snake oil since it's so reliant on frame gen to create the illusion of real progress.
So even when I'm running a game at native resolution, I still want anti-aliasing, and DLSS is a great choice then.
So, sure, we can say that all of this is ultimately software trickery, but when the trickery is dialed up to 11 and the marketing revolves entirely on it, while the raw performance is only slightly improved over previous generations, it's a clear sign that consumers are being duped.
[1]: I'm also opposed to frame generation from a philosophical standpoint. I want my experience to be as close as possible to what the game creator intended. That is, I want every frame to be generated by the game engine; every object to look as it should within the world, and so on. I don't want my graphics card to create an experience that approximates what the creator intended.
This is akin to reading a book on an e-reader that replaces every other word with one chosen by an algorithm. I want none of that.
I have been rocking AMD GPU ever since the drivers were upstreamed into the linux kernel. No regrets.
I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumer gotta consoooooom and then cry and outrage when they are exploited instead of just walking away and doing something else.
Same with magic the gathering, the game went to shit and so many people got outraged and in a big huff but they still spend thousands on the hobby. I just stopped playing mtg.
Last one I ever tried was https://www.protondb.com/app/813780 with comments like "works perfectly, except multiplayer is completely broken" and the workaround has changed 3 times so far, also it lags no matter what. Gave up after stealing 4 different DLLs from Windows. It doesn't even have anticheat, it's just cause of some obscure math library.
I literally never had to do that. Most tweaking I needed to do was switching proton versions here and there (which is trivial to do).
Age of empires 2 used to work well, without needing any babying, so I'm not sure why it didn't for you. I will see about spinning it up.
My experience with running non-game windows-only programs has been similar over the past ~5 years. It really is finally the Year of the Linux Desktop, only few people seem to have noticed.
I also don't play any games that require a rootkit, so..
The vast majority of my gaming library runs fine on Linux. Older games might run better than on Windows, in fact.
My favorite part about being a reformed gaming addict is the fact that my MacBook now covers ~100% of my computer use cases. The desktop is nice for Visual Studio but that's about it.
I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
Same boat. I have 5700XT as well and since 2023, used mostly my Mac for gaming.
My main hobby is videogames, but since I can consistently play most games on Linux (that has good AMD support), it doesn't really matter.
Efficiency: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
Vsync power draw: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
The variance within Nvidia's line-up is much larger than the variance between brands, anyway.
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
Nvidia could have said "we're prioritizing enterprise", but instead they put on a big dog and pony show about their consumer GPUs.
I really like the Gamer's Nexus paper launch shirt. ;)
- outbid Apple on new nodes
- sign commitments with TSMC to get the capacity in the pipeline
- absolutely own the process nodes they made cards on that are still selling way above retail
NVIDIA has been posting net earnings in the 60-90 range over the last few years. If you think that's going to continue? You book the fab capacity hell or high water. Apple doesn't make those margins (which is what on paper would determine who is in front for the next node).
These are the same questions Apple fans ask when telling Apple to buy TSMC. The fact is it isn't so simple. And even if Nvidia were willing to pay for it, TSMC wouldn't do it for Nvidia alone.
Big if, I get that.
BS! Nvidia isn't entitled. I'm not obligated. Customer always has final say.
The problem is a lot of customers can't or don't stand their ground. And the other side knows that.
Maybe you're a well trained "customer" by Nvidia just like Basil Fawlty was well trained by his wife ...
Stop excusing bs.
> So 7 years into ray traced real-time computer graphics and we’re still nowhere near 4K gaming at 60 FPS, even at $1,999.
The guy is complaining that a product can't live up to his standard, while dismissing a barely noticeable proposed trade-off that can make it possible, because it's «fake».
I honestly don't know why nvidia didn't just suspend their consumer line entirely. It's clearly no longer a significant revenue source and they have thoroughly destroyed consumer goodwill over the past 5 years.
It's ~$12 billion a year with a high gross margin by the standards of every other hardware company. They want to make sure neither AMD nor Intel get that revenue they can invest into funding their own AI/ML efforts.
The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.
Yes they've had some marketing and technical flubs, as any giant publicly traded company will have, but their balance of research-driven development alongside corporate profit necessities is unmatched.
Also, nobody ever said they hate their researchers.
And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.
These days I get a machine like the excellent ASUS Proart P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up Pytorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously check out Latitude, they're great), or maybe one of those wildly competitive L40 series fly machines (fly whips the llama's ass like nothing since Winamp, check them out too). The GMTek EVO-X1 is a pretty capable little ROCm inference machine for under 1000, its big brother is nipping at the heels of a DGX Spark under 2k. There is good stuff out there but its all from non-incumbent angles.
I don't game anymore but if I did I would be paying a lot of attention to ARC, I've heard great things.
Fuck the cloud and their ancient Xeon SKUs for more than Latitude charges for 5GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.
It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.
But I do spend a lot of effort finding good deals on modern ass compute. This is the shit I use to get a lot of performance on a budget.
Will people pay you to post on HN? How do I sign up?
you are safe.
What's so special about NVENC that Vulkan video or VAAPI can't provide?
> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products
OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
Oh trust me, they can combat it. The easiest way, which is what Nintendo often does for the launch of its consoles, is to produce an enormous number of units before launch. A steady supply to retailers absolutely destroys folks' ability to scalp. Yes, a few units will be scalped, but most scalpers will be underwater if there is a constant resupply. I know this because I used to scalp consoles during my teens and early twenties, and Nintendo's consoles were the least profitable and most problematic because they really try to supply the market. The same with iPhones: yeah, you might have to wait a month after launch to find one if you don't pre-order, but you can get one.
It's widely reported that most retailers had maybe tens of cards per store, or a few hundred nationally, for the 5090's launch. This immediately created a giant spike in demand and drove prices up, along with the incentive for scalpers. The manufacturing partners immediately saw what (some) people were willing to pay (to the scalpers) and jacked up prices so they could get their cut. It is still so bad in the case of the 5090 that MSRP prices from AIBs have skyrocketed 30%-50%. PNY had cards at the original $1,999.99 MSRP, and now those same cards can't be found for less than $2,999.99.
By contrast, look at how AMD launched its 9000 series of GPUs: each Micro Center reportedly had hundreds on hand (and it sure looked like it from the pictures floating around). Folks were just walking in until noon and still able to get a GPU on launch day. Multiple restocks happened across many retailers immediately after launch. Are there still some inflated prices in the 9000 series GPUs? Yes, but we're not talking a 50% increase. Having some high-priced AIBs has always occurred, but what Nvidia has done by intentionally undersupplying the market is awful.
I personally have been trying to buy a 5090 FE since launch. I have been awake attempting to add to cart for every drop on BB but haven't been successful. I refuse to pay the inflated MSRP for cards that haven't been that well reviewed. My 3090 is fine... At this point, I'm so frustrated by NVidia I'll likely just piss off for this generation and hope AMD comes out with something that has 32GB+ of VRAM at a somewhat reasonable price.
As has been explained by others, they can't. Look at the tech used by the Switch 2 and then look at the tech in the Nvidia 50 series.
And Nintendo didn't destroy scalpers; in many markets they are still not meeting demand despite "produc[ing] an enormous amount of units before launch".
How would we know if they were?
If you believe their public statements, because they didn't want to build out additional capacity and then have a huge excess supply of cards when demand suddenly dried up.
In other words, the charge of "purposefully keeping stock low" is something NVidia admitted to; there was just no theory of how they'd benefit from it in the present.
I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.
Here's another Nvidia/Mellanox BS problem: many mlx NIC cards are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed. It's not ours; HP could have done anything to it, what do we know? So one phones HP ... and they have no clue either, because it's really not their IP or their drivers.
It's a total cluster bleep, and more and more why corporate America sucks.
The negativity posted in this article is hand-wavy at best.
tl;dr nVidia is using constrained supply to maximise datacenter GPUs. Consumer GPUs are collateral damage and nVidia understands that. You can't blame a company for maximising profit.
Case in point https://www.digitimes.com/news/a20241122PD200/nvidia-tsmc-ca...
CoWoS (Chip-on-Wafer-on-Substrate) is TSMC's 2.5-D "sandwich" assembly method. So nVidia is blocking all of this capacity for datacenter GPUs.
The melting cables claim is a bit overblown as well.
From o3
• Yes, the 16-pin 12VHPWR/12V-2×6 power plug used on RTX 40- and 50-series boards really has a track record of occasional "melting" events.
• The root cause is almost always high contact resistance created by a half-seated or bent connector; once that pin pair sees 35-40 A it heats up and chars the plastic housing. gamersnexus.net
• The problem is rare (low-single-digit failure rates in large user surveys) but visually dramatic, which is why it dominates headlines. gamersnexus.net
• PCI-SIG and Nvidia have already revised the design (12 V-2×6, ATX 3.1), yet isolated incidents are still being reported on new RTX 5090 cards, proving the fix is not bullet-proof. tomshardware.com
• Calling this “planned obsolescence” is a stretch; it is more an example of an aggressive power spec colliding with real-world tolerances and user handling.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers, who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.
Most of those whinging about their issues are false-positive user errors. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
But the fact of the matter is that Nvidia has shifted from a consumer business to b2b, and they don't even give a shit about pretending they care anymore. People take beef with that, understandably, and when you couple that with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford and all the other shit that's wrong with these GPUs, then this is the end result.
Customers don’t matter, the company matters.
Competition sorts out such attitudes quick smart, but AMD never misses a chance to copy Nvidia's strategy in any way, and Intel is well behind.
So for now, you’ll eat what Jensen feeds you.
The two largest supercomputers in the world are powered by AMD. I don't think it's accurate to say Nvidia has a monopoly on HPC.
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
EDIT: Plaintiff dismissed it. Guessing they settled. Here are the court documents (alternately, shakna's links below include unredacted copies):
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
A GamersNexus article investigating the matter: https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...
And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw
Sounds like it was settled out of court.
[0] https://www.docketalarm.com/cases/California_Northern_Distri...
I’m curious whether the 5090 package was not following UL requirements.
Would that make them even more liable?
Part of me believes that the blame here is probably on the manufacturers and that this isn’t a problem with Nvidia corporate.
As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
Or at least I think so? Was that a different 12VHPWR scandal?
Another problem is when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help this but the contact resistance in this case is still bad.
I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell if that is going to fix these concerns or they're just bad cards.
The lack of open source anything for GPU programming makes me want to throw my hands up and just do Apple. It feels much more open than pretending that there's anything open about CUDA on Linux.
Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.
If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.
I guess the author is too young and didn't go through the iPhone 2G to iPhone 6 era. Also worth remembering it wasn't too long ago that Nvidia was sitting on nearly ONE full year of unsold GPU stock. That has completely changed how Nvidia does supply chain management and forecasting, which unfortunately has had a negative impact all the way to the 50 series. I believe they have since changed, and the next gen should be better prepared. But you can only do so much when AI demand is seemingly unlimited.
>The PC, as gaming platform, has long been held in high regards for its backwards compatibility. With the RTX 50 series, NVIDIA broke that going forward. PhysX.....
Glide? What about all the audio driver APIs before? As much as I wish everything were backward compatible, that is just not how the world works. Just like with any old game, you need some fiddling to get it to work. And they even made the code available so people could actually do something rather than rely on emulation or reverse engineering.
>That, to me, was a warning sign that maybe, just maybe, ray tracing was introduced prematurely and half-baked.
Unfortunately, that is not how it works. Do we want to go back through everything from pre-3dfx to today and see how many ideas we thought were great for 3D accelerators, only for them to be replaced by better ideas or implementations? These ideas were good on paper but didn't work well. We then learn from them and iterate.
>Now they’re doing an even more computationally expensive version of ray tracing: path tracing. So all the generational improvements we could’ve had are nullified again......
How about: path tracing is simply a better technology? Game developers also don't have to use any of this tech. The article acts as if Nvidia forces all games to use it. Gamers want better graphics quality, and artists and graphics assets are already by far the most expensive item in game development, with costs still increasing. Hardware improvement is what allows that to be achieved at lower cost (to game developers).
>Never mind that frame generation introduces input lag that NVIDIA needs to counter-balance with their “Reflex” technology,
No, that is not why "Reflex" tech was invented. Nvidia spends R&D on 1000 fps monitors as well, and potentially sub-1ms frame monitors. They have always been latency sensitive.
------------------------------
I have no idea how modern gamers became what they are today. And this isn't the first time I have read it, even on HN. You don't have to buy Nvidia. You have AMD and now Intel (again). Basically I can summarise it in one thing: gamers want Nvidia's best GPU for the lowest price possible, or a price they think is acceptable, without understanding market dynamics or anything about supply chains or manufacturing. They also want higher "generational" performance, like 2x every 2 years. And if they don't get it, it is Nvidia's fault. Not TSMC, not Cadence, not Tokyo Electron, not Isaac Newton or the laws of physics. But Nvidia.
Nvidia's PR tactics aren't exactly new in the industry. Every single brand does something similar. Do I like it? No. But unfortunately that is how the game is played. And Apple is by far the worst offender.
I do sympathise with the cable issue though. And it's not the first time Nvidia has had thermal issues. But then again, they are also the ones constantly pushing the boundary forward. And AFAIK the issue isn't as bad as with the 40 series, but some YouTubers seem to be making a bigger issue of it than most. Supply will be better, but TSMC 3nm is fully booked. The only possible solution would be to make consumer GPUs less capable at AI workloads, or to have AI GPUs on the leading-edge node and consumer GPUs always a node behind, to split the capacity problem. I would imagine that is part of the reason why TSMC is accelerating the 3nm capacity increase on US soil. Nvidia is now also large enough and has enough cash to take on more risk.