Nvidia's RTX 5050 GPU starts at $249 with last-gen GDDR6 VRAM

https://www.theverge.com/news/692045/nvidia-geforce-rtx-5050-desktop-laptop-gpu-gddr6-gddr7
44•microsoftedging•6h ago

Comments

gs17•5h ago
And it's 8GB of last-gen GDDR6 video memory, exactly the same as the $249 RTX 3050 from three years ago (same number of CUDA cores too). Technically, adjusted for inflation, that's more per dollar, I guess, but that's not super appealing.
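
A back-of-the-envelope sketch of the inflation point, assuming an average ~3.5%/yr US inflation rate over those three years (the rate is an assumption, not a figure from the thread):

```python
# Inflation-adjust the 2022 launch price of the RTX 3050, assuming an
# average ~3.5%/yr rate over three years (the rate is an assumption).
launch_price_2022 = 249.0
years = 3
assumed_rate = 0.035

adjusted = launch_price_2022 * (1 + assumed_rate) ** years
print(f"$249 in 2022 is roughly ${adjusted:.0f} in today's dollars")
# $249 in 2022 is roughly $276 in today's dollars -- so the same $249
# sticker buys slightly more per 2022 dollar, before counting any
# generational uplift.
```
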
bigyabai•5h ago
If you're using less memory, it kinda stands to reason that you can get more mileage out of less bandwidth. I'd be really upset if this was a 16gb or 24gb card, but we've been using GDDR6 for 8gb cards without issues for years now.

I agree that it's not super appealing, but Team Green has to hit the low price points somehow. This feels more like a hedged bet against Intel trying to muscle their way back into the budget market.

ryao•4h ago
AMD uses GDDR6 on their current generation cards:

https://www.techpowerup.com/gpu-specs/radeon-rx-9070-xt.c422...

Intel does too.

pama•5h ago
The performance figure in the link clearly says that it's a significant improvement over the 3050.
cogman10•5h ago
This is creative marketing from nVidia. Notice the "With DLSS 4".

That's AI frame hallucination which the 5050 has.

Without DLSS, the numbers from independent reviewers have basically been on par with the previous generation (about a 10% increase in performance).

sidewndr46•5h ago
The charts are from The Verge, which is not exactly known for its integrity with regard to anything.

It's also with DLSS on, so you could just as easily have the framerate be 100 FPS, 1000 FPS, or 10000 FPS. The GPU doesn't actually have to render the frame in that case, it just has to have a pixel buffer ready to offload to whatever hardware sends it over the link to the display. Apparently some people actually really like this, but it isn't rendering by any reasonable definition.

pama•4h ago
Maybe this is the better link:

https://www.nvidia.com/en-us/geforce/news/rtx-5050-desktop-g...

colejohnson66•4h ago
Why would anyone trust Nvidia to not stretch the truth, especially with a press release? It's been shown multiple times they inflate their numbers.
cornstalks•4h ago
This is a better link: https://gamersnexus.net/gpus/nvidia-selling-lies-rtx-5070-fo...

It's about the RTX 5070 but the criticisms still hold for the RTX 5050 since Nvidia is still doing the same shenanigans.

theyinwhy•3h ago
Looking at 20 articles, they seem to be biased when it comes to AMD vs Nvidia.
unaindz•2h ago
I don't follow everything but they give flak to AMD too, maybe in a different way.
kllrnohj•4h ago
That's Nvidia's marketing slide, and if you note the fine print, they are tested at different settings. The RTX 5050 is using 4x frame gen, which the 3050 isn't. TechPowerUp has the RTX 5050 as being 20% faster than the 3050, give or take, which is certainly not enough to justify upgrading.
happycube•4h ago
This card might've been forgivable with 16GB or a $149 MSRP, so of course they didn't do either...
ryao•4h ago
Why did I not see any complaints when AMD used GDDR6 on their current generation products:

https://www.techpowerup.com/gpu-specs/radeon-rx-9070-xt.c422...

unaindz•2h ago
Because they announced double the VRAM for the medium end card at around the same price. But there were complaints anyway.
throitallaway•5h ago
Can't wait to buy one for $550.
singhkays•4h ago
I know you're being facetious but such is the market atm. I put something together to track the absurdity of it all :)

https://gpuisfine.singhkays.com

mft_•3h ago
It's insane. I've been tracking a few used 4090s on EU ebay this week, and the typical selling price is ~1950 EUR.
singhkays•3h ago
Huh... you might as well get a 5090 for that price. Seems like it's dipping below MSRP there - https://www.techpowerup.com/338256/nvidia-geforce-rtx-5090-b...
mft_•1h ago
Indeed; there are new 5090s from legit-looking sellers on eBay for around ~2300 EUR.
Aurornis•4h ago
The emphasis on last-gen memory is misplaced. I don't care what memory technology is used as long as the performance is good for the price.
seiferteric•4h ago
Ya, don't current gen Radeon cards use GDDR6 as well?
ryao•4h ago
They do:

https://www.techpowerup.com/gpu-specs/radeon-rx-9070-xt.c422...

Night_Thastus•4h ago
It's complete garbage and not worth buying. It's so cut down it's nearly useless outside of web browsing and very light games. The price is also effectively a lie; it's going to be hard to get it for less than $300. Once we get some proper 3rd-party test data in, I'd be shocked if it's even 5% better than a 4050 in raster without the use of fake frames.
jekwoooooe•4h ago
Why is there so much hate against fake frames? I know you can’t tell the difference.
kllrnohj•4h ago
Because you can tell the difference: they have quite a few artifacts, and they make latency worse, which is especially problematic in exactly the scenarios where you need the "performance" offered by fake frames. At this price point it's that last part that matters most. You may get 60fps in an fps counter with DLSS 4, but it'll feel like 15-20fps and not be very playable.
jamesgeck0•4h ago
NVidia themselves have said that framegen shouldn't be used if the card isn't hitting 60 FPS to start with, because of the latency it introduces. If the card is cut down enough that it's struggling to hit 60 FPS in games, enabling framegen will do more harm than good.

You can feel additional latency easily in competitive FPS or high speed arcade racing games.

unaindz•2h ago
You can feel less than 50-60 fps even in a management game where you only interact with the UI and move the camera around; it's not game-breaking, but it doesn't feel great. And I used to play Far Cry 3 and CSGO at ~25 fps, so I'm used to a lack of performance.
Night_Thastus•4h ago
You absolutely can tell the difference. DLSS (upscale) visually is massively different in some games. Sometimes it works great, sometimes the result is very ugly. I've tested with several of my favorites.

And generated frames are far worse than that. If you're running at a very high base framerate (100+) then they can look OK but the moment the frames get any further apart the visual quality starts to tank.

orphea•3h ago
This is why: https://news.ycombinator.com/item?id=44368785

Fake frames are cool tech, but they are horribly mismarketed, indistinguishable from a scam.

toast0•2h ago
Fake frames have a big latency penalty, because you can't generate a frame between X and Y until you have Y. At the point that you have generated frame Y, however many frames you insert give you that much additional latency, beyond whatever your display adds.

I guess I can see some utility in situations where latency is not a major factor, but IMHO, that pushes out most gaming.
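
A toy model of the latency argument above, assuming the pipeline holds back exactly one real frame before it can interpolate (real implementations add further overhead; the numbers are illustrative, not measurements of any DLSS version):

```python
# Toy model of why inserted ("fake") frames raise the displayed FPS
# counter without improving input latency.
def frame_gen(base_fps: float, multiplier: int):
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    # To interpolate between real frames X and Y, the GPU must already
    # have Y, so X (and the inserted frames) are held back roughly one
    # base frame time before they can be shown.
    added_latency_ms = base_frame_ms
    return displayed_fps, base_frame_ms + added_latency_ms

for base in (30, 60, 100):
    shown, latency = frame_gen(base, 4)
    print(f"base {base:>3} fps -> counter shows ~{shown:.0f} fps, "
          f"but input-to-photon latency is ~{latency:.0f} ms")
# base  30 fps -> ~120 fps shown, ~67 ms latency (what a native ~15 fps
#                 feels like, matching the "feels like 15-20fps" complaint)
# base  60 fps -> ~240 fps shown, ~33 ms latency
# base 100 fps -> ~400 fps shown, ~20 ms latency
```
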

pitaj•4h ago
Agreed. Anybody buying this would be better off spending that money on a used card like a 2070 or 3060, and might even save a buck.
jamesgeck0•4h ago
Or an Intel Arc card. They aren't very high end, but they're competitive at MSRP and I suspect they'll demolish this in benchmarks.
lvl155•4h ago
Just buy AMD AI Max+ no?
anonym29•4h ago
Undoubtedly a better system, but for the 395 variant with a full 128GB of (soldered-on) RAM you're looking at ~$2k for the system. Comparing that to a $250 dGPU (that arguably isn't even worth that) is a very "apples to oranges" comparison.
Kon-Peki•4h ago
This is not going to go well:

> x50-class GeForce GPUs are among the most popular in the world, second only to the x60-class on Steam. Their price point and power profile are especially popular:

> For anyone upgrading an older x50-class system

> Each GeForce RTX 5050 graphics card is powered by a single PCIe 8-pin cable, drawing a maximum of 130 Watts at stock speeds, making it great for systems with power supplies delivering as little as 550 Watts.

The 1050, 2050 and 3050 were all bus-powered cards. I doubt 95% of these systems even have the cable coming from their power supply. Imagine all the poor saps who excitedly swap out their old card for this, and... nothing works.

Source link: https://www.nvidia.com/en-us/geforce/news/rtx-5050-desktop-g...
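
A quick sketch of the power budget in question, using the PCIe spec ratings (75 W from the slot, 150 W from an 8-pin) and the 130 W figure from the quoted Nvidia page:

```python
# Why the 5050 needs an auxiliary cable at all: a PCIe x16 slot is
# specced to deliver 75 W, so any board drawing more must pull the
# remainder from an aux connector (ratings are PCIe-spec values).
SLOT_W = 75       # PCIe CEM slot limit
AUX_8PIN_W = 150  # 8-pin PCIe auxiliary connector rating

card_draw_w = 130  # max stock draw quoted by Nvidia for the RTX 5050
needs_aux = card_draw_w > SLOT_W
headroom_w = SLOT_W + AUX_8PIN_W - card_draw_w

print(f"needs aux power: {needs_aux}, headroom: {headroom_w} W")
# needs aux power: True, headroom: 95 W
# The parent comment's point: a bus-powered x50 card stays under the
# 75 W slot limit, so many of those upgrade targets never needed (and
# may not have) a spare 8-pin lead from the PSU.
```
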

LorenDB•4h ago
Hey, at least it's not the infamous 12-pin connector.

(Full disclosure, I have a 9070 XT with a 12-pin connector. No fires yet, though.)

Kon-Peki•4h ago
Well, maybe a molex-to-8pin-PCIe cable comes in the box!
reginald78•4h ago
To bolster this: after the 750 Ti, the x50 products have had pretty lame price-to-performance compared to the next step up, but have remained quite popular. Most people seem to argue that the lack of an additional power requirement is their main advantage and why they are popular.

I personally think people remember being happy with the 750 Ti and just keep buying those cards.

toast0•2h ago
I've got a 1650 Super; it's not bus-powered either. I think it's got a 6-pin, but often you can plug a 6-pin into an 8-pin board and it'll just run a lower current limit (this might not be accurate --- a lot of internet comments say 8-pin boards will detect a 6-pin connector and refuse to work). A whole lot of modern computing gets ~90% of the performance with 50% of the power; so if using a 6-pin lead drops power to 50%, you would still get most of it.

I've got a ~2006 380W power supply hanging out near my desk and it's got a 6-pin PCIe cable; I really don't think people lack at least that, and certainly not 95% of systems with a PCIe x16 slot.
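
A sketch of the 6-pin-versus-8-pin budget being discussed, using connector spec ratings only (whether an 8-pin board will actually accept a 6-pin lead is exactly the open question above):

```python
# Compare total board power budgets with a 6-pin vs an 8-pin aux lead
# (slot and connector figures are PCIe-spec ratings).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

budget_6pin = SLOT_W + SIX_PIN_W    # 150 W total
budget_8pin = SLOT_W + EIGHT_PIN_W  # 225 W total

print(f"6-pin budget: {budget_6pin} W "
      f"({budget_6pin / budget_8pin:.0%} of the 8-pin budget)")
# 6-pin budget: 150 W (67% of the 8-pin budget)
# On paper, a 130 W card still fits under the 150 W figure, which is
# the "you'd keep most of the performance" intuition -- assuming the
# board tolerates the connector at all.
```
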

neepi•4h ago
Just bought a 16GB 5060 Ti for twice that. I don’t feel disappointed.
happycube•4h ago
Yeah, more so since that's the highest Nvidia card this gen available with a sane power connector.
sidewndr46•4h ago
12VHPWR has to be one of the weirdest industry decisions I've seen in a while. So far I thought I had been able to avoid it, but I recently bought a power supply that uses it on the modular cable connector.

But it isn't really that uncommon either; I had a Suzuki motorcycle that used a connector with 15-amp pins to handle 30 amps of current on one pin. I eventually concluded the only reason that connector was in the harness was to ease assembly, and just cut it out entirely and soldered the junction together.

neepi•3h ago
Oh yeah totally agree with that one. I hate Molex connectors but the new one is just stupid.
LorenDB•4h ago
I can't believe that nobody has yet mentioned the Intel Arc Battlemage B580. Same $250 MSRP (which has inflated, but every other GPU is inflated too, and the 5050 will probably inflate as well), but has 12 GB of VRAM and bats just below a 4060 Ti 16 GB[0].

[0]: https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

jamesgeck0•3h ago
I've been pretty happy with my Arc A770 LE (16 GB). The drivers were rough at launch, but they've gotten much better, and at the time it was the best performance $350 could buy.
prossercj•2h ago
How is it for gaming? Had any compatibility issues?
KronisLV•17m ago
I had both an A580 (not an A770, but at least something from that generation) and then later a B580, at one point even both in the same computer, side by side, when I wanted to use one for games and the other for encoding:

https://blog.kronis.dev/blog/what-is-ruining-dual-gpu-setups

https://blog.kronis.dev/blog/more-pc-shenanigans-my-setup-un...

https://blog.kronis.dev/blog/two-intel-arc-gpus-in-one-pc-wo...

When paired with a worse CPU like a Ryzen 5 4500, the experience won't always be good (despite no monitoring software actually showing that the CPU is a bottleneck).

When paired with a better CPU (I got a Ryzen 7 5800X to replace it, eventually with an AIO cause the temperatures were too high under full load anyways), either of them are pretty okay.

In a single-GPU setup either of them runs most games okay, with not that many compatibility or stability issues, even in older indie titles, though I've had some like STALCRAFT: X complain about running on an integrated GPU (the Intel card being detected as such). Most software also works, unless you want to run LLMs locally, where Nvidia has more of an advantage and you'd be going off the beaten path. The most annoying things I've had were some stability issues near the launch of each card; for example, running the B580 with the Boost functionality enabled in their graphics software sometimes crashed in Delta Force, though that no longer seems to be an issue.

Temperature and power draw seem fine. Their XeSS upscaling is actually really good (I use it on top of native resolution in War Thunder as fancy AA), their frame generation feels like it has more latency than FSR but also better quality, might be subjective, but it's not even supported in that many games in the first place. Their video encoders are pretty nice, but sometimes get overloaded in intensive games instead of prioritizing the encoding over game framerate (which is stupid). Video editing software like DaVinci Resolve also seems okay.

The games that run badly are typically Unreal Engine 5 titles, such as S.T.A.L.K.E.R. 2 and The Forever Winter, where they use expensive rendering techniques and to get at least 30 FPS you have to turn the graphics way down, to the point where the games still run like crap and end up looking worse than something from 5 years ago. Those were even worse on the A series cards, but with the B series ones become at least barely playable.

In a dual-GPU setup, nothing works that well, neither in Windows 11 nor Windows 10, neither with the A580 + B580 nor with my old RX 580 + B580: system instability, some games ignoring the Intel GPU preference when an AMD one is available, low framerates when a video is playing on a secondary monitor (I have 4 in total), and the inability to play games on the B580 while encoding on the A580, because either OBS or the hardware lacks proper support for that (e.g. you can't pick which GPU to encode on, like you can with Nvidia cards; my attempts at patching OBS to do that failed, since I couldn't get a video frame from one GPU to the other). I moved back to running just the B580 in my PC.

For MSRP, I'd say the Intel Arc B580 is actually a good option, perhaps better than all the A series cards. But the more expensive it gets, the more attractive the alternatives from AMD and Nvidia become. Personally I wouldn't get an A770 unless I needed the VRAM or the price was really good.

Also, I'm not sure why the A580 needed two 8-pin connectors if it never drew that much power, or why the B580 comes in plenty of larger 3-fan versions when I could never really get high temps running Furmark on the 2-fan version.

some_random•2h ago
I have to assume things are better to some degree but last I looked at Intel's offering the support was still unacceptably bad. That said, I really hope they can get things to a good state because there needs to be more competition at this price point.
kllrnohj•1h ago
The support is still worse, but you're getting a big discount on the hardware by comparison. So it kinda evens out at this price point, where you're deciding between either having every game run badly or having most, but not all, games run decently.
speed_spread•3h ago
For such insanity, it should be labeled the RTX 5150
gunalx•3h ago
The 50 line is the new gt-30 tier.
dunno7456•3h ago
It will take Nvidia 10 years to release the firmware for the PMU, and then they will cancel it because it's "too old". Just like they did with Pascal, the P520, and other perfectly working hardware that is barely usable to this day.
drcongo•2h ago
I'm still waiting for my DGX Spark. Starting to wonder if Nvidia have hired Musk for their PR to promise things they'll never deliver.
yrcyrc•2h ago
Not following much hardware news. What can I do with this or the Intel Arc? Play games? Run AI workload? Genuine question
LorenDB•1h ago
Both. AI is dependent on available VRAM, so the B580 will run some larger models than the 5050.
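
A rough rule-of-thumb sketch of the VRAM point: quantized model weights scale with parameter count times bytes per weight, plus runtime overhead (the ~30% overhead factor and the model sizes below are assumptions for illustration):

```python
# Rough check of whether a quantized local LLM fits in a card's VRAM.
# The 1.3x overhead factor (KV cache, runtime buffers) is an assumption.
def fits(params_billions: float, bits: int, vram_gb: int,
         overhead: float = 1.3) -> bool:
    weights_gb = params_billions * bits / 8   # e.g. 7B @ 4-bit ~= 3.5 GB
    return weights_gb * overhead <= vram_gb

for vram in (8, 12):                 # RTX 5050 (8 GB) vs Arc B580 (12 GB)
    for params in (7, 8, 13, 14):
        verdict = "fits" if fits(params, bits=4, vram_gb=vram) else "too big"
        print(f"{params}B @ 4-bit on {vram} GB: {verdict}")
# With these assumptions, ~7-8B models fit either card at 4-bit, while
# the 13-14B range only fits the 12 GB card.
```
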
eighthourblink•18m ago
Coming from a 2060 Super, would this be a good upgrade? I don't really play newer high-demand games, but I do enjoy my emulation, and I don't really have any issues with it on the 2060 Super. Ryzen 5 3600X / Linux (of course :))