You can easily feel the additional latency in competitive FPS or high-speed arcade racing games.
And generated frames are far worse than that. If you're running at a very high base framerate (100+) they can look OK, but the moment the real frames get any further apart, the visual quality starts to tank.
Fake frames are cool tech, but they are horribly mismarketed, indistinguishable from a scam.
I guess I can see some utility in situations where latency is not a major factor, but IMHO that rules out most gaming.
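Back-of-the-envelope numbers for why the base framerate matters so much (my own sketch; it assumes simple interpolation where the next real frame has to be buffered before the in-between frame can be shown, which is roughly where the extra delay comes from):

    # Rough latency math for frame interpolation. Assumption: the generated frame
    # sits between two real frames, so the *next* real frame must be buffered
    # before either can be displayed -- roughly one extra real-frame time of delay.
    def frame_gen_numbers(base_fps, multiplier=2):
        real_frame_ms = 1000.0 / base_fps    # gap between real frames
        counter_fps = base_fps * multiplier  # what the FPS counter will show
        extra_latency_ms = real_frame_ms     # ballpark cost of the buffering
        print(f"{base_fps:>3} real FPS -> counter shows {counter_fps}, "
              f"~{extra_latency_ms:.0f} ms added latency, "
              f"{real_frame_ms:.0f} ms between the real frames being blended")

    for fps in (30, 60, 120):
        frame_gen_numbers(fps)
    # 30 real FPS -> counter shows 60, ~33 ms added latency, 33 ms between the real frames being blended
    # 120 real FPS -> counter shows 240, ~8 ms added latency, 8 ms between the real frames being blended

At 120 FPS the penalty is a few milliseconds and the blended frames are close together; at 30 FPS you pay an extra ~33 ms and the interpolator has to bridge a much bigger visual gap.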
> x50-class GeForce GPUs are among the most popular in the world, second only to the x60-class on Steam. Their price point and power profile are especially popular:
> For anyone upgrading an older x50-class system
> Each GeForce RTX 5050 graphics card is powered by a single PCIe 8-pin cable, drawing a maximum of 130 Watts at stock speeds, making it great for systems with power supplies delivering as little as 550 Watts.
The 1050, 2050 and 3050 were all bus-powered cards. I doubt 95% of these systems even have the cable coming from their power supply. Imagine all the poor saps who excitedly swap out their old card for this, and... nothing works.
Source link: https://www.nvidia.com/en-us/geforce/news/rtx-5050-desktop-g...
(Full disclosure, I have a 9070 XT with a 12-pin connector. No fires yet, though.)
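For what it's worth, the quoted power figures do add up under the usual PCIe limits (roughly 75 W from the x16 slot and 150 W from a single 8-pin cable):

    # Rough power budget for the quoted 130 W card, using standard PCIe limits.
    SLOT_W = 75        # what a PCIe x16 slot is specced to supply
    EIGHT_PIN_W = 150  # what a single 8-pin PCIe power cable is specced to supply

    CARD_W = 130
    print(CARD_W <= SLOT_W)                # False: too much for slot power alone,
                                           # hence the mandatory 8-pin cable
    print(CARD_W <= SLOT_W + EIGHT_PIN_W)  # True: slot + one 8-pin covers it with headroom

The catch, per the above, is whether the upgraders actually have that cable in the first place.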
I personally think people remember being happy with the 750ti and just keep buying those cards.
I've got a ~2006 380 W power supply hanging out near my desk, and it's got a 6-pin PCIe cable; I really don't think people lack at least that, and certainly not 95% of systems with a PCIe x16 slot.
But it isn't really that uncommon either: I had a Suzuki motorcycle that used a connector with 15-amp pins to carry 30 amps of current on one pin. I eventually concluded the only reason that connector was in the harness at all was to ease assembly, so I cut it out entirely and soldered the junction together.
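The reason running double the rated current through one pin is dicey: contact heating scales with the square of the current. Rough numbers, with a completely made-up contact resistance:

    # I^2 * R heating in a single connector pin. The 5 milliohm contact
    # resistance is a hypothetical illustrative value, not a measurement
    # from that harness.
    CONTACT_RESISTANCE_OHMS = 0.005

    for amps in (15, 30):
        watts = amps ** 2 * CONTACT_RESISTANCE_OHMS
        print(f"{amps} A through one pin -> ~{watts:.1f} W dissipated at the contact")
    # Doubling the current quadruples the heat in that one pin, which is why
    # cutting the connector out and soldering the junction is the safer move.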
https://blog.kronis.dev/blog/what-is-ruining-dual-gpu-setups
https://blog.kronis.dev/blog/more-pc-shenanigans-my-setup-un...
https://blog.kronis.dev/blog/two-intel-arc-gpus-in-one-pc-wo...
When paired with a weaker CPU like a Ryzen 5 4500, the experience won't always be good (despite no monitoring software actually showing the CPU as the bottleneck).
When paired with a better CPU (I got a Ryzen 7 5800X to replace it, eventually with an AIO because the temperatures were too high under full load anyway), either of them is pretty okay.
In a single-GPU setup either of them runs most games okay, without many compatibility or stability issues, even in older indie titles, though I've had some like STALCRAFT: X complain about running on an integrated GPU (the Intel card being detected as one). Most software also works, unless you want to run LLMs locally, where Nvidia has more of an advantage and you'd be going off the beaten path. The most annoying thing I've hit was some instability near the launch of each card; for example, running the B580 with the Boost functionality enabled in Intel's graphics software sometimes crashed Delta Force, but that no longer seems to be an issue.
Temperature and power draw seem fine. Their XeSS upscaling is actually really good (I use it on top of native resolution in War Thunder as fancy AA). Their frame generation feels like it has more latency than FSR but also better quality (that might be subjective), and it isn't supported in that many games in the first place anyway. Their video encoders are pretty nice, but they sometimes get overloaded in intensive games instead of the encoding being prioritized over game framerate (which is stupid). Video editing software like DaVinci Resolve also seems okay.
The games that run badly are typically Unreal Engine 5 titles, such as S.T.A.L.K.E.R. 2 and The Forever Winter, which use expensive rendering techniques; to get at least 30 FPS you have to turn the graphics way down, to the point where the games still run like crap and end up looking worse than something from 5 years ago. Those were even worse on the A series cards, but with the B series they become at least barely playable.
In a dual-GPU setup, nothing works that well, neither in Windows 11 nor Windows 10, and neither with the A580 + B580 nor with my old RX 580 + B580: system instability, some games ignoring the Intel GPU preference when an AMD one is available, low framerates when a video is playing on a secondary monitor (I have 4 in total), and the inability to play games on the B580 while encoding on the A580, because either OBS or the hardware lacks proper support for that (e.g. you can't pick which GPU to encode on, like you can with Nvidia cards; my attempts at patching OBS to do it failed because I couldn't get a video frame from one GPU to the other). I moved back to running just the B580 in my PC.
For MSRP, I'd say the Intel Arc B580 is actually a good option, perhaps better than all the A series cards. But the more expensive it gets, the more attractive the alternatives from AMD and Nvidia become. Personally I wouldn't get an A770 unless I needed the VRAM or the price was really good.
Also, I'm not sure why the A580 needed two 8-pin connectors if it never drew that much power, or why the B580 comes in plenty of larger 3-fan versions when I could never really get high temperatures running FurMark on the 2-fan version.
bigyabai•5h ago
I agree that it's not super appealing, but Team Green has to hit the low price points somehow. This feels more like a hedged bet against Intel trying to muscle their way back into the budget market.
ryao•4h ago
https://www.techpowerup.com/gpu-specs/radeon-rx-9070-xt.c422...
Intel does too.
cogman10•5h ago
That's AI frame hallucination, which the 5050 has.
Without DLSS, the numbers from independent reviewers have basically been on par with the previous generation (about a 10% increase in performance).
sidewndr46•5h ago
It's also with DLSS on, so you could just as easily have the framerate be 100 FPS, 1000 FPS, or 10000 FPS. The GPU doesn't actually have to render the frame in that case; it just has to have a pixel buffer ready to offload to whatever hardware sends it over the link to the display. Apparently some people actually really like this, but it isn't rendering by any reasonable definition.
pama•4h ago
https://www.nvidia.com/en-us/geforce/news/rtx-5050-desktop-g...
cornstalks•4h ago
It's about the RTX 5070, but the criticisms hold for the RTX 5050 too, since Nvidia is doing the same shenanigans.