But if this does happen, it will, in my opinion, be the start of a slow death for the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs. SATA or U.2 vs. M.2, but the very raw tech (chips).
Let's be real, the twitch FPS CoD players aren't going to give that up and play a boring life simulator.
This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
There are a LOT of games that compete with massive-budget AAA games on aggregate: Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, RuneScape: still goin'.
These games have basically no advertising and are still moneymakers. EVE and UO are roughly 20 and 30 years old. Heck, Classic WoW!
I feel like League of Legends has, w.r.t. the Genshin dollars; I honestly haven't checked!
Many gacha titles now offer amazing PC graphics on Nvidia cards compared to mobile.
It will probably be a bigger blow to people who want to run LLMs at home.
It's gonna be ok.
Can you elaborate a little? What, exactly, is your concern here? That you won't have Nvidia as a choice? That AMD will be the only game in town? That the GPU market will move from a duopoly (for gaming specifically) to a monopoly? I have little to go on, but I don't really want to put words in your mouth based on a minimal post.
Not a locked ecosystem console or a streaming service with lag!
I think if Nvidia leaves the market for AI, why wouldn't AMD and Intel, given the memory cartel? Then the DIY market is gone. That kills lots of companies and creators that rely on the gaming market.
It's a doom spiral for a lot of the industry. If gaming is just PlayStation, Switch, and iGPUs, there is a lot less innovation in pushing graphics.
It will kill the hobby.
Separately, do you think they won't try to ingratiate themselves with gamers again once the AI market changes?
Do you not think they are part of the cartel anyway ( and the DIY market exists despite that )?
<< So DIY market is gone.
How? One use case is gone. Granted, not a small one, and one with an odd type of... fervor, but relatively small nonetheless. At best, the DIY market shifts to local inference machines and whatnot. Unless you specifically mean the gaming market...
<< That kills lots of companies and creators that rely on the gaming market.
Markets change all the time. EA is king of the mountain. EA is filing for bankruptcy. Circle of life.
Edit: Also, upon some additional consideration and in the spirit of Christmas, fuck the streamers (aka "creators"). With very, very limited exceptions, they actively drive what is mostly wrong with gaming these days. Fuck em. And that is before we get to the general idiocy they contribute to.
<< It’s a doom spiral for a lot of the industry.
How? For AAA? Good. Fuck em. We have been here before and were all better for it.
<< If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
Am I reading this right? Are AMD and Intel just for consoles?
<< It will kill the hobby.
It is an assertion without any evidence OR a logical cause and effect.
So far, I am not buying it.
The PC was largely ignored for gaming until EGA/VGA cards, alongside AdLib/SoundBlaster, finally became widespread in enough households to warrant the development costs.
If my hobby is ruined and I can’t have fun, I’m going to be an asshole and make everyone else unhappy.
CoD is also huge on Playstation.
Totally niche appeal, yeah right.
Oh, we can only hope!
>This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
Including millions of gamers, but for the better.
Why can’t you let people enjoy their hobby?
But what's most insane is trying to draw any parallels between gaming and these other things - something that was literally engineered to ruin human lives, biologically (hard drugs) or psychologically (gambling). The harm and evil caused by these two industries is incomprehensible (especially the legal parts of them, like alcohol and casino gambling/sports betting/online gambling), and trying to fit gaming in among them both downplays the amount of suffering inflicted by gambling and hard drugs, as well as villainizes normal people - like the hundreds of millions of people who play games in a sane, non-problematic way or indie game devs who make games because they want to express themselves artistically.
Anyways, I gotta log off HN for a while. I can feel my gaming withdrawal kicking in. I've bankrupted myself four times by only spending my money on gaming, and I've been in and out of rehab centres and ERs as I've been slowly destroying my body with gaming in a spiral of deadly addiction. I think I'll have to panhandle and threaten strangers on the street to buy some Steam cards.
Yes.
Or to put it more succinctly, would you want your obituary to lead with your call of duty prowess?
Excellent satire.
Thank you for your consideration.
Not all games need to be that, but Ghost of Tsushima in GBA Pokémon style is not the same game at all. And is it badly designed? I don't think so either. Same for many VR games, which make immersion meaningful in itself.
We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.
The whole point is to convey details of an area you never lived in, of an actual place you never visited.
I'd make the same argument for Half-Life Alyx or BioHazard, the visceral reaction you get from a highly detailed and textured world works at a different level than just "knowing" what you have in front of your eyes.
Your brain is not filling the gaps, it is taking in the art of the creator.
RE 7 Biohazard was made for the PS4! And its VR version and Half-Life Alyx probably do require higher graphical fidelity, as VR games are not exactly the same thing as regular video games.
That might be the fundamental divide: for that category of games I'm more in the VR camp and will settle for 2D only for convenience or availability.
I see it with different expectations than games like Persona or Zelda (or GTA?), which could compete solely on mechanics and much more, and I get the feeling you're comparing it more to those genres?
Biohazard on PS4 was very meh to me; at that level I feel it could get down to Switch graphics to prioritize better game mechanics and more freedom of play. I never loved the clunkiness, as an action game it's pretty frustrating, and the VR game is even worse in gameplay quality. The immersiveness in VR is its only redeeming quality IMHO.
VR, sure: you want a lot of frames on two screens, which requires beef, so visual fidelity on the same GPU will be worse than on a flat screen. Other than that, though, the graphical side of games has flatlined for me.
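That "two screens, lots of frames" point is easy to put numbers on. A back-of-envelope sketch, where the headset panel resolution and refresh rate are assumptions roughly in line with current standalone headsets, not any specific product's spec:

```python
# Rough pixel-throughput comparison: VR headset vs. a 1080p60 monitor.
# The headset resolution and refresh rate are illustrative assumptions.

def mpix_per_s(width: int, height: int, hz: int, screens: int = 1) -> float:
    """Millions of pixels the GPU must shade per second."""
    return width * height * hz * screens / 1e6

monitor = mpix_per_s(1920, 1080, 60)              # typical flat-screen target
headset = mpix_per_s(2064, 2208, 90, screens=2)   # assumed per-eye panel, both eyes

print(f"monitor: {monitor:.0f} Mpix/s")
print(f"headset: {headset:.0f} Mpix/s ({headset / monitor:.1f}x)")
```

With these assumed numbers the headset needs roughly 6-7x the pixel throughput of a 1080p60 target, before any reprojection headroom, which is why the same GPU delivers visibly lower per-pixel fidelity in VR.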
Also, putting the money literally anywhere else is gonna have better results game-quality-wise. I want better stories and more complex/interesting systems, not a few more animated hairs.
My son is using that card today, and I'm amazed at everything it can still power. I had a 5080, and just comparing a few games, I found that if he used Super Resolution correctly, he could set the other game settings the same as mine and his frame rate wasn't far off (in things like Fortnite, not Cyberpunk 2077).
There are many caveats there, of course. AMD's biggest problem is the drivers/implementation for that card. Unlike Nvidia's similar technology, it requires setting the game to a lower resolution, which it then "fixes", and it tends to produce artifacts depending on the game and how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.
The other cool thing is they also have Frame Gen available in the driver to apply to any game, unlike DLSS FG, which only works in a few games. You can toggle it on in the AMD software just below the Super Res option. I quickly tried it in a few games, and it worked great if you're already getting 60+ FPS, no noticeable artifacts. Going from 30 to 60 doesn't work, though: too many artifacts. And the extra FPS are only visible in the AMD software's FPS counter overlay, not in other FPS counter overlays.
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
Intel is just plain not capable of it because it's not really a GPU, more a framebuffer with a clever blitter.
NVIDIA, like everyone else on a bleeding-edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores, but some compute units are faulty. You fuse them off, and now the chip can be sold as a cut-down gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
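The binning argument can be made concrete with a toy defect-yield model. This is a sketch: the Poisson defect assumption is a textbook simplification, and the die area, defect density, and salvage threshold below are illustrative guesses, not NVIDIA's actual numbers.

```python
# Toy Poisson yield model, illustrating why salvaging partially
# defective dies ("binning") matters so much for large chips.
import math

def perfect_die_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Probability a die has zero defects under a Poisson model."""
    return math.exp(-area_cm2 * defects_per_cm2)

def salvage_yield(area_cm2: float, defects_per_cm2: float,
                  max_defects: int = 2) -> float:
    """Probability a die has at most `max_defects` defects, i.e. can
    still be sold with a few faulty units fused off."""
    lam = area_cm2 * defects_per_cm2
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(max_defects + 1))

big_die = 8.0   # ~800 mm^2, roughly datacenter-GPU class (assumption)
d0 = 0.2        # defects per cm^2, an assumed process defect density

print(f"perfect dies:  {perfect_die_yield(big_die, d0):.1%}")
print(f"sellable dies: {salvage_yield(big_die, d0):.1%}")
```

With these assumed numbers, only about a fifth of large dies come out flawless, but roughly three quarters become sellable once a couple of faulty units can be fused off: that gap is the revenue that would be thrown away.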
B200 doesn't have any graphics capabilities. The datacenter chips don't have any graphical units, it's just wasted die space.
As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable for NVIDIA.
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother to optimize for the low end anymore, and thus end up gatekeeping games and excluding millions of devices, because recent games require a discrete GPU even at the lowest settings.
Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.
Not because the developers were lazy, but because newer GPUs were that much better.
If you think that the programmers are unmotivated (lazy) or incompetent, you're wrong on both counts.
The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
The issue is that games have such high expectations that they didn’t have before.
There are very few "yearly titles" that allow you to nail down the software in a nicer way over time. It's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN and where unit/integration tests would be completely useless the minute they were built.
The industry will end, but not because of "lazy devs". It's the ballooned expectations, stagnant costs, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.
—-
Finally, back in the early days of games, if the game didn't work, you assumed you needed better hardware, and you would put in the work fixing drivers and settings, or even upgrading to something that worked. Now, if it doesn't work on something from before COVID, the consensus is that it's not optimised enough. I'm not casting aspersions at either mindset, but it's a different mentality.
For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.
Perhaps, but they also turned off Nanite, Lumen and virtual shadow maps. I'm not a UE5 hater but using its main features does currently come at a cost. I think these issues will eventually be fixed in newer versions and with better hardware, and at that point Nanite and VSM will become a no-brainer as they do solve real problems in game development.
Or even before hitting the shelves, cue Trio3D and Mystique, but that's another story.
DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization would be the norm, not the exception.
This is an insane thing to say.
> Game and engine devs simply don't bother anymore to optimize for the low end
All games carefully consider the total addressable market. You can build a low-end game that runs great on total ass garbage onboard GPUs. Suffice to say these gamers are not an audience that spends a lot of money on games.
It’s totally fine and good to build premium content that requires premium hardware.
It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.
If Nvidia gamer GPUs disappear and devs were forced to build games that are capable of running on shit ass hardware the net benefit to gamers would be very minimal.
What would actually benefit gamers is making good hardware available at an affordable price!
Everything about your comment screams “tall poppy syndrome”. </rant>
(The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)
I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people, with the top end a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship, really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
It's not inconceivable that the overall result is a better computing ecosystem in the long run, especially in the open source space, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter. But unless gamers stop being willing to throw large amounts of money chasing the top end, someone will want that money even if Nvidia doesn't.
> In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for ultralow-end may or may not have any benefit to high end. This isn’t like optimizing CPU instruction count that benefits everyone.
Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?
They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.
Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.
Game streaming works well for puzzle, story-esque games where latency isn't an issue.
GeForce NOW and Xbox Cloud are much more sensible projects to look at/evaluate than Stadia.
Any game that requires high APM (Actions Per Minute) will be horrible to play via streaming.
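A rough input-to-photon latency budget shows why. All the millisecond values below are illustrative assumptions in commonly cited ranges, not measurements of any particular streaming service:

```python
# Rough end-to-end latency budget: local rendering vs. game streaming.
# Every millisecond value here is an illustrative assumption.

local_ms = {
    "input sampling": 8,
    "render (60 fps)": 16,
    "display scanout": 8,
}

# Extra stages a streamed frame passes through on top of the local path.
streaming_extra_ms = {
    "video encode": 8,
    "network RTT": 30,
    "jitter buffer": 10,
    "video decode": 8,
}

local_total = sum(local_ms.values())
streamed_total = local_total + sum(streaming_extra_ms.values())

print(f"local:    {local_total} ms")
print(f"streamed: {streamed_total} ms")
```

Under these assumptions streaming nearly triples end-to-end latency, which is survivable in a puzzle game and ruinous in a high-APM shooter.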
The economics of it also have issues, as now you have to run a bunch more datacenters full of GPUs, with an inconsistent usage curve leaving a bunch of them idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
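The utilization problem can be sketched with back-of-envelope math. Everything here, the hardware cost, amortization period, and power cost, is an assumption for illustration, not real datacenter pricing:

```python
# Back-of-envelope cost per *paid* GPU-hour at different utilization
# levels, illustrating the idle-capacity problem. All numbers assumed.

GPU_COST = 1500.0         # hardware cost per GPU, USD (assumption)
AMORT_MONTHS = 36         # amortization period (assumption)
POWER_COST_PER_HR = 0.05  # electricity + cooling per active GPU-hour (assumption)

def monthly_cost(utilization: float) -> float:
    """Cost per paid GPU-hour at a given utilization (0..1)."""
    hours = 24 * 30
    fixed = GPU_COST / AMORT_MONTHS            # amortized hardware, paid regardless
    power = POWER_COST_PER_HR * hours * utilization
    return (fixed + power) / (hours * utilization)

for u in (0.15, 0.40, 0.80):
    print(f"utilization {u:.0%}: ${monthly_cost(u):.2f} per paid hour")
```

Low utilization sharply raises the cost of each paid hour, since the amortized hardware cost is owed whether or not anyone is playing, which is why an inconsistent demand curve pushes providers toward subscriptions.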
Not that it's good or bad, though, but we could probably have something akin to spot instances of GPUs being offered for gaming purposes.
I do see a lot of companies offering GPU access billed per second, with instant shutdown/restart, I suppose, but overall I agree.
My brother recently came for the holidays, and I played PS5 for the first time, streamed to his Mac from his place 70-100 km away. Honestly, the biggest latency factor was the Wi-Fi connection (which was his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)
That way they will not only burn the most goodwill but will also get themselves entangled even more in the AI bubble - hopefully enough to go down with it.
It would still suck if they left the market, because who does AMD have to compete with? Intel? LOL
Increased prices for everyone. Lovely. I can’t despise AI enough.
I am 100% sure AMD would have done the exact same thing as NVIDIA does right now, given the chance.
Are you saying they wouldn't have milked the market to the last drop? Do you really believe it?
If it were up to them, CUDA would be a money-losing initiative that was killed in 2009.
Well, actually it's that the AI business made Nvidia 10x bigger. Nvidia now has a market cap of $4.4 trillion. That's many times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.
What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.
I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.
It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.
Qualcomm, before they made all the chips they do today, ran a pretty popular and successful email client called Eudora.
Doing one thing well can lead to doing bigger things well.
More realistically, if the top-end chips go toward the most demanding work, there might be more than enough lower-grade silicon to easily keep the gaming world going.
Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today; they may have more light to shine in the future. Where gaming and AI come together, whether in completely, actually immersive worlds or elsewhere, is pretty interesting.
Update: Adding https://en.wikipedia.org/wiki/Eudora_(email_client)
Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.
Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.
PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered together unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.
I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.
I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.
What it also feels like to me is that there's now more friction in the already really competitive, high-friction business of building a cloud.
With increasing RAM prices, which (from what I know) would only decrease in 2027-2028 or when this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.
When I mention cloud providers, I don't mean the trifecta of AWS, Azure, and GCP, but rather all the other providers who bought their own hardware, co-locate it in a datacenter, and sell services targeted at low/mid-range VPS/VDS servers.
I had previously thought about building a cloud, but in this economy and the current situation, I'd much rather wait.
The best bet right now for most people building a cloud / providing such services is probably white-labeling another brand and providing services on top that make you special.
Servers are still rather cheap, but the mood I see among providers right now is that they're willing to absorb the costs for a while to avoid creating a frenzy (so prices stay low). They are cautiously watching the whole situation, and if recent developments keep going this way, I wouldn't be surprised if server providers raise prices, because the underlying hardware's RAM prices have increased too.
The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.
But perhaps then the US will reply with tariffs on PC parts (or even ban them!), which is slowly becoming the norm for US economic policy, and which won't reverse even after Trump.
I predict that the "PC" is going to be slowly but surely eaten bottom-up by increasingly powerful SoCs.
Moore Threads in China just announced a new GPU.[1] Announced, not shipped.
[1] https://wccftech.com/moore-threads-lushan-gaming-huashan-ai-...
That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.
I just upgraded to a 9070 XT to play ARC Raiders, and it's absolutely a feast for the eyes while also pioneering on several fronts, especially around bot movement and intelligence.
Have you seen the GTA VI trailer?
Frankly, the graphics chops are plenty strong for a decade of excellent games. The big push in the next couple of decades will probably be AI-generated content to make games bigger, more detailed, and more immersive.
Overall, I think AMD is more focused and energetic than its competitors right now. They are very close to overtaking Intel in the CPU race, in both the datacenter and consumer segments, and Nvidia might be next within 5 years, depending on how the AI bubble develops.
My general impression is that US technology companies either take competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their lunch.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Lab, and the mini PC market where Chinese brands dominate.
Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.