But if this does happen, it will, in my opinion, be the start of a slow death for the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs SATA or U.2 vs M.2, but the raw tech itself (the chips).
Let's be real, the twitch FPS CoD players aren't going to give that up and play a boring life simulator.
This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
There are a LOT of games that compete with massive-budget AAA games in aggregate: Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, RuneScape: still goin'.
These games have like no advertising and are still moneymakers. Eve and UO are like 20 and 30 years old. Heck, Classic WoW!
I feel like League of Legends has, w.r.t. the Genshin dollars, but I honestly haven't checked!
Many gacha titles now offer amazing PC graphics on Nvidia cards compared to mobile.
It will probably be a bigger blow to people who want to run LLMs at home.
It's gonna be OK.
Can you elaborate a little? What, exactly, is your concern here? That you won't have Nvidia as a choice? That AMD will be the only game in town? That the GPU market will move from a duopoly (for gaming specifically) to a monopoly? I have little to go on, but I don't really want to put words in your mouth based on a minimal post.
Not a locked ecosystem console or a streaming service with lag!
I think if Nvidia leaves the market for AI, why wouldn't AMD and Intel follow, along with the memory cartel? So DIY market is gone. That kills lots of companies and creators that rely on the gaming market.
It’s a doom spiral for a lot of the industry. If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
It will kill the hobby.
Separately, do you think they won't try to ingratiate themselves to gamers again once AI market changes?
Do you not think they are part of the cartel anyway ( and the DIY market exists despite that )?
<< So DIY market is gone.
How? One use case is gone. Granted, not a small one and one with an odd type of.. fervor, but relatively small nonetheless. At best, DIY market shifts to local inference machines and whatnot. Unless you specifically refer to gaming market..
<< That kills lots of companies and creators that rely on the gaming market.
Markets change all the time. One day EA is king of the mountain; the next, EA is filing for bankruptcy. Circle of life.
Edit: Also, upon some additional consideration and in the spirit of Christmas, fuck the streamers (aka creators). With very, very limited exceptions, they actively drive what is mostly wrong with gaming these days. Fuck em. And that is before we get to the general brain rot they contribute to.
<< It’s a doom spiral for a lot of the industry.
How? For AAA? Good. Fuck em. We have been here before and were all better for it.
<< If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
Am I reading it right? AMD and Intel are just for consoles?
<< It will kill the hobby.
It is an assertion without any evidence OR a logical cause and effect.
So far, I am not buying it.
Firmly in old-guy “this content should not exist” camp
The PC was largely ignored for gaming until EGA/VGA cards, alongside AdLib/Sound Blaster, finally became widespread in enough households to warrant the development costs.
That cartel is simply flying high right now, convinced they have the market by the balls.
They don't. Just give it most of 2026 and you'll see.
If my hobby is ruined and I can’t have fun, I’m going to be an asshole and make everyone else unhappy.
CoD is also huge on Playstation.
Totally niche appeal, yeah right.
Oh, we can only hope!
>This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
Including millions of gamers, but for the better.
Why can’t you let people enjoy their hobby?
But what's most insane is trying to draw any parallels between gaming and these other things - something that was literally engineered to ruin human lives, biologically (hard drugs) or psychologically (gambling). The harm and evil caused by these two industries is incomprehensible (especially the legal parts of them, like alcohol and casino gambling/sports betting/online gambling), and trying to fit gaming in among them both downplays the amount of suffering inflicted by gambling and hard drugs, as well as villainizes normal people - like the hundreds of millions of people who play games in a sane, non-problematic way or indie game devs who make games because they want to express themselves artistically.
Anyways, I gotta log off HN for a while. I can feel my gaming withdrawal kicking in. I've bankrupted myself four times by only spending my money on gaming, and I've been in and out of rehab centres and ERs as I've been slowly destroying my body with gaming in a spiral of deadly addiction. I think I'll have to panhandle and threaten strangers on the street to buy some Steam cards.
Yes.
Or to put it more succinctly, would you want your obituary to lead with your Call of Duty prowess?
Excellent satire.
Thank you for your consideration.
Not all games need to be that, but Ghost of Tsushima in GBA Pokémon style is not the same game at all. And is it badly designed? I also don't think so. Same for many VR games, which make immersion meaningful in itself.
We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.
The whole point is to convey details of an area you never lived in, of an actual place you never visited.
I'd make the same argument for Half-Life Alyx or BioHazard, the visceral reaction you get from a highly detailed and textured world works at a different level than just "knowing" what you have in front of your eyes.
Your brain is not filling the gaps, it is taking in the art of the creator.
RE 7 Biohazard was made for the PS4! And its VR version and Half-Life Alyx probably do require higher graphical fidelity, as VR games are not exactly the same thing as regular video games.
That might be the fundamental divide, for that category of games I'm more on the VR camp and will settle for 2D only for convenience or availability.
I see it with different expectations than games like Persona or Zelda (or GTA?) which could compete solely on the mechanics and a lot more, and I get the feeling you're comparing it more to these genres?
Biohazard on PS4 was very meh to me; at that level I feel it could get down to Switch graphics to prioritize better game mechanics and more freedom of play. I never loved the clunkiness, as an action game it's pretty frustrating, and the VR game is even worse in gameplay quality. The immersiveness in VR is the only redeeming quality IMHO.
VR, sure: you want a lot of frames on two screens, and that requires beef, so visual fidelity on the same GPU will be worse than on a flat screen. But other than that, if anything, the graphical side of games has flatlined for me.
Also, putting the money literally anywhere else is gonna have better results game-quality-wise. I want better stories and more complex/interesting systems, not a few more animated hairs.
To note, cost and hardware availability are, I think, one half of the critical reasons people don't get into VR (the other half being the bulkiness and the puke?). In a roundabout way, GPU-melting games helped get better hardware at mainstream prices. Until crypto and AI happened. And now the Steam Frame is faced with the RAM price situation.
> 5 years
I don't play it, but Infinity Nikki comes to mind, and the visuals are the core experience. I wonder how much a game like Arknights: Endfield taxes player hardware, given they're pushing the 3D modelling side.
I agree with you on the plateauing part, in that the gaming industry seems to have mostly shoved HiDPI into the corner. It costs so much more to produce a game at that visual quality in the first place, and PC makers and benchmarks focus on FHD performance, so the ROI is that much lower on the marketing side.
It kinda makes me sad, like being told "8-bit art is enough for images, we should focus on composition, how many Vermeer or DaVinci like painters do we expect anyway ?"
Me neither, but the recommended requirements on Steam are like... an RTX 2060, so a 6-year-old mid-grade video card. We really don't need more power than we already have to make beautiful games.
> It kinda makes me sad, like being told "8-bit art is enough for images, we should focus on composition, how many Vermeer or DaVinci like painters do we expect anyway ?"
Except it isn't? At this point more power is only really needed if you want to go hardcore into photorealism, and to actually use all that power you need a massive budget just to produce all the assets at the required quality.
It's like saying "if only painters had even smaller brushes, we could get photographic-quality paintings." Does it really make art that much better?
Steam's recommended specs are on the conservative side, usually adapted to play with the average settings.
Pushing the game settings to ultra at 4K gives a mere 100 fps on an RTX 4090: https://www.youtube.com/watch?v=rju22K1lfQY
That's a lot of dedication, but yes, some people will really enjoy the game in its full splendor. Telling them what they need or don't need, or what they should enjoy misses the point of games IMHO (avid players have probably already spent more than a top end gaming PC on the dresses)
> Does it really make art that much better?
Putting a technical limit on what makes or doesn't make art better sounds fundamentally off to me.
I kinda understand your point on diminishing returns, except we haven't even reached a good frame rate at HiDPI for 24~32"-ish screens. And we'll always move to the next level.
"XXX should be enough for everyone" kind of assertions have never panned out well IMHO.
My son is using that card today, and I'm amazed at everything it can still power. I had a 5080, and just comparing a few games, I found that if he used Super Resolution correctly, he could set the other game settings the same as mine and his frame rate wasn't far off (for things like Fortnite, not Cyberpunk 2077).
There are many caveats there, of course. AMD's biggest problem is in the drivers/implementation for that card. Unlike NVidia's similar technology, it requires setting the game at a lower resolution which it then "fixes" and it tends to produce artifacts depending on the game/how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.
The other cool thing is they also have Frame Gen available in the driver to apply to any game, unlike DLSS FG, which only works in a few games. You can toggle it on in the AMD software just below the Super Res option. I quickly tried it in a few games and it worked great if you're already getting 60+ FPS, with no noticeable artifacts. Going from 30 => 60 doesn't work, though: too many artifacts. And the extra FPS are only visible in the AMD software's FPS counter overlay, not in other FPS counter overlays.
I recently got an Asus ROG Flow Z13 gaming "tablet" with the AMD Strix Halo APU. It has a great CPU + shared RAM + a ridiculously powerful iGPU. It doesn't have the brute power of my previous desktop with a 4090, but this thing can handle the same games at 4K with upscaling on high settings (no ray tracing); it's shockingly capable for its compact form factor.
Don't get too worried. People still can and do vote with their wallets. An additional vector of attack against the greedy capitalists is the fact that the economy is not doing great either.
They cannot increase prices too much.
I also predict that the DDR5 RAM price hikes will not last until 2027 or even 2028 as many others think. I give it maximum one year, I'd even think the prices will start slightly coming down during summer 2026.
Reading about and understanding the economy is neat and all, but in the modern age some people forget that the total addressable market is not infinite and that regular customers have relatively tight budgets.
this is true in general
but the barrier to entry for gaming GPUs is massive (hundreds of billions)
Intel has been working at it for close to a decade and now just about has a workable product, at the low end.
By the time those are depleted we'll have a new player.
I'm hoping the Chinese fabs can finally catch up enough to provide a meaningful alternative both for memory and compute. They're more or less the only ones still making consumer grade stuff in lots of other segments, the rest of us just make overpriced low volume products for the highest bidder.
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
Intel is just plain not capable of it because it's not really a GPU, more a framebuffer with a clever blitter.
NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
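To put rough numbers on why die size matters so much, here's a toy Poisson yield model in Python; the defect density is made up purely for illustration, and real foundry yield models and binning rules are far more involved:

    import math

    def die_yield(area_cm2, defects_per_cm2):
        """Probability that a die of the given area has zero defects (simple Poisson model)."""
        return math.exp(-defects_per_cm2 * area_cm2)

    D0 = 0.1  # assumed defects per cm^2, illustrative only

    print(f"~100 mm^2 die: {die_yield(1.0, D0):.0%} defect-free")  # ~90%
    print(f"~800 mm^2 die: {die_yield(8.0, D0):.0%} defect-free")  # ~45%

Under those made-up numbers, roughly half of the big dies come out with at least one defect, and selling them as cut-down parts beats scrapping them.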
The B200 doesn't have any graphics capabilities. The datacenter chips don't have graphics units; that would just be wasted die space.
As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable for NVIDIA.
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother anymore to optimize for the low end and thus they end up gatekeeping games and excluding millions of devices because for recent games, a discrete GPU is required even for the lowest settings.
Nowadays a game only gets called poorly optimized if it's literally unplayable or laggy; otherwise you're just forced to constantly upgrade your hardware with no discernible performance gain.
Not because the developers were lazy, but because newer GPUs were that much better.
If you think that the programmers are unmotivated (lazy) or incompetent, you're wrong on both counts.
The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
The issue is that games have such high expectations that they didn’t have before.
There are very few “yearly titles” that allow you to nail down the software in a nicer way over time; it's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN and where unit/integration tests would be completely useless the minute they were built.
The industry will end, but not because of “lazy devs”; it's the ballooned expectations, stagnant revenue opportunity, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.
—-
Finally, back in the early days of games, if the game didn’t work, you assumed you needed better hardware and you would put the work in fixing drivers and settings or even upgrading to something that worked. Now if it doesn’t work on something from before COVID the consensus is that it is not optimised enough. I’m not casting aspersions at the mindset, but it’s a different mentality.
And a friend of mine still mostly plays the goddamn Ultima Online, the game that was released 28 years ago.
Your expectations of that game are set appropriately. Same with a lot of indie games, where the expectation can be that it's in early access for a decade+. You would never accept that from, say, Ubisoft.
I fully agree, and I really admire people working in the industry. When I see great games which are unplayable on the low end because of stupidly high minimum hardware requirements, I understand game devs are simply responding to internal trends within the industry, and especially going for a practical outcome by using an established game engine (such as Unreal 5).
But at some point I hope this GPU crunch forces this same industry to allocate time and resources, either at the engine or at the game level, to truly optimize for a realistic low end.
I don’t think any company that has given up their internal engine could invest 7 years of effort without even having revenue from a game to show for it.
So the industry will likely rally around Unreal and Unity, and I think a handful of the major players will release their engines on license... but Unreal will eat them alive due to its investments in Dev UX (which is much, much higher than in proprietary game engines IME). Otherwise the only engines that can really innovate are gated behind AAA publishers and their push for revenue (against investment for any other purpose).
All this to say, I'm sorry to disappoint you, it's very unlikely.
Games will have to get smaller and have better revenues.
But maybe, just maybe, they could request Epic or Unity to optimize their engines better for the lower end.
Optimisation is almost universally about tradeoffs.
If you are a general engine, you can’t easily make those tradeoffs, and worse you have to build guardrails and tooling for many cases, slowing things down further.
The best we can hope for is even better profiling tools from Epic, but they've been doing that for the last couple of years since Borderlands.
No T&L meant everything was culled, clipped, transformed, and per-vertex divided (perspective, lighting) on the CPU.
Then you have the brute force approach. The Voodoo 1/2/3 doesn't employ any obvious speedup tricks in its pipeline. Every single triangle pushed into it is going to get textured (bilinear filtering, per-pixel divide), shaded (lighting, blending, fog applied), and only in the last step does the card finally check the Z-buffer to decide between writing all this computed data to the buffer or simply throwing it away.
It took a while before GPU devs started implementing low-hanging fruit optimizations https://therealmjp.github.io/posts/to-earlyz-or-not-to-early...
Hierarchical-Z, Fast Z clearing, Compressed Z buffer, Compressed Textures, Tiled shading. It all got added slowly one step at a time in early 2000.
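For anyone who hasn't seen it spelled out, here's a toy Python sketch (nothing like real hardware, just the control flow) of why early-Z is such low-hanging fruit compared to the "shade everything, test Z last" approach described above:

    def shade(frag):
        # Stand-in for the expensive work: texturing, filtering, lighting, fog, blending.
        return frag["color"]

    def late_z(frags, zbuf, framebuffer):
        """Voodoo-style: pay for shading every fragment, then maybe throw it away."""
        for f in frags:
            color = shade(f)                    # always paid, even for hidden pixels
            if f["z"] < zbuf[f["xy"]]:
                zbuf[f["xy"]] = f["z"]
                framebuffer[f["xy"]] = color

    def early_z(frags, zbuf, framebuffer):
        """Early-Z: reject occluded fragments before any texturing/shading happens."""
        for f in frags:
            if f["z"] >= zbuf[f["xy"]]:
                continue                        # hidden: skip the expensive work entirely
            zbuf[f["xy"]] = f["z"]
            framebuffer[f["xy"]] = shade(f)

In scenes with heavy overdraw, the second loop skips most of the shading work, which is exactly the win early/hierarchical Z brought in the early 2000s.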
For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.
Perhaps, but they also turned off Nanite, Lumen and virtual shadow maps. I'm not a UE5 hater but using its main features does currently come at a cost. I think these issues will eventually be fixed in newer versions and with better hardware, and at that point Nanite and VSM will become a no-brainer as they do solve real problems in game development.
20 fps is not fine. I would consider that unplayable.
I expect at least 60, ideally 120 or more, as that's where the diminishing returns really start to kick in.
I could tolerate as low as 30 fps on a game that did not require precise aiming or reaction times, which basically eliminates all shooters.
Or even before hitting the shelves, cue the Trio3D and Mystique, but that's another story.
DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization would be the norm, not the exception.
I don't think it has ever been the case that this year's AAA games play well on last year's video cards.
This is an insane thing to say.
> Game and engine devs simply don't bother anymore to optimize for the low end
All games carefully consider the total addressable market. You can build a low end game that runs great on total ass garbage onboard GPU. Suffice to say these gamers are not an audience that spend a lot of money on games.
It’s totally fine and good to build premium content that requires premium hardware.
It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.
If Nvidia gamer GPUs disappear and devs were forced to build games that are capable of running on shit ass hardware the net benefit to gamers would be very minimal.
What would actually benefit gamers is making good hardware available at an affordable price!
Everything about your comment screams “tall poppy syndrome”. </rant>
(The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)
But solitaire ran on a 486, and I don't see what part of the gameplay requires a massive CPU.
But the framebuffer for a full HD screen would fill most of the memory of a typical 486 computer, I think.
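Quick back-of-the-envelope (full HD is obviously anachronistic for a 486, which is kind of the point):

    width, height, bytes_per_pixel = 1920, 1080, 4        # 32-bit color, assumed
    framebuffer_mib = width * height * bytes_per_pixel / 2**20
    print(f"{framebuffer_mib:.1f} MiB per buffer")         # ~7.9 MiB; ~16 MiB double-buffered
    # A typical 486 shipped with something like 4-16 MB of RAM in total.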
I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But also, feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
It's not inconceivable that the overall result is a better computing ecosystem in the long run. The open source space in particular, where Nvidia has long been problematic. Or maybe it'll be a multi decade gaming winter, but unless gamers stop being willing to throw large amounts of money chasing the top end, someone will want that money even if Nvidia didn't.
> In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for ultralow-end may or may not have any benefit to high end. This isn’t like optimizing CPU instruction count that benefits everyone.
Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?
They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.
Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.
Game streaming works well for puzzle, story-esque games where latency isn't an issue.
GeForce NOW and Xbox Cloud are much more sensible projects to look at/evaluate than Stadia.
Any game that requires high APM (Actions Per Minute) will be horrible to play via streaming.
I feel as if I shouldn't really need to explain this on this site, because it should be blindingly obvious that this will always be an issue with any streamed game, for the same reason there is a several-second lag between what's happening at a live sports event and what you see on the screen.
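To make that concrete, a rough latency budget; every number below is an assumption and will swing wildly with the network and hardware, but the point is that streaming always stacks encode, transport, and decode on top of whatever the game itself already costs:

    # Illustrative milliseconds only, not measurements.
    local_ms = {"input + game sim": 20, "render + display": 15}
    streamed_ms = {
        "input upload": 10,
        "game sim + render (server)": 25,
        "encode": 5,
        "video over network": 20,
        "decode + display": 15,
    }
    print("local:   ", sum(local_ms.values()), "ms")      # ~35 ms
    print("streamed:", sum(streamed_ms.values()), "ms")   # ~75 ms, best case on a good link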
The economics of it also have issues, as now you have to run a bunch more datacenters full of GPUs, and with an inconsistent usage curve leaving a bunch of them being left idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
Not that it's good or bad, though, but we could probably have something more akin to spot instances of GPUs being offered for gaming purposes.
I do see a lot of companies offering GPU access billed per second, with instant shutdown/restart, I suppose, but overall I agree.
My brother recently came for the holidays and I played PS5 for the first time on his Mac, connected to his room 70-100 km away, and honestly the biggest latency factor was the Wi-Fi connection (which was his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)
And I meant that I think the PS5 can run far away and you can still connect to it from your laptop, and even connect a controller to the laptop (as my brother did) to play with a controller; the client runs on the Mac, but the game runs on the PS5 itself.
All in all, I found it really cool for what it's worth.
Wait, are you sure you don't have that backward? IIUC, you don't[*] notice the difference between a 2K display and a 4K display until you get up to larger screen sizes (say 60+ inches, give or take a dozen inches; I don't have exact numbers :) ), and with those the optimal viewing range is like 4-8 feet away (depending on the screen size).
Either that or am I missing something...
[*] Generally, anyway. A 4K resolution should definitely be visible at 1-2 feet away as noticeably crisper, but only slightly.
I didn't use it for gaming though, and I've "downgraded" resolution to 2x 1440p (and much higher refresh rates) since then. But more pixels is great if you can afford it.
It's one thing to say you don't need higher resolution and fewer pixels works fine, but all the people in the comments acting like you can't see the difference makes me wonder if they've ever seen a 4K TV before.
For example, I have two 1920x1080 monitors, but one is 160 Hz and the other is only 60 Hz, and the difference is night and day between them.
I have a 77" S95D and my 1080p Switch looked horrible. Try it also with a 1080p screen bigger than 27 inches.
Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?
If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But for most people, with their eyes only being 2-3 feet away from their monitor, the difference is absolutely noticeable.
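A quick way to sanity-check the distance argument is pixels per degree of visual angle; ~60 ppd is the commonly quoted 20/20 threshold (itself a simplification). A rough sketch with assumed screen sizes and distances:

    import math

    def pixels_per_degree(h_res, screen_width_in, distance_in):
        """Horizontal pixels packed into one degree of visual angle at the screen center."""
        ppi = h_res / screen_width_in
        inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    # 27" 16:9 monitor (~23.5" wide) viewed from ~30":
    print(pixels_per_degree(1920, 23.5, 30))   # ~43 ppd -> individual pixels still resolvable
    print(pixels_per_degree(3840, 23.5, 30))   # ~86 ppd -> past typical acuity

    # 65" TV (~56.7" wide) viewed from ~10 feet:
    print(pixels_per_degree(1920, 56.7, 120))  # ~71 ppd -> 1080p already near the limit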
> HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
HDR is an absolute game-changer, for sure. Ray-tracing is as well, especially once you learn to notice the artifacts created by shortcuts required to get reflections in raster-based rendering. It's like bad kerning. Something you never noticed before will suddenly stick out like a sore thumb and will bother the hell out of you.
Alternatively, you play modern games with incredibly blurry AA solutions. Try looking at something older from when AA actually worked.
And anyone who knows just a tiny bit of Nvidia's history would know how much investment they have put into gaming and the technology they pioneered.
And like, when have onboard GPUs ever been good? The fact that they're even feasible these days should be praised but you're imagining some past where devs left them behind.
That way they will not only burn the most good will but will also get themselves entangled even more into the AI bubble - hopefully enough to go down with it.
It would still suck if they left the market, because who does AMD have to compete with? Intel? LOL
Increased prices for everyone. Lovely. I can’t despise AI enough.
I am 100% sure AMD would have done the exact same thing as NVIDIA does right now, given the chance.
Are you saying they wouldn't have milked the market to the last drop? Do you really believe it?
If it were up to them, CUDA would be a money-losing initiative that was killed in 2009.
Well, actually it's that the AI business made NVidia 10x bigger. NVidia now has a market cap of $4.4 trillion. That's dozens of times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.
They don't make GPUs!
Well, technically, they do assemble some of them, but almost all of the parts are made by other companies.
What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.
I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.
It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.
I am not saying you are wrong but here in Eastern Europe, while we did suffer the price hikes (and are suffering those of the DDR5 RAM now as well), the impact was minimal. People just holed up, said "the market's crazy right now, let's wait it out", and shrugged. And lo and behold, successfully wait it out they did.
As I mentioned in another comment in this thread, I highly doubt high RAM prices will survive even to 2027. Sure a good amount of stores and suppliers will try to hold on to the prices with a death grip... but many stores and vendors and suppliers hate stale stock. And some other new tech will start coming out. They would not be able to tolerate shelves full of merchandise at prices people don't want to buy at.
They absolutely _will_ budge.
I predict that by July/August 2026 some price decreases will start. They are likely to be small -- no more than 15% IMO -- but they will start.
The current craze of "let's only produce and sell to the big boys" has happened before, happens now, and will happen again. I and many others don't believe the hysteric "the market will never be the same again after" narrative either.
They can act as monopolistic as they want. They can try anything and cackle maniacally at their amazing business acumen all they want.
Turns out, total addressable market is not infinite. Turns out people don't want to spend on RAM as much as they would on a used car. How shocking! And I am completely sure that yet again the MBAs would be unprepared for these grand revelations, like they are, EVERY time.
Still, let us wait and see. Most of us are not in a rush to build a gaming machine or a workstation next month or else puppies will start dying.
I am pretty sure the same people now rubbing their hands and believing they have found eternal gold will come back begging and pleading for our measly customer dollars before too long.
Qualcomm, before they made all the chips they do today, made a pretty popular and successful email client called Eudora.
Doing one thing well can lead to doing bigger things well.
More realistically, if the top end chips go towards the most demanding work, there might be more than enough lower grade silicon that can easily keep the gaming world going.
Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today, and may have some more light to shine in the future. Where we see gaming and AI coming together, whether it's in completely and actually immersive worlds, etc, is pretty interesting.
Update: Adding https://en.wikipedia.org/wiki/Eudora_(email_client)
Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.
Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.
PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered-together, unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.
I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.
I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.
What it also feels like to me is that there's even more friction now in the already really competitive, high-friction business of building a cloud.
With increasing RAM prices, which (from my knowledge) would only decrease in 2027-2028 or when this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.
When I mention cloud providers, I don't mean the trifecta of AWS, Azure, or GCP, but rather all the other providers who bought their own hardware, are co-locating it in a datacenter, and are selling services targeted at low/mid-range VPS/VDS servers.
I had previously thought about building a cloud, but in this economy and the current situation, I'd much rather wait.
The best bet right now for most people building a cloud / providing such services is probably white-labeling another brand and providing services on top that make you special.
The servers are still rather cheap, but the mood I can see among providers right now is that they are willing to absorb the costs for some time so as not to create a frenzy (so they still have low prices), while cautiously watching the whole situation. If recent developments continue this way, I wouldn't be surprised if server providers raise some prices, because the effective underlying hardware's RAM prices increased too.
The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.
I don't exactly think that they did it on purpose to chokehold the supply, but it sure damn happened, and that doesn't change the fact that hardware prices might increase (or already have?).
But perhaps then the US will reply with tariffs on PC parts (or even ban them!), which is slowly becoming the norm for US economic policy, and which won't reverse even after Trump.
Interesting times they would be!
Is it the best? No! Is it the most performant? NO! Is it rock-solid reliable out of the box? YES!
CachyOS and Fedora also looked tempting, but bog-standard Linux Mint is my powerhouse right now.
Except: If I want to kill some time being chaotic in GTA:V Online, and do that in Linux, then that is forbidden. Single player mode works beautifully, but multiplayer is verboten.
(And I'm sure that there are other games that similarly feature very deliberate brokenness, but that's my example.)
I bought my PS5 Pro in anticipation of GTA 6 and the (hopefully) upcoming Dragon Quest 12. May my prayers be answered.
144fps + Mouse + Keyboard is just superior.
Everyone will own a presentation layer device. Anyone who can only afford the 1GB model can only get SNES quality visuals.
Snow Crash and Neuromancer have displaced the Bible as the cognitive framework for the tech rich.
Am working on an app that generates and syncs keys 1:1 over local BT and then syncs them again to the home PC (if desired). The idea being to cut out internet middlemen and go back to something like IRC direct connect, which also requires real-world "touch grass" effort to complicate things for greedy data collectors.
Testing now by sharing IP over Signal and then 1:1'ing over whatever app. Can just scaffold all new protocols on top of TCP/IP again.
The ecosystem isn't closed. TSMC doesn't exist in a vacuum. They may be the most advanced, however, there are a few reasons this will never work:
1) older fabs can be built in a garage by a smart person (it's been done a few times, I'd link the articles, but I don't have them handy)
2) Indie devs exist and often provide better gaming experience than AAA developers.
3) Old hardware/consoles exist, and will continue to exist for many decades to come (my Atari 2600 still works, as an example, and it is older than I am)
Sure, they MAY attempt to grab the market in this way. The attempt will backfire. Companies will go under, possibly including team green if they actually do exit the gaming market (because let's be real, at least in the U.S. a full-blown depression is coming. When? No idea. However, yes, it's coming unless folks vote out the garbage.), and the player that doesn't give in, or possibly a Chinese player that has yet to enter the market, will take over.
Yeah, with 1970s-era feature size. That's fine if your idea of AAA gaming is Hunt The Wumpus or Pong.
So.. a smart phone?
I predict that the "pc" is going to be slowly but surely eaten bottom-up by increasingly powerful SoCs.
Moore Threads in China just announced a new GPU.[1] Announced, not shipped.
[1] https://wccftech.com/moore-threads-lushan-gaming-huashan-ai-...
That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.
I just upgraded to a 9070 XT to play ARC Raiders, and it's absolutely a feast for the eyes while also pioneering on several fronts, especially around the bot movement and intelligence.
Have you seen the GTA VI trailer?
Frankly, the graphics chops are plenty strong for a decade of excellent games. The big push in the next couple decades will probably be AI generated content to make games bigger and more detailed and more immersive
Overall, I think that AMD is more focused and energetic than their competitors now. They are very close to overtaking Intel in their long CPU race, both in the datacenter and consumer segments, and Nvidia might be next in the coming 5 years, depending on how the AI bubble develops.
They’ve become a big name now with AI, but they were never the only game in town in their home markets. They had an edge on the high-end so their name had some prestige, but market share wise it was quite even. Even with AI, they have a temporary head start but I wouldn’t be surprised if they get crowded in the coming years, what they do is not magic.
My general impression is that US technology companies must either treat competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their cake.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Lab, the mini PC market where Chinese brands dominate.
Fuck this future.
Won't personally miss Nvidia, but we need competition in the space to keep prices 'reasonable' (although they haven't been reasonable for some years), and to push for further innovation.
It's possible these datacenter AI GPUs are built so differently from conventional GPUs that they lack the hardware required to draw polygons, like ROPs and texture units. Why waste chip engineering time and silicon die space to support applications that a product isn't designed for? Let me remind you that gaming is a small slice of NVidia's revenue[1], so it makes sense not to use one chip design for everything.
[0] https://www.youtube.com/watch?v=TY4s35uULg4
[1] $4.3 billion out of $57 billion https://nvidianews.nvidia.com/news/nvidia-announces-financia...
You can still have a tax implication when you sell the fully depreciated item but in theory it should only be a benefit unless your company has a 100% marginal tax rate somehow.
Of course, it can cost more to store the goods and administer the sale than you recoup. And the manufacturer may do, or even require, a buyback to prevent the second-hand market from undercutting their sales. Or you may be disinclined to provide cheap hardware to your competitors.
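A toy illustration of that point, with made-up numbers and ignoring jurisdiction-specific depreciation-recapture rules:

    price, salvage, tax_rate = 10_000, 2_000, 0.30   # assumed purchase price, resale value, marginal rate
    gain_on_sale = salvage - 0                       # book value is zero once fully depreciated
    tax_on_sale = gain_on_sale * tax_rate
    net_from_sale = salvage - tax_on_sale
    print(net_from_sale)                             # 1400.0: still positive unless tax_rate >= 100%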
Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.