frontpage.

Fabrice Bellard Releases MicroQuickJS

https://github.com/bellard/mquickjs/blob/main/README.md
447•Aissen•3h ago•149 comments

Terrence Malick's Disciples

https://yalereview.org/article/bilge-ebiri-terrence-malick
25•prismatic•1h ago•1 comment

Meta is using the Linux scheduler designed for Valve's Steam Deck on its servers

https://www.phoronix.com/news/Meta-SCX-LAVD-Steam-Deck-Server
330•yellow_lead•3h ago•153 comments

Volvo Centum is Dalton Maag's new typeface for Volvo

https://www.wallpaper.com/design-interiors/corporate-design-branding/volvo-new-font-volvo-centum
41•ohjeez•2h ago•34 comments

Towards a secure peer-to-peer app platform for Clan

https://clan.lol/blog/towards-app-platform-vmtech/
38•throawayonthe•3h ago•8 comments

Help! My c64 caught on fire

https://c0de517e.com/026_c64fire.htm
13•ibobev•1h ago•0 comments

Adobe Photoshop 1.0 Source Code (1990)

https://computerhistory.org/blog/adobe-photoshop-source-code/
378•tosh•5d ago•109 comments

We replaced H.264 streaming with JPEG screenshots (and it worked better)

https://blog.helix.ml/p/we-mass-deployed-15-year-old-screen
172•quesobob•2h ago•120 comments

Instant database clones with PostgreSQL 18

https://boringsql.com/posts/instant-database-clones/
323•radimm•12h ago•133 comments

Astrophotography Target Planner: Discover Hidden Nebulas

https://astroimagery.com/techniques/imaging/astrophotography-target-planner/
37•kianN•4d ago•3 comments

Executorch: On-device AI across mobile, embedded and edge for PyTorch

https://github.com/pytorch/executorch
96•klaussilveira•5d ago•14 comments

Test, don't just verify

https://alperenkeles.com/posts/test-dont-verify/
156•alpaylan•7h ago•108 comments

An initial analysis of the discovered Unix V4 tape

https://www.spinellis.gr/blog/20251223/?yc261223
41•DSpinellis•2h ago•3 comments

Perfect Software – Software for an Audience of One

https://outofdesk.netlify.app/blog/perfect-software
8•ggauravr•3d ago•2 comments

Local AI is driving the biggest change in laptops in decades

https://spectrum.ieee.org/ai-models-locally
126•barqawiz•20h ago•99 comments

Space Math Academy

https://space-math.academy
18•dynamicwebpaige•3d ago•6 comments

Font with Built-In Syntax Highlighting (2024)

https://blog.glyphdrawing.club/font-with-built-in-syntax-highlighting/
129•california-og•10h ago•27 comments

The post-GeForce era: What if Nvidia abandons PC gaming?

https://www.pcworld.com/article/3013044/the-post-geforce-era-what-if-nvidia-abandons-pc-gaming.html
92•taubek•3d ago•164 comments

Toad is a unified experience for AI in the terminal

https://willmcgugan.github.io/toad-released/
66•nikolatt•1d ago•15 comments

The Coffee Warehouse

https://www.scopeofwork.net/the-coffee-warehouse/
40•NaOH•4d ago•35 comments

10 years bootstrapped: €6.5M revenue with a team of 13

https://www.datocms.com/blog/a-look-back-at-2025
238•steffoz•12h ago•89 comments

Snitch – A friendlier ss/netstat

https://github.com/karol-broda/snitch
295•karol-broda•19h ago•93 comments

iOS 26.3 brings AirPods-like pairing to third-party devices in EU under DMA

https://www.macrumors.com/2025/12/22/ios-26-3-dma-airpods-pairing/
140•Tomte•14h ago•108 comments

Carnap – A formal logic framework for Haskell

https://carnap.io/
95•ravenical•11h ago•20 comments

It's Always TCP_NODELAY

https://brooker.co.za/blog/2024/05/09/nagle.html
438•eieio•23h ago•159 comments

Show HN: CineCLI – Browse and torrent movies directly from your terminal

https://github.com/eyeblech/cinecli
283•samsep10l•15h ago•97 comments

Ryanair fined €256M over ‘abusive strategy’ to limit ticket sales by OTAs

https://www.theguardian.com/business/2025/dec/23/ryanair-fined-limit-online-travel-agencies-ticke...
206•aquir•9h ago•221 comments

Dancing around the rhythm space with Euclid

https://pv.wtf/posts/euclidean-rhythms
26•dracyr•1d ago•0 comments

The Illustrated Transformer

https://jalammar.github.io/illustrated-transformer/
467•auraham•1d ago•85 comments

Stop Slopware

https://stopslopware.net/
92•bradley_taunt•4h ago•116 comments

The post-GeForce era: What if Nvidia abandons PC gaming?

https://www.pcworld.com/article/3013044/the-post-geforce-era-what-if-nvidia-abandons-pc-gaming.html
92•taubek•3d ago

Comments

Wowfunhappy•3d ago
I'm also curious what this could mean for Nintendo.
xattt•3d ago
NVIDIA would still have service contract obligations to fulfil, and would provide support for its existing products for a period of time.

Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.

nicolaslem•3d ago
What goes into a Nintendo console is not prime silicon. When it's time to design the next console, I am sure Nvidia will still be more than happy to give them a design that they have laying around somewhere in a drawer if it means they ship 100M units.
eucryphia•3d ago
More children born?
0dayz•3d ago
It remains to be seen to be fair.

But if this does happen it will be in my opinion the start of a slow death of the democratization of tech.

At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs. SATA or U.2 vs. M.2, but the very raw tech (chips).

butterknife•3d ago
Is it better to go short on them or buy AMD?
internet101010•52m ago
Betting on AMD's continued success in CPUs is far safer than Nvidia's demise.
pjmlp•3d ago
It means people get to enjoy more indie games with good designs, instead of having FOMO for cool graphics without substance.
newsclues•3d ago
It means lots of people will give up the hobby.

Let's be real, the twitch FPS CoD players aren't going to give that up and play a boring life simulator.

This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.

jonway•3d ago
That doesn't seem very plausible. How many people are driven away from Counter-Strike or League of Legends because the graphics aren't as good as Cyberpunk or whatever?

There's a LOT of games that compete with massive-budget AAA games in aggregate, like Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, RuneScape, still goin'

These games have like no advertising and are still moneymakers. Eve and UO are like 20 and 30 years old. Heck, Classic WoW!

deaux•3d ago
I wonder if all the games you named combined surpass what Mihoyo makes off the likes of Genshin Impact.
jonway•2d ago
Dunno (maybe wow?) but is it the most expensive graphics hardware giving AAA all the money/air or because they have a great reputation as games, solid consistent advertising, a strong network effect and a spot on the top of new release lists?

I feel like League of Legends alone has, w.r.t. the Genshin $s, but I honestly haven't checked!

Apocryphon•3h ago
Do gacha mobile games even require high-end graphics? Genshin Impact doesn't support native ray tracing.
re-thc•13m ago
Plenty of titles support ray tracing, e.g. Wuthering Waves.

Many gacha titles now offer amazing PC graphics on Nvidia cards compared to mobile.

baobun•3d ago
PC gaming will be fine even without 8K 120fps raytracing. It will be fine even if limited to iGPUs. Maybe even better off if it means new titles are actually playable on an average new miniPC. More realistically I guess we get an AMD/Intel duopoly looking quite similar instead.

It will probably be a bigger blow to people who want to run LLMs at home.

A4ET8a8uTh0_v2•3d ago
Huh? No? It means that the overall platform is already at a 'good enough' level. There can always be an improvement, but in terms of pure visuals, we are already past the point where some studios choose simple representations ( see some 2d platformers ) as a stylistic choice.

It's gonna be ok.

newsclues•3d ago
My pc is good enough for now but it’s years old and when it dies, then what? You want me to give up gaming and start hanging out at your local bar?
A4ET8a8uTh0_v2•3d ago
It is not a question of want. Gaming will exist in some form so I am simply uncertain what you are concerned about.

Can you elaborate a little? What, exactly, is your concern here? That you won't have nvidia as a choice? That AMD will be the only game in town? That the gpu market will move from a duopoly ( for gaming specifically ) to a monopoly? I have little to go on, but I don't really want to put words in your mouth based on a minimal post.

newsclues•3d ago
I want a local gaming machine that I control.

Not a locked ecosystem console or a streaming service with lag!

I think if Nvidia leaves the market for AI, why wouldn't AMD and Intel, along with the memory cartel? So the DIY market is gone. That kills lots of companies and creators that rely on the gaming market.

It's a doom spiral for a lot of the industry. If gaming is just PlayStation and Switch and iGPUs, there is a lot less innovation in pushing graphics.

It will kill the hobby.

A4ET8a8uTh0_v2•3d ago
Interesting. Does nvidia offer control? Last time I checked they arbitrarily updated their drivers to degrade an unwelcome use case ( in that case, crypto mining ). It sounds to me like the opposite of that.

Separately, do you think they won't try to ingratiate themselves to gamers again once AI market changes?

Do you not think they are part of the cartel anyway ( and the DIY market exists despite that )?

<< So DIY market is gone.

How? One use case is gone. Granted, not a small one and one with an odd type of.. fervor, but relatively small nonetheless. At best, DIY market shifts to local inference machines and whatnot. Unless you specifically refer to gaming market..

<< That kills lots of companies and creators that rely on the gaming market.

Markets change all the time. EA is king of the mountain. EA is filing for bankruptcy. Circle of life.

Edit: ALso, upon some additional consideration and in the spirit of christmas, fuck the streamers ( aka creators ). With very, very limited exceptions, they actively drive what is mostly wrong with gaming these days. Fuck em. And that is before we get to the general retardation they contribute to.

<< It’s a doom spiral for a lot of the industry.

How? For AAA? Good. Fuck em. We have been here before and were all better for it.

<< If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.

Am I reading it right? Are AMD and Intel just for consoles?

<< It will kill the hobby.

It is an assertion without any evidence OR a logical cause and effect.

So far, I am not buying it.

pjmlp•2d ago
There was no DIY market on 8- and 16-bit home computers with fixed hardware, yet bedroom coding (aka indies) not only thrived, it was the genesis of many AAA publishers, and to this day those restrictions keep the Demoscene alive and recognised as world cultural heritage.

The PC was largely ignored for gaming until EGA/VGA cards, alongside AdLib/Sound Blaster, became widespread in enough households to warrant the development costs.

Apocryphon•3h ago
Who cares about "innovation in pushing graphics"? It's arguable that video game graphics reached 'good enough' a couple of console generations ago. Maybe as early as seventh gen.
acheron•3d ago
Why would nvidia not making gaming cards make you “give up gaming”?
pjmlp•3d ago
Let's be real, CoD only appeals to a small community on the whole planet.
newsclues•3d ago
Millions of people.
amanaplanacanal•3d ago
But if it didn't exist, those people would likely be playing something else.
newsclues•3d ago
If no pc hardware exists eventually there will be no games to play. Then you will have a bunch of angry gamers at the park pissing everyone off.

If my hobby is ruined and I can’t have fun, I’m going to be an asshole and make everyone else unhappy.

lolc•3d ago
Ahaha are you trolling for entitled gamers? Yeah wouldn't want the real world having to face those. No worries: as long as there are people willing to drop money into expensive gear, somebody will sell it.
pjmlp•3d ago
There are many more millions of gamers that don't even care CoD exists; it's a small percentage of the world of gaming.
ThrowawayR2•3d ago
CoD has fewer players than Team Fortress 2 currently, according to Valve's charts: https://store.steampowered.com/charts/mostplayed . And TF2 has ancient graphics.
throwaway613745•2h ago
Steam isn’t a good metric because they sell the game on Battle.Net.

CoD is also huge on Playstation.

throwaway613745•2h ago
It’s only one of if not the best selling game every year.

Totally niche appeal, yeah right.

coldtea•3d ago
>It means lots of people will give up the hobby.

Oh, we can only hope!

>This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.

Including millions of gamers, but for the better.

newsclues•3d ago
You hate gamers? Why?

Why can’t you let people enjoy their hobby?

coldtea•3d ago
For the same reason I don't like alcoholism or meth use or gambling or porn addiction, even when the person "enjoys them".
tavavex•3d ago
That's one hell of a long shot. Are your views applicable to the rest of the entertainment industry? There's plenty of people wasting away in front of Netflix, after all. Or why stop at entertainment - any "useless" hobby that's done repeatedly for fun but has no real productive output. Is any comparable "pleasurable" activity that also hooks a minority of people in an unhealthy way bad, or just gaming?

But what's most insane is trying to draw any parallels between gaming and these other things - something that was literally engineered to ruin human lives, biologically (hard drugs) or psychologically (gambling). The harm and evil caused by these two industries is incomprehensible (especially the legal parts of them, like alcohol and casino gambling/sports betting/online gambling), and trying to fit gaming in among them both downplays the amount of suffering inflicted by gambling and hard drugs, as well as villainizes normal people - like the hundreds of millions of people who play games in a sane, non-problematic way or indie game devs who make games because they want to express themselves artistically.

Anyways, I gotta log off HN for a while. I can feel my gaming withdrawal kicking in. I've bankrupted myself four times by only spending my money on gaming, and I've been in and out of rehab centres and ERs as I've been slowly destroying my body with gaming in a spiral of deadly addiction. I think I'll have to panhandle and threaten strangers on the street to buy some Steam cards.

coldtea•1d ago
>That's one hell of a long shot. Are your views applicable to the rest of the entertainment industry?

Yes.

0x1ch•2h ago
You must be the life of the party, if the party was a funeral.
thefaux•32m ago
I had a roommate who failed out of college because he was addicted to Everquest (yes, Everquest, and yes I am middle-aged). Your last paragraph is barely even hyperbolic. Do you think unemployed young men who live at home with their parents, do little to no physical activity, and spend most of their time playing video games and/or trolling on the internet are not stuck destroying their bodies (and minds) in a spiral of deadly addiction? Maybe you are a functional gamer, but there are many, many gamers who are not, and this technology is maybe a quasi-effective cope for our punishing society writ large, but from the outside, gaming addicts appear to be living a sad and limited life.

Or to put it more succinctly, would you want your obituary to lead with your call of duty prowess?

GaryBluto•25m ago
>functional gamer

Excellent satire.

ssl-3•1h ago
I don't like football. Can we add football fans into the mix of people we're punching down on, alongside the gamers and the meth heads?

Thank you for your consideration.

hyghjiyhu•3d ago
Cod devs aren't stupid. They will design a game for the hardware their target market can get their hands on.
ikamm•3d ago
Most CoD players are on console or mobile, not PC
makeitdouble•3h ago
On graphics: there is a threshold where realistic graphics make the difference.

Not all games need to be that, but Ghost of Tsushima in GBA Pokemon style is not the same game at all. And is it badly designed? I don't think so either. Same for many VR games, which make immersion meaningful in itself.

We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.

Apocryphon•3h ago
Sure, but would Ghost of Tsushima be any less immersive with PS4 graphics? Even max PS3 graphics?
makeitdouble•2h ago
Yes.

The whole point is to convey details of an area you never lived in, of an actual place you never visited.

I'd make the same argument for Half-Life Alyx or BioHazard, the visceral reaction you get from a highly detailed and textured world works at a different level than just "knowing" what you have in front of your eyes.

Your brain is not filling the gaps, it is taking in the art of the creator.

Apocryphon•2h ago
Eh, eye of the beholder. It's made all the funnier that Ghost of Tsushima has a Kurosawa Mode that converts all that detail into monochrome.

RE 7 Biohazard was made for the PS4! And its VR version and Half-Life Alyx probably do require higher graphical fidelity, as VR games are not exactly the same thing as regular video games.

makeitdouble•1h ago
> VR games are not exactly the same thing as regular video games.

That might be the fundamental divide: for that category of games I'm more in the VR camp and will settle for 2D only for convenience or availability.

I see it with different expectations than games like Persona or Zelda (or GTA?) which could compete solely on the mechanics and a lot more, and I get the feeling you're comparing it more to these genres?

Biohazard on PS4 was very meh to me; at that level I feel it could get down to Switch graphics to prioritize better game mechanics and more freedom of play. I never loved the clunkiness, as an action game it's pretty frustrating, and the VR game is even worse in gameplay quality. The immersiveness in VR is the only redeeming quality IMHO.

throwaway613745•2h ago
Ghost of Tsushima already is a PS4 game???
makeitdouble•1h ago
Yes. Just to note, I was referring to the PC version, I don't know how much of a difference that makes.
bitwize•1h ago
Daikatana in GBC style turned into a good game, lol.
PunchyHamster•1h ago
I can't name one in the last 5 years that has been "pushing the envelope" in a way that would actually wow me. And the ones that did, did it with art style, not the sheer number of polygons pushed to the screen.

VR, sure, you want a lot of frames on 2 screens, and that requires beef, so visual fidelity on the same GPU will be worse than on a flat screen, but other than that the graphical side of games has flatlined for me.

Also, putting the money literally anywhere else is gonna have better results game-quality-wise. I want better stories and more complex/interesting systems, not a few more animated hairs.

georgefrowny•1h ago
Man, I remember playing UT GOTYE back in the 00s and the graphics blew us away when we fired it up and then Return to Castle Wolfenstein made my brother cry from the "realistic" zombies (on a CRT even!). It's amazing what you can take for granted when even a fraction of a modern card would have been called "photorealistic" back then.
jaapz•3d ago
AMD will be very happy when they do. They are already making great cards; I'm currently running an RX 7800 XT (or something like that), and it's amazing. Linux support is great too.
snvzz•3d ago
RX 7900 GRE, can confirm as much.
mdip•1h ago
Wow, yeah, I picked up one of these a few months before the new generation came out for $350. Everything shot up after that.

My son is using that card today, and I'm amazed at everything it can still power. I had a 5080, and just comparing a few games, I found that if he used Super Resolution correctly, he could set the other game settings the same as mine and his frame rate wasn't far off (in things like Fortnite, not Cyberpunk 2077).

There are many caveats there, of course. AMD's biggest problem is in the drivers/implementation for that card. Unlike NVidia's similar technology, it requires setting the game at a lower resolution which it then "fixes" and it tends to produce artifacts depending on the game/how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.

the_pwner224•58s ago
For games that have FSR built in, you can enable it in the game settings; then it'll only scale up the game content while rendering the HUD at native resolution. And it can use the better upscaling algorithms that rely on internal game-engine data / motion vectors, which should reduce artifacts.

The other cool thing is they also have Frame Gen available in the driver to apply to any game, unlike DLSS FG which only works in a few games. You can toggle it on in the AMD software just below the Super Res option. I quickly tried it in a few games and it worked great if you're already getting 60+ FPS, no noticeable artifacts. Though going from 30=>60 doesn't work, too many artifacts. And the extra FPS are only visible in the AMD software's FPS counter overlay, not in other FPS counter overlays.
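Not FSR itself, but a toy Python/NumPy sketch of the structural difference described above: driver-level upscaling scales the whole low-resolution frame (HUD included), while in-game upscaling scales only the 3D scene and composites the HUD at native resolution afterwards. Nearest-neighbour scaling stands in for the real upscaler, and the resolutions and fake HUD rectangles are arbitrary assumptions.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale; a crude stand-in for a real upscaler like FSR."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

# Pretend 960x540 internal render target and a 2x (1920x1080) output.
scene_low = np.random.randint(0, 256, (540, 960, 3), dtype=np.uint8)
hud_native = np.zeros((1080, 1920, 3), dtype=np.uint8)
hud_native[40:80, 40:400] = 255            # fake HUD element drawn at native res

# Driver-level upscaling: the HUD is part of the low-res frame, so it gets scaled too.
frame_low = scene_low.copy()
frame_low[20:40, 20:200] = 255             # same HUD element, drawn at low res
driver_output = upscale_nearest(frame_low, 2)

# In-game upscaling: only the scene is scaled; the HUD stays native and sharp.
ingame_output = upscale_nearest(scene_low, 2)
mask = hud_native.any(axis=-1)
ingame_output[mask] = hud_native[mask]

assert driver_output.shape == ingame_output.shape == (1080, 1920, 3)
```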

acheron•3d ago
I got an.. AMD (even today I still almost say “ATI” every time) RX6600 XT I think, a couple years ago? It’s been great. I switched over to Linux back in the spring and yes the compatibility has been fine and caused no issues. Still amazed I can run “AAA” games, published by Microsoft even, under Linux.
chasd00•1h ago
My very gaming experienced and data oriented 13 year old wants to switch from Nvidia to AMD. I don’t understand all his reasons/numbers but I suppose that’s as good an endorsement as any for AMDs GPUs.
moffkalast•1h ago
AMD will certainly be very happy to raise prices significantly when they have a de facto monopoly over the market segment, alright.
speedgoose•1h ago
If it’s too expensive, I will play on my phone or my macbook instead of a gaming pc. They can’t increase the prices too much.
keyringlight•45m ago
As with Nvidia, you've got to consider which partner companies AMD likes working with. AMD/Nvidia design chips, contract TSMC to make them, then sell the chips to the likes of ASUS/MSI/Gigabyte/etc. to put on cards the consumer buys. The other market AMD serves is Sony/MS for their consoles, and I'd argue they're a major motivator driving Radeon development, as they pay up front to get custom APU chips, and there's synergy there with Zen and more recently the AI demand. Ever since ATI bought up the company (ArtX) that made the GameCube GPU, it seems to me that the PC side is keeping the motor running in between console contracts as far as gaming demands go; given their low market share they definitely don't seem to prioritize or depend on it to thrive.
snvzz•3d ago
If NVIDIA exits the market, there is still AMD, Intel and PowerVR (Imagination Technologies is back at making discrete PC GPUs, although currently only in China).
ErroneousBosh•3d ago
Unfortunately none of those are any use for video work.
shmeeed•2d ago
Is that due to some kind of issue with the architecture, or just a matter of software support?

In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.

ErroneousBosh•2d ago
Well, they don't support CUDA and I don't see CUDA coming to AMD any time soon.

Intel is just plain not capable of it because it's not really a GPU, more a framebuffer with a clever blitter.

pjmlp•2d ago
Great opportunity to adopt Khronos APIs.
ivanjermakov•3h ago
Isn't OpenCL a widely supported compute platform alternative to CUDA? Not sure if it's a fair comparison.
iszomer•2h ago
ZLUDA?
pdpi•1h ago
Presumably you mean Intel’s integrated GPUs? They do have the Arc line of discrete GPUs now, and those are a bit more than a frame buffer with a clever blitter.
grim_io•3d ago
Shrug and buy the next best thing?
fxtentacle•3d ago
I don’t think they can.

NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.

The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
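For intuition on why defect odds climb so steeply with die size, here is a sketch using the textbook Poisson yield model, Y = exp(-area x defect density). The die areas and defect density below are illustrative assumptions, not Nvidia or TSMC figures.

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Classic Poisson yield model: probability that a die has zero defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001  # assumed defect density (defects per mm^2) -- purely illustrative

for name, area_mm2 in [("small mobile SoC", 100), ("mid-range GPU", 300), ("flagship GPU", 750)]:
    y = poisson_yield(area_mm2, D0)
    print(f"{name} ({area_mm2} mm^2): {y:.1%} of dies defect-free")

# The bigger the die, the larger the share of partially defective chips --
# exactly the inventory that gets salvaged by fusing off the bad units.
```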

jsheard•3d ago
You can't turn a GB200 into a GB202 (which I assume is what you meant since GP102 is from 2016), they are completely different designs. That kind of salvage happens between variants of the same design, for example the RTX Pro 6000 and RTX 5090 both use GB202 in different configurations, and chips which don't make the cut for the former get used for the latter.
cwzwarich•3d ago
Nvidia doesn't share dies between their high-end datacenter products like B200 and consumer products. The high-end consumer dies have many more SMs than a corresponding datacenter die. Each has functionality that the other does not within an SM/TPC, nevermind the very different fabric and memory subsystem (with much higher bandwidth/SM on the datacenter parts). They run at very different clock frequencies. It just wouldn't make sense to share the dies under these constraints, especially when GPUs already present a fairly obvious yield recovery strategy.
woah•2h ago
Why don't they sell these to datacenters as well, which could run a "low core section" with reduced power and cooling?
PunchyHamster•1h ago
> So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.

B200 doesn't have any graphics capabilities. The datacenter chips don't have any graphics units; those would just be wasted die space there.

As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable for NVIDIA

iwontberude•42m ago
Well, the good thing for NVIDIA's AI business is that most of your chips can sit unused in warehouses and you still get rich. 6 million H100s sold, but infrastructure (water-cooled DCs) for only a third of them exists in the world.
sombragris•3d ago
I doubt that this would ever happen. But...

If it does, I think it would be a good thing.

The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.

Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother anymore to optimize for the low end and thus they end up gatekeeping games and excluding millions of devices because for recent games, a discrete GPU is required even for the lowest settings.

Bridged7756•3d ago
True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.

Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.

archagon•3d ago
I feel like Steam Deck support is making developers optimize again.
batiudrami•2d ago
Crazy take. In the late 90s/early 00s your GPU could be obsolete 9 months after buying. The "optimisation" you talk about happened because the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force overpower anything that had been built for that generation.
ronsor•3h ago
> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.

Not because the developers were lazy, but because newer GPUs were that much better.

jeltz•1h ago
There were lazy devs back then too but I feel lazy devs have become the norm now.
dijit•1h ago
I work in gamedev, historically AAA gamedev.

If you think that the programmers are unmotivated (lazy) or incompetent, you're wrong on both counts.

The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.

The issue is that games have such high expectations that they didn’t have before.

There are very few "yearly titles" that allow you to nail down the software in a nicer way over time, it's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN and where unit/integration tests would be completely useless the minute they were built.

The industry will end, but not because of "lazy devs"; it's the ballooned expectations, stagnant costs, increased team sizes and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.

—-

Finally, back in the early days of games, if the game didn’t work, you assumed you needed better hardware and you would put the work in fixing drivers and settings or even upgrading to something that worked. Now if it doesn’t work on something from before COVID the consensus is that it is not optimised enough. I’m not casting aspersions at the mindset, but it’s a different mentality.

gmueckl•1m ago
Most gamers don't have the faintest clue regarding how much work and effort a game requires these days to meet even the minimum expectations they have.
bombcar•2h ago
Obsolete in that you'd probably not BUY it if building new, and in that you'd probably be able to get a noticeably better one, but even then games were made to run on a wide gamut of hardware.

For a while there you did have noticeable gameplay differences - those with GLQuake could play better, that kind of thing.

tinco•1h ago
The GP was talking about Unreal Engine 5 as if that engine doesn't optimize for low end. That's a wild take, I've been playing Arc Raiders with a group of friends in the past month, and one of them hadn't upgraded their PC in 10 years, and it still ran fine (20+ fps) on their machine. When we grew up it would be absolutely unbelievable that a game would run on a 10 year old machine, let alone at bearable FPS. And the game is even on an off-the-shelf game engine, they possibly don't even employ game engine experts at Embark Studios.
fngjdflmdflg•23m ago
>And the game is even on an off-the-shelf game engine, they possibly don't even employ game engine experts at Embark Studios.

Perhaps, but they also turned off Nanite, Lumen and virtual shadow maps. I'm not a UE5 hater but using its main features does currently come at a cost. I think these issues will eventually be fixed in newer versions and with better hardware, and at that point Nanite and VSM will become a no-brainer as they do solve real problems in game development.

justsomehnguy•1h ago
> your GPU could be obsolete 9 months after buying

Or even before hitting the shelves, cue Trio3D and Mystique, but that's another story.

pzmarzly•2h ago
> Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.

DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization would be the norm, not the exception.

forrestthewoods•2h ago
> I think it would be a good thing.

This is an insane thing to say.

> Game and engine devs simply don't bother anymore to optimize for the low end

All games carefully consider the total addressable market. You can build a low end game that runs great on total ass garbage onboard GPU. Suffice to say these gamers are not an audience that spend a lot of money on games.

It’s totally fine and good to build premium content that requires premium hardware.

It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.

If Nvidia gamer GPUs disappear and devs were forced to build games that are capable of running on shit ass hardware the net benefit to gamers would be very minimal.

What would actually benefit gamers is making good hardware available at an affordable price!

Everything about your comment screams “tall poppy syndrome”. </rant>

bombcar•2h ago
I wonder what Balatro does that wouldn't be possible on a 486.
duskwuff•1h ago
The swirly background (especially on the main screen), shiny card effects, and the CRT distortion effect would be genuinely difficult to implement on a system from that era. Balatro does all three with a couple hundred lines of GLSL shaders.

(The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)
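As an aside, the CRT-distortion part of that mostly comes down to warping screen coordinates before sampling. A minimal sketch of the idea in Python/NumPy (the strength constant is an arbitrary assumption, and Balatro's actual GLSL shaders do considerably more):

```python
import numpy as np

def barrel_distort_coords(width: int, height: int, strength: float = 0.15) -> np.ndarray:
    """Per-pixel source coordinates for a CRT-style barrel distortion.

    Each output pixel samples the input at a position pushed outward in
    proportion to its squared distance from the screen centre; samples that
    land outside the frame become the rounded black corners of a "CRT".
    """
    ys, xs = np.mgrid[0:height, 0:width]
    u = (xs / (width - 1)) * 2.0 - 1.0    # normalise x to [-1, 1]
    v = (ys / (height - 1)) * 2.0 - 1.0   # normalise y to [-1, 1]
    factor = 1.0 + strength * (u * u + v * v)
    u_src = (u * factor + 1.0) / 2.0 * (width - 1)
    v_src = (v * factor + 1.0) / 2.0 * (height - 1)
    return np.stack([v_src, u_src], axis=-1)  # (row, col) sample positions

coords = barrel_distort_coords(320, 240)
print(coords[120, 160], coords[0, 0])  # centre ~unchanged, corner pushed off-frame
```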

oivey•1h ago
Virtually all the graphics? Modern computers are very fast.
woah•1h ago
I always chuckle when I see an entitled online rant from a gamer. Nothing against them, it's just humorous. In this one, we have hard-nosed defense of free market principles in the first part worthy of Reagan himself, followed by a Marxist appeal for someone (who?) to "make hardware available at an affordable price!".
filleduchaos•44m ago
...what exactly about "make hardware available at an affordable price" is "Marxist"?
venturecruelty•3m ago
TIL: being anti-monopoly is both entitled and Marxist.
georgefrowny•1h ago
> This is an insane thing to say.

I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.

It's not inconceivable that the overall result is a better computing ecosystem in the long run. The open source space in particular, where Nvidia has long been problematic. Or maybe it'll be a multi decade gaming winter, but unless gamers stop being willing to throw large amounts of money chasing the top end, someone will want that money even if Nvidia didn't.

forrestthewoods•1h ago
There is a full, actual order of magnitude difference between a modern discrete GPU and a high-end card. Almost two orders of magnitude (100x) compared to an older (~2019) integrated GPU.

> In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.

Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for ultralow-end may or may not have any benefit to high end. This isn’t like optimizing CPU instruction count that benefits everyone.

georgefrowny•1h ago
One wonders what would happen in an SHTF situation, or if someone stubs their toe on the demolition-charges switch at TSMC and all the TwinScans get minced.

Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?

gunalx•45m ago
Until we end up spending trillions recreating the fab capacity of TSMC, they don't have a full monopoly (yet)
bilegeek•1h ago
> The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.

They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.

Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.

pegasus•1h ago
Is remote rendering a thing? I would have imagined the lag would make something like that impractical.
WackyFighter•1h ago
The lag is high. Google was doing this with Stadia. A huge amount of money comes from online multiplayer games and almost all of them require minimal latency to play well. So I doubt EA, Microsoft, or Activision are going to effectively kill those cash cows.

Game streaming works well for puzzle, story-esque games where latency isn't an issue.

filleduchaos•46m ago
Hinging your impression of the domain on what Google (notoriously not really a player in the gaming world) tried and failed at will not exactly give you the most accurate picture. You might as well hinge your impression of how successful a game engine can be on Amazon's attempts at it.

GeForce NOW and Xbox Cloud are much more sensible projects to look at/evaluate than Stadia.

WackyFighter•3m ago
It doesn't matter who does it. To stream you need to send the player input across the net, process, render and then send that back to the client. There is no way to eliminate that input lag.

Any game that requires high APM (Actions Per Minute) will be horrible to play via streaming.
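As a rough illustration of why that lag can't be engineered away, here is a toy latency budget; every number below is an assumption for the sketch, not a measurement of GeForce NOW or any other service.

```python
# Toy input-to-photon latency budget for cloud game streaming (illustrative only).
budget_ms = {
    "controller input -> server (half RTT)": 15,  # assumes ~30 ms round trip to a nearby PoP
    "server-side game simulation tick": 8,        # one tick at ~120 Hz
    "server render + hardware video encode": 12,
    "encoded frame -> client (half RTT)": 15,
    "client decode + compositing + display": 12,
}

total = sum(budget_ms.values())
print(f"estimated input-to-photon latency: ~{total} ms")

# ~62 ms in this sketch. The two network legs (~30 ms) are a floor that a local
# machine never pays, no matter how fast the datacenter GPU is.
```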

vachina•1h ago
It will be if personal computing becomes unaffordable. The lag is simply mitigated by having PoPs everywhere.
gs17•53m ago
GeForce NOW is supposedly decent for a lot of games (depending on connection and distance to server), although if Nvidia totally left gaming they'd probably drop the service too.
vachina•1h ago
You wish. Games will just be published cloud-only and you can only play them via thin clients.
iwontberude•43m ago
This hurt my soul. Kudos.
treyd•36m ago
It's pretty consistently been shown that this just can't provide low-enough latency for gamers to be comfortable with it. Every attempt at providing this experience has failed. There are few games where this can even theoretically be viable.

The economics of it also have issues, as now you have to run a bunch more datacenters full of GPUs, and with an inconsistent usage curve leaving a bunch of them being left idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.

Imustaskforhelp•26m ago
I am pretty sure that the current demand for GPUs, driven by the AI craze, can pretty much absorb the idle-time issue at major datacenters.

Not that it's good or bad, though, but we could probably have something more akin to spot instances of GPUs being given over for gaming purposes.

I do see a lot of companies offering GPU access priced per second with instant shutdown/restart, I suppose, but overall I agree.

My brother recently came for the holidays and I played PS5 for the first time, streamed to his Mac from his place 70-100 km away, and honestly the biggest latency factor was the wifi connection (which was his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)

cyber_kinetist•2m ago
Current datacenter GPUs are optimized for LLM compute, not for real-time rendering. The economics for running such beefy GPUs just for game streaming won't add up.
mikepurvis•58m ago
They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 at 1080p. It just turns out that when you take those system requirements and put them on a PC, especially a PC with a 1440p or 2160p ultrawide, it ends up meaning pretty top-of-the-line stuff. Particularly if, as a PC gamer, you expect to run it at 90fps and not the 30-40 that is typical for consoles.
nerdsniper•49m ago
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
futureshock•26m ago
That's a waste of image quality for most people. You have to sit very close to a 4K display to be able to perceive the full resolution. On PC you could be 2 feet from a huge gaming monitor, but an extremely small percentage of console players have the TV size and distance ratio where they would get much out of full 4K. Much better to spend the compute on a higher framerate or higher detail settings.
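A rough back-of-the-envelope version of that claim, assuming the usual ~60 pixels-per-degree limit of 20/20 acuity and an illustrative 55" 16:9 panel:

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    """Horizontal pixels delivered per degree of the viewer's field of view."""
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

WIDTH_55IN_16_9_M = 1.21  # approx. panel width of a 55" 16:9 TV, in metres (assumption)

for d in (1.5, 2.0, 2.5, 3.5):
    p1080 = pixels_per_degree(1920, WIDTH_55IN_16_9_M, d)
    p2160 = pixels_per_degree(3840, WIDTH_55IN_16_9_M, d)
    print(f"{d:.1f} m: 1080p ~ {p1080:.0f} ppd, 4K ~ {p2160:.0f} ppd (20/20 eye ~ 60 ppd)")

# From roughly 2 m back on a 55" panel, even 1080p is near the ~60 ppd acuity
# limit, so most of the extra 4K pixels go unseen from a typical couch.
```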
preisschild•31m ago
I agree re "optimizations", but I don't think there should be compromises on quality (if set to max/ultra settings)
venturecruelty•5m ago
I haven't been on HN even 60 seconds this morning and I've already found a pro-monopoly take. Delightful.
m4rtink•2h ago
We can really hope they do it and fast!

That way they will not only burn the most goodwill, but will also get themselves entangled even more in the AI bubble - hopefully enough to go down with it.

throwaway613745•2h ago
I’ve always been an AMD customer because I’ve despised Nvidia’s business practices for 10+ years.

It would still suck if they left the market, because who does AMD have to compete with? Intel? LOL

Increased prices for everyone. Lovely. I can’t despise AI enough.

baal80spam•1h ago
> I’ve always been an AMD customer because I’ve despised Nvidia’s business practices for 10+ years.

I am 100% sure AMD would have done the exact same thing as NVIDIA does right now, given the chance.

Are you saying they wouldn't have milked the market to the last drop? Do you really believe it?

throwaway613745•12m ago
Of course they would have milked us; that's why I want Nvidia to stick around, to keep AMD in check.
keyringlight•8m ago
If you look at AMD's CPUs there are indications they do exactly that. When Zen 1/1+/2 came out they were priced below Intel's products, as they needed to rebuild mindshare with their promising new chips; from Zen 3 onwards, once they started building a performance lead in many categories as well as core count, they jacked the prices up because they could demand it.
resfirestar•2h ago
This is just DRAM hysteria spiraling out to other kinds of hardware, will age like fine milk just like the rest of the "gaming PC market will never be the same" stuff. Nvidia has Amazon, Google, and others trying to compete with them in the data center. No one is seriously trying to beat their gaming chips. Wouldn't make any sense to give it up.
wmf•1h ago
It's not related to the DRAM shortage. Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI and there was controversy years before that about most "gaming" GPUs going to crypto miners. They won't exit the gaming market but from a shareholder perspective it does look like a good idea.
htrp•1h ago
yet another reason to not listen to your shareholders.

if it were up to them, cuda would be a money losing initiative that was killed in 2009

willis936•1h ago
It's a bad idea and yet everyone does it.
internet101010•1h ago
Furthermore, I would wager a giant portion of people who have entered the ML space in the last five years started out by using CUDA on their gaming rigs. Throwing away that entrenchment vector seems like a terrible idea.
Animats•4m ago
> Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI

Well, actually it's that the AI business made NVidia 10x bigger. NVidia now has a market cap of $4.4 trillion. That's six times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.

rhco•1h ago
If Nvidia did drop their gaming GPU lineup, it would be a huge re-shuffling of the market: AMD's market share would 10x overnight, and it would open a very rare opportunity for minority (or brand-new?) players to get a foothold.

What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.

I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.

It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.

mananaysiempre•1h ago
Took what, four years for PC cases to get back to reasonable prices after COVID? And that’s a relatively low-tech field that (therefore) admits new entrants. I don’t know, I’m not feeling much optimism right now (haven’t at any point after the crypto boom), perhaps because I’ve always leaned towards stocking up on (main) RAM as a cheap way to improve a PC’s performance.
ZiiS•2h ago
Then Intel and AMD carry on; tbh, having sewn up handhelds and consoles and made gaming on integrated graphics mainstream, many won't notice. An AI bubble burst leaving loads of GPU-laden datacenters is much more likely to hasten cloud gaming.
j45•1h ago
It wouldn't be unheard of.

Qualcomm, before they made all the chips they do today, ran a pretty popular and successful email client called Eudora.

Doing one thing well can lead to doing bigger things well.

More realistically, if the top end chips go towards the most demanding work, there might be more than enough lower grade silicon that can easily keep the gaming world going.

Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today, and may have some more light to shine in the future. Where we see gaming and AI coming together, whether it's in completely and actually immersive worlds, etc, is pretty interesting.

Update: Adding https://en.wikipedia.org/wiki/Eudora_(email_client)

flopsamjetsam•1h ago
I had completely forgotten about the existence of Eudora. Thanks friend, that led me down a mental rabbit hole.
kps•1h ago
Mac Eudora was the best email client ever. If it had got UTF8 support I'd probably still be running it in an emulator.
j45•1h ago
I just learned today that there has been some efforts underway: https://hermes.cx/
dhosek•1h ago
"If the AI bubble doesn't burst" is carrying an awful lot of water there…
bitwize•1h ago
Look.

Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.

Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.

PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered together unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.

I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.

ryandrake•1h ago
It would be great if more GPU competition would enter the field instead of less. The current duopoly is pretty boring and stagnant, with prices high and each company sorta-kinda doing the same thing and milking their market.

I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.

venturecruelty•2m ago
We'd need our government to actually enforce antitrust laws that have been on the books for about a century. Good luck.
TehCorwiz•1h ago
If they do, it'll likely be part of an industry-wide push to kill off the home-built PC market. It's no secret that MS and others want the kind of ecosystem Apple has, and governments want more backdoor access to tech. And which manufacturer wouldn't want to eliminate partial upgrades/repairs? Imagine that the only PC you could buy one day has everything tightly integrated, with no user-serviceable or replaceable parts short of a high-end soldering lab. Then, since it's impractical to build your own, they can raise the purchase price beyond the reach of most people, and the industry succeeds in its rental-PC aspirations.
Imustaskforhelp•31m ago
There is definitely a part of me which feels like, with the increasing RAM prices and similar, it's getting hard for people to have a home lab.

What it also means to me is that there is now more friction in the already really competitive and high-friction business of building a cloud.

With RAM prices increasing, and (from what I know) only likely to decrease in 2027-2028 or when this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.

When I mention cloud providers, I don't mean the trifecta of AWS, Azure, or GCP, but rather all the other providers who bought their own hardware, are co-locating it in a datacenter, and are selling services targeted at low/mid-range VPS/VDS servers.

I had previously thought about building a cloud, but in this economy and the current situation, I'd much rather wait.

The best bet right now for most people building a cloud / providing such services is probably white-labeling another provider and offering services on top that make you special.

The servers are still rather cheap, but the mood I see among providers right now is that they are willing to absorb the costs for some time so as not to create a frenzy (so they still have low prices), while cautiously watching the whole situation. If recent developments keep going this way, I wouldn't be surprised if server providers raise some prices, because the underlying hardware's RAM prices have effectively increased too.

thewebguyd•13m ago
Feel the same way here. Can't help but get the vibe that big tech wants to lock consumers out, eliminate the ability to have personal computing/self-hosted computing. Maybe in tandem with governments, not sure, but it's certainly appetizing to them from a profit perspective.

The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.

cyber_kinetist•5m ago
I think China will then try to sell their own PC parts instead, their semiconductor industry is catching up so who knows in a decade.

But then the US will probably reply with tariffs on those PC parts (or even ban them!), which is slowly becoming the norm for US economic policy, and which won't reverse even after Trump.

webdevver•1h ago
the pc gaming market is a hobbyist niche compared to the ongoing infrastructure projects.

i predict that the "pc" is going to be slowly but surely eaten bottom-up by increasingly powerful SoCs.

Animats•1h ago
From the article: "(NVidia) AI data center revenue reached $51.2 billion versus just $4.3 billion from gaming in Q3 2025."

Moore Threads in China just announced a new GPU.[1] Announced, not shipped.

[1] https://wccftech.com/moore-threads-lushan-gaming-huashan-ai-...

SkyMarshal•1h ago
I'm not sure it would matter. It doesn't seem that graphics are the limiting factor in games anymore. Plenty of popular games use variations on cartoon-style graphics, for example Fortnite, Overwatch, Valorant, etc. It seems gameplay, creativity, and player community are more determining factors.

That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.

mikepurvis•53m ago
Notably the games you listed are all f2p/esports games, and that does matter in terms of how much budget developers have to polish a realistic look vs ship a cartoon and call it the "art style".

I just upgraded to a 9070 XT to play ARC Raiders, and it's absolutely a feast for the eyes while also pioneering on several fronts, especially around the bot movement and intelligence.

Animats•2m ago
> It doesn't seem that graphics are the limiting factor in games anymore.

Have you seen the GTA VI trailer?

wewewedxfgdf•39m ago
AMD would do the same thing as Nvidia but $50 cheaper.
AtlasBarfed•26m ago
Game graphics are still a high margin silicon business. Someone will do it.

Frankly, the graphics chops are plenty strong for a decade of excellent games. The big push in the next couple decades will probably be AI generated content to make games bigger and more detailed and more immersive

oersted•25m ago
I don’t understand why most people in this thread thinks this would be such a big deal. It will not change the market in significant negative or positive ways. AMD has been at their heals for a couple of decades and are still very competitive, they will simply fill their shoes, most games consoles have been AMD centric for a long time regardless, they are fairly dominant in the mid range and they’ve lost had the best price/performance value.

Overall, I think that AMD is more focused and energetic than their competitors now. They are very close of taking over Intel on the CPU race, both on datacenter and consumer segments, and Nvidia might be next in the next 5 years, depending on how the AI bubble develops.

yegle•17m ago
I've heard good things about Moore Threads. Who knows, maybe the consumer GPU market is not a duopoly after all; Nvidia exiting the market could be a good thing longer term by introducing more competition.

My general impression is that US technology companies either take competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their cake.

There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Lab, the mini PC market where Chinese brands dominate.

t1234s•6m ago
Looks like a hit piece to trigger some people into dumping their $NVDA stock. They worked phrases like "abandon" and "AI bubble" into the title/subtitle. The author's other articles look like clickbait crap: https://www.pcworld.com/author/jon-martindale