frontpage.

I Found an AI Image Editor That Saves Me Hours Every Week

https://glm-image.pro/
1•ri-vai•1m ago•1 comments

Design and Implementation of Sprites

https://fly.io/blog/design-and-implementation/
3•sethev•2m ago•0 comments

Show HN: Next.js Boilerplate 6.1 – a Next.js starter I've maintained for 5 years

https://github.com/ixartz/Next-js-Boilerplate
1•creativedg•4m ago•0 comments

Why not call nullptr NULL?

https://software.codidact.com/posts/292718/292759#answer-292759
1•untilted•4m ago•0 comments

When the LLM Programs Its Own Thinking

https://lambpetros.substack.com/p/when-the-llm-programs-its-own-thinking
1•kristianpaul•5m ago•0 comments

Ask HN: Anyone else finding it impossible to land a job?

1•Arch485•6m ago•0 comments

Show HN: Compressing a Histogram from 16GB to 2KB

https://alexkranias.com/essays/histogram.html
4•alexkranias•6m ago•0 comments

OVH removes egress fees for Object Storage

https://www.ovhcloud.com/en/public-cloud/prices/
3•jaegerma•8m ago•1 comments

Telemetry Overlay for Approaching Vehicles

https://vasil.org/telemetry-overlay-for-approaching-vehicles/
1•vasildb•9m ago•0 comments

Should I turn my open-source project into a product with a $5k budget?

https://github.com/Luthiraa/julie
1•luthiraabeykoon•9m ago•1 comments

Why Containers?

https://willhbr.net/2026/01/15/why-containers/
1•Kerrick•10m ago•0 comments

Agent Design Patterns

https://rlancemartin.github.io/2026/01/09/agent_design/
1•eustoria•10m ago•0 comments

Wikipedia Signs AI Licensing Deals on Its 25th Birthday

https://news.slashdot.org/story/26/01/15/1516207/wikipedia-signs-ai-licensing-deals-on-its-25th-b...
1•nomilk•11m ago•0 comments

Mageia 10 Alpha Released – 32-Bit ISOs Still Available

https://www.phoronix.com/news/Mageia-10-Alpha
1•Qem•11m ago•2 comments

European military personnel arrive in Greenland as Trump says US needs island

https://www.bbc.com/news/articles/cd0ydjvxpejo
8•vinni2•12m ago•8 comments

Design Systems for Software Engineers

https://newsletter.pragmaticengineer.com/p/design-systems-for-software-engineers
1•eustoria•12m ago•0 comments

A Pail of Air (1951)

https://www.gutenberg.org/cache/epub/51461/pg51461-images.html
2•debo_•13m ago•0 comments

Anthropic's official plugin gets the core principle of the Ralph Wiggum wrong

https://github.com/0livare/ralph-wiggum-ai
1•skramzy•13m ago•1 comments

Dropbox changing policy: Now threatening to delete files over storage limit

2•huksley•14m ago•0 comments

Tell HN: 1B Jobs on GitHub Actions

1•dorianmariecom•15m ago•0 comments

Cacoco: Sbardef Editor Written in Rust

https://github.com/lizzieshinkicker/Cacoco
1•klaussilveira•15m ago•0 comments

Show HN: I lost €50K to non-paying clients, so I built an AI contract platform

https://www.accordio.ai/
1•deduxer•15m ago•0 comments

Ask HN: What to teach my kid if AI does math and CS?

4•devShark•17m ago•2 comments

Show HN: First professional-grade AI fonts

https://fonthero.com/
1•jrd79•17m ago•0 comments

FLUX.2 [Klein]: Towards Interactive Visual Intelligence

https://bfl.ai/blog/flux2-klein-towards-interactive-visual-intelligence
1•meetpateltech•18m ago•0 comments

Researchers use Apple Watch to train a disease-detection AI

https://www.wareable.com/health-and-wellbeing/empirical-health-mit-jets-ai-health-tracking-model-...
1•brandonb•18m ago•0 comments

How local network privacy could affect you

https://eclecticlight.co/2026/01/14/how-local-network-privacy-could-affect-you/
2•lladnar•18m ago•0 comments

Researchers find trees could spruce up future water conservation efforts

https://phys.org/news/2025-12-trees-spruce-future-efforts.html
1•PaulHoule•19m ago•0 comments

From Blobs to Managed Context: Why AI Applications Need a Stateful Context Layer

https://zhihanz.github.io/posts/from-blobs-to-managed-context/
1•georgehe9•19m ago•0 comments

Latency Monitor: lightweight tool for TCP and UDP monitoring

https://mirceaulinic.net/2026-01-15-latency-monitor/
2•mirceaulinic•20m ago•1 comments

Nvidia Reportedly Ends GeForce RTX 5070 Ti Production, RTX 5060 Ti 16 GB Next

https://www.techpowerup.com/345224/nvidia-reportedly-ends-geforce-rtx-5070-ti-production-rtx-5060-ti-16-gb-next
35•ndiddy•1h ago

Comments

voidfunc•1h ago
Death of PC gaming incoming.

Happy I just bought my 5080 before Christmas. They're all on borrowed time.

legobmw99•37m ago
I am a recent 5070 Ti purchaser, so I'm also feeling lucky, though if they exit the gaming market entirely I suspect the drivers will all go to crap soon thereafter.
ndiddy•12m ago
Well if you look at the SKUs they're discontinuing, they're taking out all the lowest-end models with more VRAM to save the allocation for the higher-end models with juicier margins. For example the 5070 Ti costs $500 less than the 5080 but both have 16 GB of VRAM. I imagine that for the near future, they'll have the 5060 8GB, 5070 12GB, and 16GB will be limited to the 5080 for consumers willing to spend $1300 on a GPU.
lemoncookiechip•1h ago
No worries, gamers.

You can subscribe to our GeForce NOW service to rent a top of the line card through our cloud service for the low low price of 11€$£ or 22€$£ a month with *almost no restrictions.

*Except for all the restrictions.

merpkz•1h ago
I just bought a 5070 Ti a week ago and can attest I have used it for maybe 3-4 hours since then. It begs the question whether I should have rented the compute instead of paying 900 EUR up front - that's like 3 years' worth of rent.
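A back-of-the-envelope sketch of that rent-vs-buy comparison (a minimal Python sketch; the ~25 EUR/month rental rate is an illustrative assumption, not a quoted price):

    # Months of cloud rental that add up to the purchase price of the card.
    # Assumed numbers (illustrative): 900 EUR card, ~25 EUR/month rental.
    def breakeven_months(card_price_eur, rent_per_month_eur):
        return card_price_eur / rent_per_month_eur

    print(breakeven_months(900, 25))  # 36.0, i.e. roughly "3 years worth of rent"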
throwaway2027•1h ago
Because it'll set a precedent and eventually kill off being able to own the hardware to run things locally at all.
pixl97•1h ago
https://en.wikipedia.org/wiki/You%27ll_own_nothing_and_be_ha...
threetonesun•1h ago
Currently, after 3 years the resale price of the GPU might make it a wash if you decide to sell, much like it was after the crypto boom. Granted you have to pay for electricity to run it, but you also have full control over what it runs.
observationist•1h ago
If the compute is the unit of value under consideration, maybe. But there's more - you have access, freedom from supervision, the capability to modify, upgrade, tweak, adjust anything you want, resell compute under p2p cloud services when idle, etc. And then if the market for these gets hot, you can sell and recoup your initial costs and then some. The freedom and opportunity benefit - as opposed to the dependence and opportunity cost of renting - is where I personally think you come out on top.
dyauspitr•58m ago
Where the hell is 900 EUR 3 years of rent?
diab0lic•54m ago
Haha. I read that the same way the first time I read it. The commenter means 3 years of renting GPU from nvidia via cloud services.
close04•40m ago
I think that's comparing to 3 years of GeForce Now at ~22 EUR/month for the Ultimate plan, for a total of ~800 EUR. For someone using it 3h/week you might as well go for the free plan and pay nothing. But while owning can only have a financial cost, renting has a hidden cost on top of that: it leads to "atrophy" of the ownership right, and once you lose that option you'll never get it back. That will have incalculable costs.
dymk•56m ago
It cost 900eur because nvidia is shafting you
wongarsu•50m ago
The correct calculation is not 900€/36 months but (900€-$resell_value)/36 months. If you sell your GPU for 450€ after three years you saved a good bit of money. If the AI bubble doesn't pop, your resale value might even be a good bit higher than that. I had a used 1080 Ti that I used for five years and then sold for nearly the same price, making it effectively free (minus electricity use and opportunity cost).
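The resale-adjusted version of that calculation, as a minimal Python sketch (the resale value here is a hypothetical figure, not a prediction):

    # Effective monthly cost of ownership = (purchase price - resale value) / months,
    # per the formula in the comment above. All numbers are hypothetical.
    def effective_monthly_cost(purchase_eur, resale_eur, months):
        return (purchase_eur - resale_eur) / months

    print(effective_monthly_cost(900, 450, 36))  # 12.5 EUR/month if it resells for half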
Aurornis•45m ago
The GeForce Now service is actually a decent deal for casual gamers.

The hardcore and frequent gamers won’t like it but it was never really for them.

theodric•40m ago
So is a Steam Deck, really
close04•11m ago
The problem is that they're always a great deal, the best even, while there are alternatives. The noose tightens only after everyone is onboard.

And the competition on the GPU market is soft to say the least.

Fire-Dragon-DoL•4m ago
Yes! Then you want to play one of the FromSoftware games and you are doomed.

Damn nvidia

PunchyHamster•1h ago
It's 33 eurodollar now. I'm sorry I meant 44
lemoncookiechip•56m ago
China will save us, except no, we'll just ban their hardware sales, sucks to suck.
fc417fc802•46m ago
Does China have any cutting edge fabs yet? I thought it was still just TSMC, Samsung, and maybe Intel.

Maybe consumer electronics will move backwards by a process node or two?

lemoncookiechip•33m ago
They've just recently been able to reverse engineer ASML's EUV machines. They're years and years behind, although the way things are moving forward, with hardware prices skyrocketing (RAM, SSD, GPUs...), regular consumers won't have much choice in anything anyway for a while.
re-thc•1h ago
Time for AMD to shine?
roboror•1h ago
They already had worse margins so probably not unless they've been hoarding RAM. AMD also wants DC money.
PunchyHamster•1h ago
You could have fooled me, looking at what a PITA it is to make stuff work compared to NVIDIA.
pixl97•1h ago
Wanting something, and executing on it properly are two different things.
whatevaa•1h ago
Knowing AMD, shine by doing the same.
keyringlight•37m ago
My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running between contracts for consoles or compute products. At the very least it's a welcome byproduct that provides a competent iGPU and a test bed for future 'main' products. It's been a long while since AMD has shown future vision for PC GPUs or they've led with a feature instead of following what others do.
MrBuddyCasino•54m ago
AMD avoids a price war with Nvidia for the simple reason that Nvidia has much, much more cash and will win this war, easily.
zvqcMMV6Zcr•51m ago
AMD's approach to pricing was "comparable NVidia card minus $50". If price of remaining NVidia cards goes up then AMD will follow.
ecshafer•1h ago
Crucial shut down, Nvidia not producing consumer cards. Even with AMD cards, if there's no memory available then we can't get them either.

RAM is 4-5x the price it was a year ago.

Is AI going to kill the consumer computer industry?

fc417fc802•1h ago
That's one possibility. Another is that it's temporary until production can be ramped up (but I doubt it because fabs). Pessimistic take is that the suppliers expect the bubble to pop soon (and very violently) and want to maximize their take while they still can.

Or maybe, assuming the trend holds in the longer term, it could mean that consumers will move downstream of datacenters: anyone who wants a GPU will be rocking 3-to-5-year-old recycled enterprise gear.

baal80spam•1h ago
> Is AI going to kill the consumer computer industry?

Even if it does, the death of AAA gaming is nothing I will cry about. Most games don't require anything remotely as performant as a 5070.

ecshafer•58m ago
I don't play any AAA games really. The only "AAA" games I've played in the past few years are basically Baldur's Gate 3 and Kingdom Come Deliverance II. But mostly I play RPGs and strategy games that don't require much GPU power at all.
emsign•23m ago
I don't care about AAA gaming either, it's stale, but one day the AI bubble will kill something you will cry about.
piva00•41s ago
There are niches like sim racing which require a high powered GPU if you want to run ultrawide or triple screens though.

Just saying that your grudges with AAA games have a blast effect you might not be aware of.

Aurornis•40m ago
> Nvidia not producing consumer cards.

This is a false statement. They're still producing consumer cards. You can go buy a 5070 FE in stock on their web store at MSRP right now. You can buy a discounted 5060 from Best Buy below MSRP.

They’re changing production priorities for a little while if the rumors are accurate.

RAM prices have always been cyclical and prone to highs and lows. This is an especially high peak but it will pass like everything else.

These predictions that the sky is falling are way too dramatic.

throwaway2027•1h ago
Maybe game companies will be forced to optimize their games and focus on innovative gameplay elements rather than graphics.
wongarsu•43m ago
AAA game companies won't care, they'll just continue targeting the latest console. For most of them releasing on PC is a nice bonus, not a core part of their strategy
Anonyneko•1h ago
Bought a slightly overpriced, even by its own standards, 5090 in May, hope that it lasts me through the next five years of madness, and that the madness will have some kind of a temporary respite (or I luck out on a higher paying job, or figure out how to invest properly - it seems like a lottery these days).

My only small regret is that I decided to build an SFF PC, otherwise I would've gone for 128 GB of RAM instead of just 64. Oh well, ̶6̶4̶0̶ ̶K̶B̶ 64 GB should be enough for most purposes.

827a•1h ago
IMO, sadly: the DIY PC world is on life support and will likely be something that isn't even really possible to do, for top-of-the-line performance, by 2028.

I don't necessarily think that everything is going doomer "subscription based cloud streaming"; the economics of these services never made sense, especially for gaming, and there's little reason to believe that the same incentives that led to Nvidia, Crucial, etc wanting out of the consumer hardware business wouldn't also impact that business.

Instead, the future is tightly integrated single-board computers (e.g. Framework Desktop, the new HP keyboard, Mac Mini, RPi, etc). They're easier for consumers to buy. Integrated memory, GPU, and cooling means we can drive higher performance. All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated. They're smaller. It allows one company (e.g. Framework) to capture more margin than sharing with ten GPU or memory middle-men who just slap a sports car-looking cooler on whatever they bought from Micron and call themselves a real business.

My lingering hope is that we do see some company succeed who can direct-sell these high-end SBCs to consumers, so if you want to go the route of a custom case and such, you still can. And that we don't lose modular storage. But I've lost all hope that DIY PCs will survive this decade; to be frank, they haven't made economic sense for a while.

fc417fc802•50m ago
> All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated.

I don't think that checks out. The fabs are booked out AFAIU. This is going to hit SoCs (and anything else you can come up with) sooner rather than later because it all depends on the same fabs producing the same silicon at the end of the day. It's just packaged differently.

They left the consumer market due to the price difference. It's not that there aren't middlemen willing to purchase in bulk right now. It's that the OEMs aren't willing to sell at any price because they've already sold their entire future inventory at absurd prices for the next however many months or years.

I assume there will still be at least a few SoCs to choose from but the prices will likely be completely absurd because they will have to match the enterprise price for the components that go into them.

newsclues•54m ago
Gamers are going to burn DC to the ground.
xnx•47m ago
From a resource allocation perspective it's crazy that millions of valuable GPUs (and memory!) are sitting in personal computers and game consoles unused.
CivBase•40m ago
You could say that about literally anything. Food, housing, fuel, heat, water. There are always solutions for better optimizing global resource allocation - especially if you're willing to ignore the wants and rights of the people.
Anonyneko•38m ago
If it was easy enough to rent my desktop while I'm not using it (such that I can get it back whenever I need it myself, within a few minutes at most), I would happily do it.
nerdjon•36m ago
Do you mean when the computer is not in use, or "unused" in the sense that even when gaming it is just being used for gaming and not something "productive"?

Two very different arguments, and it's not fully clear which you are trying to make.

xnx•17m ago
> Do you mean when the computer is not in use

This. No judgement on any particular use. Just worth a reminder that the most advanced machines ever produced make these magic rocks that sit there idle most of the time.

emsign•41m ago
Rumors have it they'll stop producing gaming GPUs altogether. :(
infecto•39m ago
I don’t subscribe to all of this doom and gloom. I would like to consider myself a gamer and to be frank I used the same computer setup for since 2018 until I recently upgraded it in the past few months. Even with increased costs we are seeing the dollar spent per hour of entertainment is ridiculously cost effective.
nerdjon•38m ago
It will be interesting to see what the long-term impact of this will be. The headline misses the biggest part: they (if the phrasing they use is correct) should be producing more of the lower-specced (and cheaper) 5060 8GB model.

So while the news is not great, I think it is far from any doom and gloom if we are in fact going to be getting more 5060 cards.

As it is, the value of the crazy higher-specced cards was questionable, with most developers targeting console specs anyway. But it does raise the question of how this might impact the next generation of consoles and whether those will be scaled back.

We will likely be seeing some stagnation of capability for a couple years. Maybe once the bubble pops all the work that went into AI chips can come back to gaming chips and we can have a big leap in capability.

etempleton•37m ago
I could see Nvidia completely stepping out of the low to mid range Desktop GPU space. The margins have to be peanuts compared to their other business lines.
patapong•36m ago
Very curious about the second order effects of the hundreds of billions poured into LLMs. Perhaps even if LLMs do not pan out, we will have a massive increase in green energy production, grid enhancements and a leap in capacity for general-purpose computing over the next few years? Or maybe that is my naive side talking...