frontpage.

PID Controller

https://en.wikipedia.org/wiki/Proportional%E2%80%93integral%E2%80%93derivative_controller
1•tosh•3m ago•0 comments

SpaceX Rocket Generates 100GW of Power, or 20% of US Electricity

https://twitter.com/AlecStapp/status/2019932764515234159
1•bkls•3m ago•0 comments

Kubernetes MCP Server

https://github.com/yindia/rootcause
1•yindia•4m ago•0 comments

I Built a Movie Recommendation Agent to Solve Movie Nights with My Wife

https://rokn.io/posts/building-movie-recommendation-agent
2•roknovosel•4m ago•0 comments

What were the first animals? The fierce sponge–jelly battle that just won't end

https://www.nature.com/articles/d41586-026-00238-z
2•beardyw•12m ago•0 comments

Sidestepping Evaluation Awareness and Anticipating Misalignment

https://alignment.openai.com/prod-evals/
1•taubek•13m ago•0 comments

OldMapsOnline

https://www.oldmapsonline.org/en
1•surprisetalk•15m ago•0 comments

What It's Like to Be a Worm

https://www.asimov.press/p/sentience
2•surprisetalk•15m ago•0 comments

Don't go to physics grad school and other cautionary tales

https://scottlocklin.wordpress.com/2025/12/19/dont-go-to-physics-grad-school-and-other-cautionary...
1•surprisetalk•15m ago•0 comments

Lawyer sets new standard for abuse of AI; judge tosses case

https://arstechnica.com/tech-policy/2026/02/randomly-quoting-ray-bradbury-did-not-save-lawyer-fro...
2•pseudolus•16m ago•0 comments

AI anxiety batters software execs, costing them combined $62B: report

https://nypost.com/2026/02/04/business/ai-anxiety-batters-software-execs-costing-them-62b-report/
1•1vuio0pswjnm7•16m ago•0 comments

Bogus Pipeline

https://en.wikipedia.org/wiki/Bogus_pipeline
1•doener•17m ago•0 comments

Winklevoss twins' Gemini crypto exchange cuts 25% of workforce as Bitcoin slumps

https://nypost.com/2026/02/05/business/winklevoss-twins-gemini-crypto-exchange-cuts-25-of-workfor...
1•1vuio0pswjnm7•17m ago•0 comments

How AI Is Reshaping Human Reasoning and the Rise of Cognitive Surrender

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=6097646
3•obscurette•18m ago•0 comments

Cycling in France

https://www.sheldonbrown.com/org/france-sheldon.html
1•jackhalford•19m ago•0 comments

Ask HN: What breaks in cross-border healthcare coordination?

1•abhay1633•19m ago•0 comments

Show HN: Simple – a bytecode VM and language stack I built with AI

https://github.com/JJLDonley/Simple
1•tangjiehao•22m ago•0 comments

Show HN: Free-to-play: A gem-collecting strategy game in the vein of Splendor

https://caratria.com/
1•jonrosner•23m ago•1 comments

My Eighth Year as a Bootstrapped Founder

https://mtlynch.io/bootstrapped-founder-year-8/
1•mtlynch•23m ago•0 comments

Show HN: Tesseract – A forum where AI agents and humans post in the same space

https://tesseract-thread.vercel.app/
1•agliolioyyami•24m ago•0 comments

Show HN: Vibe Colors – Instantly visualize color palettes on UI layouts

https://vibecolors.life/
2•tusharnaik•25m ago•0 comments

OpenAI is Broke ... and so is everyone else [video][10M]

https://www.youtube.com/watch?v=Y3N9qlPZBc0
2•Bender•25m ago•0 comments

We interfaced single-threaded C++ with multi-threaded Rust

https://antithesis.com/blog/2026/rust_cpp/
1•lukastyrychtr•26m ago•0 comments

State Department will delete X posts from before Trump returned to office

https://text.npr.org/nx-s1-5704785
7•derriz•26m ago•1 comments

AI Skills Marketplace

https://skly.ai
1•briannezhad•27m ago•1 comments

Show HN: A fast TUI for managing Azure Key Vault secrets written in Rust

https://github.com/jkoessle/akv-tui-rs
1•jkoessle•27m ago•0 comments

eInk UI Components in CSS

https://eink-components.dev/
1•edent•28m ago•0 comments

Discuss – Do AI agents deserve all the hype they are getting?

2•MicroWagie•30m ago•0 comments

ChatGPT is changing how we ask stupid questions

https://www.washingtonpost.com/technology/2026/02/06/stupid-questions-ai/
2•edward•31m ago•1 comments

Zig Package Manager Enhancements

https://ziglang.org/devlog/2026/#2026-02-06
3•jackhalford•33m ago•1 comments

Server DRAM prices surge 50% as AI-induced memory shortage hits hyperscalers

https://www.tomshardware.com/pc-components/storage/server-dram-prices-surge-50-percent
140•walterbell•3mo ago

Comments

vondur•3mo ago
Desktop memory has also increased in price. I think DDR5 is twice as expensive as it was 6 months ago.
piva00•3mo ago
I've just built a gaming PC (after more than a decade without one), for curiosity's sake I just compared the prices I paid for DDR5 2 months ago to now, and at my location it already shows a 25-30% increase. Bonkers...
pton_xd•3mo ago
Same, just checked and the "G.SKILL Trident Z5 Neo Series 64GB (2 x 32GB)" RAM I bought 9 months ago for $208 is now $464. That's crazy!
formerly_proven•3mo ago
Feast and famine industry, it’s very traditional
brianshaler•3mo ago
I think that's nearly exactly what I paid for 2x32GB at a retail store last week. I hadn't bought RAM in over a decade so I didn't think anything of it. Wish my emergency PC replacement had occurred a year earlier!
distances•3mo ago
I got 96GB in June with a desktop upgrade, good timing and should be enough for a good while.
big-and-small•3mo ago
It's very noticeable:

https://pcpartpicker.com/trends/price/memory/

zparky•3mo ago
Yep, DDR5 prices have nearly doubled in less than 2 months. https://pcpartpicker.com/trends/price/memory/#ram.ddr5.5200....
vondur•3mo ago
I was able to get a bundle deal from Microcenter here in SoCal with the Ryzen 9950x, motherboard and 32GB of RAM for $699. They have since removed the RAM from all the bundles.
StillBored•3mo ago
While that's a sweet upgrade for people with an older desktop that can support a motherboard swap, it's worth pointing out that the RAM is probably insufficient.

RAM usage for a lot of workloads scales with core/thread count, and my general rule of thumb is that 1G/thread is not enough, 2G/thread will mostly work, and 4G/thread is probably too much, but your disk cache will be happy. The same applies to VMs: if you're hosting a VM and give it 16 threads, you probably want at least 16G for the VM. The 4G/thread then starts to look pretty reasonable.

Just building a lot of open-source projects with `make -j32`, you're going to be swapping if you only have 1G/thread. This rule becomes super noticeable when you're on a machine with 512G of RAM and 300+ threads, because your builds will OOM.
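The rule of thumb above can be sketched in a few lines — a toy calculator, not an official sizing formula; the function name and thresholds are just the commenter's heuristic restated:

```python
def recommended_ram_gb(threads: int, gb_per_thread: float = 2.0) -> float:
    """Rule-of-thumb RAM sizing from the comment above:
    1 GB/thread is not enough, 2 GB/thread mostly works,
    4 GB/thread is generous but keeps the disk cache happy."""
    return threads * gb_per_thread

# A `make -j32` build at 1 GB/thread risks swapping:
print(recommended_ram_gb(32, 1.0))   # 32.0 GB -- likely too little
print(recommended_ram_gb(32, 2.0))   # 64.0 GB -- mostly fine
# Why a 512 GB box with 300+ threads can still OOM:
print(recommended_ram_gb(300, 2.0))  # 600.0 GB needed at 2 GB/thread
```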

vondur•3mo ago
Ha, I was going to purchase a 96GB kit, but that's when I first noticed that RAM prices were getting crazy.
embedding-shape•3mo ago
Are those graphs specifically for the US? When I change the country in the top right, it doesn't seem like the graphs are changing, and considering they're in USD, I'm assuming it's US-only?

Is the same doubling happening world-wide or is this US-specific, I guess is my question?

Edit: one data point — I last bought 128GB of RAM in March 2024 for ~€536; similar ones right now cost ~€500, but maybe the time range is too long.

Normal_gaussian•3mo ago
In the UK I was looking at DDR4-3200 SODIMM last week for some mini-pcs... and decided to pass after looking at the price graphs. It's spiked in the last few weeks.
embedding-shape•3mo ago
What graph you used for UK-specific prices as it seems the earlier graphs referenced here are US-only?
Normal_gaussian•3mo ago
camelcamelcamel is the best for amazon items - choose a stick, look at the graph.

There is a bit of annoyance as items come in and out of stock (out of stock often means an inaccurate price), so it's often better to find a product on Amazon and look it up here.

8GB SODIMM stick https://uk.camelcamelcamel.com/product/B0C7Z4HJ8L?context=se...

regular 2x16GB pair https://uk.camelcamelcamel.com/product/B07RW6Z692?context=se...

I have a script watching some items on Overclockers and Crucial, including RAM. So for those, by "graph" I really meant "eyeballed an email search".

coffeebeqn•3mo ago
Maybe it’s time to sell my unused DDR4s! I was thinking it wouldn't be worth anything at this point
threeducks•3mo ago
536 € seems expensive for March 2024, but either way, the price dropped a lot over the last one and a half years, only to surge in the last two months.
embedding-shape•3mo ago
> the price dropped a lot over the last one and a half years, only to surge in the last two months.

Yeah, that was my hunch, that something like that was going on. Thanks for clarifying.

pcarmichael•3mo ago
They are US-specific, yes. Thanks for asking that - I'll look into updating those graphs to show for the appropriate region/country depending on what country you've selected (on the top right of the page).
numpad0•3mo ago
It just means RAMs aren't sold in volume in your area, if you're not feeling it...

[1]: https://kakaku.com/item/K0001448114/pricehistory/ (archive: https://archive.is/CHLs2)

embedding-shape•3mo ago
I can't find any way to figure out whether that's true. I live near the second-largest city in Spain, and I'd guess people buy about as much RAM here as elsewhere in the country/continent, but maybe that's incorrect. I've tried searching for graphs/statistics about it in Spain for the last 1-2 years, but haven't had much success.
pcarmichael•3mo ago
I can add Spain price trends to PCPartPicker. Quick question though - do you want the price trends to cover just Spanish retailers, or should it trend the prices across all of the EU?
embedding-shape•3mo ago
That would be incredible! Personally, whatever I can find inside the country, I buy inside the country. But some stuff I have to order from Germany/France/Italy when it's only available outside our borders.

So I don't know the right approach here, I can see value for both price trends for multiple reasons, unfortunately :) Wish I could give a simpler answer!

pcarmichael•2mo ago
Ok that should be in - if you view the price trend pages now there are different currency grouping options (with EUR being one of them). Hope this helps!
overfeed•3mo ago
Not parent, but logically the EU is a single market, so EU-wide prices are better, IMO.
epistasis•3mo ago
Even used memory has doubled in price. I was thinking of putting together a high-memory box for a side project, and reddit posts from a year ago all have memory at 1/2 to 1/3 of current ebay prices for the same part.
tempest_•3mo ago
Down stream this is driving up DDR4 demand as well :(
guywhocodes•3mo ago
I was looking at filling my EPYC server's empty slots; what I paid $90/stick for 2-3 years ago is now $430
Scoundreller•3mo ago
These price hikes do fun things to the whole market.

In one of the last GPU booms I sold some ancient video card (recovered from a PC they were literally going to put in the trash) for $50.

And it wasn’t because it was special for running vintage games. The people that usually went for first-rate GPUs went to second-rate, pushing the second-rate buyers to third-rate, creating a market for my fourth-rate GPU.

jdc0589•3mo ago
I picked a really bad time to start working on a DIY mini-NAS. A ram upgrade is more than what I paid for the whole Thinkcentre M720q.
HPsquared•3mo ago
Has the death of Moore's Law been officially announced yet?
renewiltord•3mo ago
Moore’s Law has nothing to do with price.
forinti•3mo ago
It does but indirectly. Less power and more integration mean things get cheaper to build and run.
charcircuit•3mo ago
>"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."

I would say that a claim about component cost has something to do with price.

jandrese•3mo ago
Gordon Moore died two years ago, so I'm not sure who would be the official for declaring it dead.

But there have been plenty of articles over the last decade saying that it was done around 2015 or so.

brador•3mo ago
Regular Monopoly/Duopoly like the storage market or Nepopoly like the GPU market?

Either way, without competition expect it to increase further.

markerz•3mo ago
It's intense market demand by people with lots of money against products that have a very long supply chain. Even with multiple sellers competing, this kind of demand is insane, and the buyers' pockets run deep.

The other way I look at this is that these companies have been collecting an insane amount of wealth and value over the last 2-3 decades, are finally in a situation where they feel threatened, and are willing to spend to survive. They have previously never felt this existential threat before. It's basically bidding wars on houses in San Francisco, but with all the wealthiest companies in the world.

tuhgdetzhh•3mo ago
I bet some are already buying the highest-capacity DDR5 DIMMs in bulk to later put them on eBay in the upcoming major DRAM shortage.
Havoc•3mo ago
I’ve been selling unused DDR4 on eBay. It’s not as profitable as one would think, tbh, even with elevated demand. I'm only making a profit on the ones I initially acquired second-hand.
tuhgdetzhh•3mo ago
I think people who bought a high-end Nvidia graphics card in the max memory config pre-AI-hype would have made a very decent deal on eBay. DRAM's turn is yet to come.
dist-epoch•3mo ago
> OpenAI's Stargate project to consume up to 40% of global DRAM output

https://www.tomshardware.com/pc-components/dram/openais-star...

> South Korean SK Hynix has exhausted all of its chip production for next year and plans to significantly increase investment, anticipating a prolonged "super cycle" of chips, spurred by the boom of artificial intelligence, it said on Wednesday after reporting a record quarterly profit.

https://en.ilsole24ore.com/art/korean-chip-race-sk-hynix-has...

> Adata chairman says AI datacenters are gobbling up hard drives, SSDs, and DRAM alike — insatiable upstream demand could soon lead to consumer shortages

https://www.tomshardware.com/tech-industry/big-tech/adata-ch...

lousken•3mo ago
Have we given up on edge AI this early?
recursive•3mo ago
Does edge AI require less RAM?
Macha•3mo ago
I would expect edge AI requires much more RAM at a global level due to less efficient utilisation.
lousken•3mo ago
Tons of resources are sitting unused within each and every computer sold today regardless of AI.
lousken•3mo ago
Less RAM? Not really. Less hardware? Yes. That machine is already in the wild, so why not give it a proper AI accelerator and more memory — the data transfer problem is also solved, since everything is on that machine.
recursive•3mo ago
In that case, this wouldn't be a signal that "we've" given up on edge AI.
sleepyguy•3mo ago
Manufacturers learned a valuable lesson a few years ago: overproduction leads to lower prices. Samsung was the first to address this issue by scaling back, and other manufacturers soon followed suit (collusion, cough cough). The past couple of years have been extremely profitable for the entire industry, and they’re not about to increase production and risk hurting their profits.

I suspect they would rather face shortages than satisfy market demand.

tehjoker•3mo ago
Lower prices are OK if they are selling more units; the question is whether price point × units is Pareto optimal.

Overproduction means unsold units, which is very bad: you pay a cost for every unsold unit.

Underproduction means internal processes are strained and customers are angry, but you get a higher price per unit... can you increase the price by more than you are underproducing?
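That trade-off is easy to put in numbers. A toy sketch with purely illustrative figures (the function and all prices/quantities are mine, not from the comment): unsold units are pure cost, unmet demand is forgone revenue, and scarcity lets the price rise.

```python
def profit(price: float, units_made: int, units_demanded: int,
           unit_cost: float) -> float:
    """Toy model: you sell at most what is demanded,
    but pay production cost on every unit made."""
    sold = min(units_made, units_demanded)
    return sold * price - units_made * unit_cost

# Overproduction: 120 made, only 100 demanded at $10, $6 unit cost
over = profit(10, 120, 100, 6)   # 100*10 - 120*6 = 280
# Underproduction: 80 made, scarcity pushes the price to $12
under = profit(12, 80, 100, 6)   # 80*12 - 80*6 = 480
print(over, under)               # 280.0-ish vs 480 -- scarcity wins here
```

Whether underproduction actually wins depends entirely on how much the price can rise, which is the commenter's open question.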

bob1029•3mo ago
If we want to engage with game theory here, I would argue that overproduction is a much safer bet than underproduction from the perspective of Samsung et al. Underproduction brings additional caveats that manifest as existential risks. For example, encouraging your customers to move to entirely different technologies or paradigms that completely obviate the need for your product in the first place. If you leave a big, expensive constraint in place for long enough, people will eventually find paths around it.

I think the Nintendo ecosystem has been a pretty good example of where intentional underproduction can backfire. Another example: the migration to SSDs was likely accelerated by the (forced) underproduction of spinning disks in 2011. We use SSDs for a lot of things that traditional magnetic media would be better at, simply because SSD supply has been so abundant for so long.

You can train your customers to stick with you by bathing them in product availability. Overproduction can be a good thing. Inventory can be a good thing. We've allowed a certain management class to terrorize us into believing this stuff is always bad.

tehjoker•3mo ago
Very good points. Though a "crisis of overproduction" can be an incipient spin into recession.
9rx•3mo ago
> I suspect they would rather face shortages then satisfy market demand.

Doubtful. A shortage is normally a scary prospect for a vendor. It means that buyers want to pay more, but something is getting in the way of the seller accepting that higher price. Satisfying market demand is the only way to maximize profitability.

Why do you think companies would prefer to make less profit here?

chronciger•3mo ago
> Why do you think companies would prefer to make less profit here?

Because if you make too much profit, you get regulated by government.

9rx•3mo ago
It's not the 1980s anymore. If you make too much profit nowadays you pull a John Deere and start crying to government that your customers aren't profitable enough (because you siphoned off all of their profit) and need a bailout so that they can pay even more for your product in the future.
KurSix•3mo ago
Yeah, it's easy to jump to the collusion theory, especially with this industry's, let's say, history. But honestly, I think it's less of an evil conspiracy and more just good old-fashioned fear mixed with inertia. These guys remember getting burned hard by oversupply cycles where they were left with mountains of useless chips. Nobody's gonna drop tens of billions on a new fab that could become a pumpkin in three years if the AI hype train just slows down a little

And on top of that fear, you have the pure technical reality: you can't just flip a switch and start pumping out wildly complex HBM instead of mass-market DDR5. That's like trying to retool a Toyota factory to build Bugattis overnight. So you get this perfect storm: a massive, near-vertical demand spike hits an industry that's both terrified of risk and physically incapable of moving fast. So yeah, they're absolutely milking the situation for all it's worth. But it's happening less because they're master villains and more because they're both scared and incredibly slow

bluedino•3mo ago
Maybe I can push for some HBM systems now
yread•3mo ago
To be fair, RAM was way too cheap. I got 128GB for a laptop for 300eur. That's ridiculous. Now it's a much more reasonable 720 eur (and sold out)
johnisgood•3mo ago
May you please help me out financially then, friend? ;P Willing to work for it, too!
confirmmesenpai•3mo ago
it's ridiculous only if you compare it with Apple RAM prices
ionelaipatioaei•3mo ago
I cannot express in words how much I hate this mentality.
Ekaros•3mo ago
I hope this AI craze crashes soon enough. Maybe then various things normalize in price again. And consumers get cheaper products with fewer limitations.
654wak654•3mo ago
Best we're getting is probably a stop to the price raises, but no price cuts. Kids will continue to grow up not knowing a $600 flagship GPU or a $1000 gaming PC.
add-sub-mul-div•3mo ago
That's exactly how it works. A whole generation is already unaware that you used to be able to buy PC games anonymously, offline, without a rent seeking middleman service.
nostrademons•3mo ago
I think there's always been a rent-seeking middleman service. In the 80s it was retail: you'd go to a physical computer store to buy a game for $50 (note: that's $150 inflation-adjusted, more expensive than most games today), and the retail store, the distributor, and the publisher would all take a cut. In the 2000s it was the developer's ISP, web developer, and credit card payment processor, which were non-trivial in the days before Wix and Stripe.

The shareware/unlock-code economy of the 90s was probably the closest you'd get to cutting out the middlemen, where you could download from some BBS or FTP server without the dev getting involved at all and then send them money to have them email you an unlock code, but it was a lot of manual work on the developer's part, and a lot of trust.

philipallstar•3mo ago
None of this is rent-seeking.
andreybaskov•3mo ago
Retail store literally had to pay rent to a landlord. How’s that not a rent seeking business?
philipallstar•3mo ago
That's not what rent-seeking is. Rent-seeking is charging for something that was free before, generally through seeking legal enforcement. If you think the concept of renting things is bad, you've missed the point somewhat.
candiddevmike•3mo ago
You used to be able to resell PC games too
littlestymaar•3mo ago
> credit card payment processor, which were non-trivial in the days before Wix and Stripe.

Stripe is way more expensive than regular payment processors. Convenient for sure, but definitely not cheap.

gordonhart•3mo ago
$1000 in 2010 is ~$1500 today — kids won't know these prices because the currency has been debased pretty rapidly in recent years.
NullPrefix•3mo ago
This ram price spike is literally part of the currency debasing
bozhark•3mo ago
Why does everyone pretend like prices are not post-pandemic gouged still?

Absolutely prices should adjust appropriately… once… oh never mind

littlestymaar•3mo ago
Pet peeve: Contrary to a persistent popular belief, inflation != currency debasement.

(You can have inflation while your currency goes up relative to all the others on the FX market, like what happened to the USD in the first half of 2022, or you can have massive inflation differences between countries sharing the same currency, as happened in the euro area between 2022 and today.)

gruez•3mo ago
>Pet peeve: Contrary to a persistent popular belief, inflation != currency debasement.

Not to mention that "debasement" doesn't make sense anymore given that there basically aren't any currencies on the gold standard anymore. At best you could call a pegged currency that was devalued as being debased (with the base being the pegged currency), but that doesn't apply to USD. "debasement" therefore is just a pejorative way saying "inflation" or "monetary expansion".

littlestymaar•3mo ago
I think it's fair to keep using debasement for the act of letting your currency go down against other currencies on the FX market.

> "inflation" or "monetary expansion".

This is my second pet peeve on the topic: inflation and growth of the money supply are independent phenomena. (They are only correlated in countries with high-inflation regimes and, hyperinflation aside, the causation is essentially reversed: the money supply grows because of inflation, with higher prices leading to an increase in loans.)

binarycrusader•3mo ago
It might be subjective, but doesn't this count at least partially as a currency on the gold standard?

https://en.wikipedia.org/wiki/Zimbabwean_ZiG

gruez•3mo ago
From wikipedia:

>A gold standard is a monetary system in which the standard economic unit of account is based on a fixed quantity of gold.

and

>The Zimbabwe Gold (ZiG; code: ZWG)[3] is the official currency of Zimbabwe since 8 April 2024,[2] backed by US$900 million worth of hard assets: foreign currencies, gold, and other precious metals.

>...

>Although the rate of devaluation of the ZiG may vary,[13] the ZiG has consistently lost value since its introduction, and its long-term prospects are dim so long as large grain imports continue and the government continues to overspend.

sounds like it's not "fixed" at all, and "backed by ... hard assets" just means it has central bank reserves, which most fiat currencies have.

binarycrusader•3mo ago
right, which is why I said partially...
zonkerdonker•3mo ago
It really is a damn shame, but before AI, it was cryptomining. Desktop GPU prices have been inflated to nonsense levels for gamers, to the point where console vs. PC isn't even really a question anymore.
Ekaros•3mo ago
And even with increased prices you often still get a paltry amount of RAM. All for market segmentation due to AI use cases. Which is bad, as requirements have crept up.
MyOutfitIsVague•3mo ago
Really frustrating for a hobbyist 3D artist. Rendering eats gobs of RAM for complex scenes. I'd really love a mid-level GPU with lots of VRAM for under $500. As is, I'm stuck rendering on CPU at a tenth the speed or making it work with compositing.
zargon•3mo ago
3D rendering can use multiple GPUs, right? Maybe pick up a couple of MI50 32GB cards off Alibaba. A couple of months ago they were $100 each, but it looks like they're up to ~$160 now.
some-guy•3mo ago
In some ways, though, visual fidelity has improved only marginally per year since the PS4/Xbone era. My GPUs have had much, much longer useful lives than in the 90s/early 2000s.
LaurensBER•3mo ago
Exactly, plus upscalers are pretty amazing. Upscaling from 1080p to 4K gets you 80-100% of the quality of native rendering at a far lower cost.

Now if only major studios would budget for optimizations..

autoexec•3mo ago
AMD just tried to get away with stopping support for cards that were still being sold new in stores. Nvidia cards are just getting worse and more expensive over time (https://www.xda-developers.com/shrinkflation-is-making-nvidi...).

Part of what made PC gaming in the late 90s/early 2000s so exciting was that the improvements were real and substantial instead of today where we're stuck with minor improvements including bullshit like inserting fake frames generated by AI, and the cards back then were usually pretty easy to get your hands on at a normal price. You might have had to occasionally beat your neighbors to a best buy, but you didn't have to compete with armies of bot scalpers.

0cf8612b2e1e•3mo ago
If you stay off the upgrade treadmill, you can game with a pretty dated card at this point. Sure, you can't turn on all of the shinies, but thanks to consoles, a playable build is quite attainable.
littlestymaar•3mo ago
If you're willing to accept the performance level of a console, then you can buy a second-hand 3060 for cheap.
nostrademons•3mo ago
Depends whether or not there's a big bubble burst that involves bankruptcies and Big Tech massively downscaling their cloud computing divisions. Most likely they'll just end up repurposing the compute and lowering cloud rates to attract new enterprise customers, but if you see outright fire sales from bankruptcies and liquidations, people will be able to pick up computer hardware at fire sale prices.
jonas21•3mo ago
For $50, kids these days can buy a Raspberry Pi that would have run circles around the best PC money could buy when I was a kid.

Or, for $300, you can buy an RTX 5060 that is better than the best GPU from just 6 years ago. It's even faster than the top supercomputer in the world in 2003, one that cost $500 million to build.

I find it hard to pity kids who can't afford the absolute latest and greatest when stuff that would have absolutely blown my mind as a kid is available for cheap.

sapiogram•3mo ago
> Or, for $300, you can buy an RTX 5060 that is better than the best GPU from just 6 years ago. It's even faster than the top supercomputer in the world in 2003, one that cost $500 million to build.

RTX 5060 is slower than the RTX 2080 Ti, released September 2018. Digital Foundry found it to be 4% slower in 1080p, 13% slower in 1440p: https://www.youtube.com/watch?v=57Ob40dZ3JU

nubinetwork•3mo ago
The Radeon 8500 from 2001 was even cheaper than that, roughly 300 USD... the voodoo 3 3500 from 1999 was roughly 200 USD... if you ask me, we don't need graphics chips as intense as we have now, but the way they crank them out every year and discontinue the older models ruins most of the value of buying a GPU these days.
embedding-shape•3mo ago
Is that really the cause of this price increase? I still don't understand if this price surge is specifically for the US (https://news.ycombinator.com/item?id=45812691) or if it's worldwide, I'm not sure I notice anything here in Southern Europe, so either that means it's lagging and I should load up RAM today, or this is indeed US-specific. But I don't know what's true.
confirmmesenpai•3mo ago
you should look more carefully, RAM prices are up across Europe, 40% or so
embedding-shape•3mo ago
I did (https://news.ycombinator.com/item?id=45812691), the RAM I bought in March 2024 currently costs about the same as when I bought it, seems the price stagnated rather than increased for that specific example.

Do you have some concrete examples of where I can look?

zrm•3mo ago
In the US some of it could be tariffs. Micron is a US company with some US fabs but most of theirs are in other countries and Samsung and Hynix are both South Korea.
walterbell•3mo ago
U.S. tariffs inadvertently kept prices low, due to stockpiling of memory when prices were cheap, before tariffs took effect. As that inventory is depleted, new supply chain purchases are much more expensive and subject to tariffs.
overfeed•3mo ago
> that means it's lagging and I should load up RAM today, or this is indeed US-specific. But I don't know what's true.

This is a global issue that is most severe in the US due to its share of hyperscalers and their, uh, scale. You may not feel the effects yet, but it is a matter of time until someone notices your market has a RAM glut while 30-55% of their orders aren't being fulfilled.

In all likelihood, the supply channels to your locality have a low turnover rate, and DRAM has a long shelf life. If prices stay high for long, it's going to impact prices when your retailers try to restock. If the price shock ends soon, your retailer may not even notice it. Whether you ought to buy or not depends on your outlook on how things will shake out.

KurSix•3mo ago
Even if the hype around LLMs dies down, the demand for AI compute won't disappear. It will just shift from giant language models to more specialized areas: computer vision, scientific computing (like AlphaFold), drug discovery. All of that requires massive amounts of hardware
ksec•3mo ago
Be careful what you wish for. Without the so-called AI craze you won't get enough money to fund the current 2-3 year cadence of leading-edge fab development.
cstuder•3mo ago
It feels like we're actually living in the Universal Paperclips universe.
Thev00d00•3mo ago
Prime time to build an AM4 system!
embedding-shape•3mo ago
And here I'm sitting with my AM4 system, debating whether to go AM5, sTR5, sTRX4, or something else when it's time for the next upgrade.
forinti•3mo ago
I'm still happy with my AM3.
embedding-shape•3mo ago
In the end I just need more available PCIe lanes (so I can chuck more disks in there) and ideally PCIe Gen 5, otherwise I don't have much reason to upgrade.
walterbell•3mo ago
https://www.reuters.com/world/china/chip-crunch-how-ai-boom-...

> spot prices of DRAM, used in various applications, nearly tripled in September from a year earlier.. improving profitability of non-HBM chips has helped fuel memory chipmakers' share price rally this year, with Samsung's stock up more than 80%, while SK Hynix and Micron shares have soared 170% and 140% respectively... industry is going through a classic shortage that usually lasts a year or two, and TechInsights is forecasting a chip industry downturn in 2027.

Micron has US memory semiconductor fab capacity coming online in 2027 through 2040s, based on $150B new construction.

Are some HBM chips idle due to lack of electrical power? https://www.datacenterdynamics.com/en/news/microsoft-has-ai-...

> Microsoft CEO Satya Nadella has said the company has AI GPUs sitting idle because it doesn’t have enough power to install them.

If the PC supply chain will be impacted by memory shortages until 2027, could Windows 10 security support be extended for 24 months to extend the life of millions of business PCs that cannot run Windows 11?

bigbadfeline•3mo ago
> Micron has US memory semiconductor fab capacity coming online in 2027 through 2040s, based on $150B new construction.

Yay, the public is on the hook for $150B of loans to be paid by inflationary pricing.

I guess you offered the news hoping prices will fall... In terms of real economic analysis there's a lot to say here, but let me point at only one of the many entry points of the rabbit hole:

"Microsoft CEO says the company doesn't have enough electricity to install all the AI GPUs in its inventory - 'you may actually have a bunch of chips sitting in inventory that I can’t plug in'"

https://www.tomshardware.com/tech-industry/artificial-intell...

Microsoft and all the other AI wannabes are hoarding GPUs and thus RAM; they hope to sell them back to you for the price of a subscription, which doesn't change the fact of speculative hoarding and trust-like behavior against the public.

The hoarding-and-inflation economy we live in is a weapon against the public. At the moment there's no visible force that isn't laboring diligently on enhancing that weapon, so the timeline for change is likely to stretch from far in the future to infinity... just hoping otherwise is futile.

If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electricity costs in order to cover the capex for new energy plants and transmission lines. In other words, you pay the price; they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as is being done elsewhere in the economy.

As I said, this is just scratching the surface, there's a lot more which cannot fit in a single comment.

Edit: actually not. The parent comment was edited after mine, to include a link to MS inadvertently admitting to the hoarding of GPUs and RAM.

hexbin010•3mo ago
> the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines.

This makes me so angry.

The private companies told governments they want money and the governments replied "sure we'll just steal it from citizens and lie and sell it as a tax, no problem. We'll just go hard on the net zero excuse lol" ??

gruez•3mo ago
>Yay, the public is on the hook for $150B of loans to be paid by inflationary pricing.

Where does it say it was funded by $150B of public loans?

>which doesn't change the fact of speculative hoarding

All investment resembles "speculative hoarding". You're pouring money into a project now with the expectation that it'll pay off decades later.

> and trust-like behavior against the public.

???

>If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines. In other words, you pay the price, they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as it's being done elsewhere in the economy.

Datacenters are actually associated with lower electricity costs in the US

https://www.economist.com/united-states/2025/10/30/the-data-...

bigbadfeline•3mo ago
> Where does it say it was funded by $150B of public loans?

let me repeat something you've already quoted

>> the public is on the hook for $150B of loans to be paid by inflationary pricing.

one more time: "to be paid by inflationary pricing"

> Datacenters are actually associated with lower electricity costs in the US.

"Associated" means these areas are getting preferential pricing to shift more of the cost to the public. Proves my point.

The actual truth, with numbers, just for 2024 and Virginia alone:

"Mike Jacobs, a senior energy manager at the Union of Concerned Scientists, last month released an analysis estimating that data centers had added billions of dollars to Americans’ electric bills across seven different states in recent years. In Virginia alone, for instance, Jacobs found that household electric bills had subsidized data center transmission costs to the tune of $1.9 billion in 2024."

https://www.commondreams.org/news/ai-data-center-backlash

Also:

"Over the last five years, commercial users including data centers and industrial users began drinking more deeply from the grid, with annual growth rising 2.6% and 2.1%, respectively. Meanwhile, residential use only grew by 0.7% annually."

https://techcrunch.com/2025/11/01/rising-energy-prices-put-a...

gruez•3mo ago
>let me repeat something you've already quoted

>>> the public is on the hook for $150B of loans to be paid by inflationary pricing.

That framing makes even less sense. Even if we grant that capital spending is inflationary, nobody thinks the public is "on the hook" for it or pays for it "by inflationary pricing". If I bought a box of eggs, it probably drives up the price of eggs by some minute amount in the aggregate, but nobody would characterize that as the public being "on the hook" for it, or as the public paying for it "by inflationary pricing". Same if I bought anything else supply-constrained, like an apartment or a GPU. Seemingly the only difference between those and whatever Micron is doing is that you don't like Micron and/or the AI bubble, whereas you at least tolerate me buying eggs, apartments, or GPUs, so your whole spiel about "paid by inflationary pricing" is just a roundabout way of saying you don't like Micron's and the AI companies' spending. I also disagree with people dropping $30K on Hermès handbags, but I wouldn't characterize buying them as "the public is on the hook for $30K to be paid by inflationary pricing".

>The actual truth, with numbers, just for 2024 and Virginia alone:

"actual truth"? That settles it, then.

On a more substantive note, since you clearly haven't bothered to look into either article to examine their methodology, here are the relevant snippets for your convenience:

>Mike Jacobs, a senior energy manager at UCS, uncovered these additional costs by analyzing last year’s filings from utilities in seven PJM states and identifying 130 projects that will connect private data centers directly to the high-voltage transmission system. Over 95% of the projects identified passed all of their transmission connection costs onto local people’s electricity bills, totaling $4.3 billion in costs previously undistinguished from other, more typical expenses to upgrade and maintain the electricity grid.

and

>The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers.

>chart 2: https://www.economist.com/content-assets/images/20251101_USC...

Looking at the two methodologies, The Economist's seems far more reasonable, because the UCS's methodology is basically guaranteed to come up with a positive number. It just counts how much money was spent on connecting datacenters and assumes household users are paying the entire bill. It doesn't account for different rates/fees paid by retail/household users, or for the possibility that datacenters could be paying more than their "fair share" of costs through other means (e.g. they might require disproportionately less infrastructure to service, but pay the same transmission rates as everyone else).

riskable•3mo ago
Hopefully this will put pressure on the market to produce much more efficient AI models. As opposed to bigger, then bigger, and then even BIGGER models (which is the current trend).

FYI: gpt-oss:120b is better at coding (in benchmarks and my own anecdotal testing) than gpt5-mini. More importantly, it's so much faster too. We need more of this kind of optimization. Note that gpt5-mini is estimated to be around ~150 billion parameters.

KronisLV•3mo ago
For what it’s worth, even the Qwen 30B model has its use cases. And as far as some of the better open models go, by now the GLM 4.6 355B model is largely better than the Qwen3 Coder 480B variant, so it seems that the models are getting more efficient across the board.
muldvarp•3mo ago
> We need more of this kind of optimization.

Who is the "we" in this sentence? The ultra-rich that don't want to pay white collar workers to build software?

The advantages of LLMs for software engineers are tiny (you might be more productive, but you don't get paid more) and the downsides range from bad to life-altering (you get to review AI slop all day, you lose your job).

bigbadfeline•3mo ago
> The ultra-rich that don't want to pay white collar workers to build software?

This is already a fact and it's set in stone - making AI cheaper won't change anything in that regard. However, a cheaper AI will allow the laid-off software engineers to use models independently of those firing them, and even compete on an equal footing.

muldvarp•3mo ago
Compete for what exactly? Under the assumption that AI agents will make human software engineering obsolete there won't be a market for you to compete in. Everyone that wants a piece of software will ask their AI agent to create it.

My ability to create software is only really useful to me because other people pay me for it. If AI agents take that from me, it won't matter that I can now create awesome software in minutes instead of weeks. The software was never really the thing that was useful for me.

bigbadfeline•3mo ago
> Under the assumption that AI agents will make human software engineering obsolete

That assumption is wrong: humans will always have to guide the bots to achieve human-worthy goals, be it as supervising engineers, small business owners, or a combination of the two. That helps competition and avoids lock-in.

estimator7292•3mo ago
Energy and GPU costs haven't moved the needle any, so I don't see any reason to expect that RAM costs will.
KurSix•3mo ago
You're right. And it's not just about parameter count. Efficiency is a full-stack problem: from the architecture (like MoE instead of dense transformers), to the inference techniques (speculative decoding), to the data formats (quantization down to INT4/FP4). The shortage will force everyone to optimize every step of the process, not just "add more layers".
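To make the INT4 point concrete, here's a toy sketch of symmetric per-tensor INT4 weight quantization in NumPy. This is illustrative only; real inference stacks use per-group scales and pack two 4-bit values per byte:

```python
import numpy as np

def quantize_int4(w: np.ndarray):
    """Symmetric per-tensor quantization to the signed 4-bit range [-8, 7]."""
    scale = np.max(np.abs(w)) / 7.0          # one float scale for the whole tensor
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

w = np.array([0.12, -0.7, 0.33, 0.05], dtype=np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize(q, scale)
err = float(np.max(np.abs(w - w_hat)))       # rounding error is bounded by scale / 2
```

Each weight now needs 4 bits instead of 32, which is exactly the kind of memory-footprint reduction a DRAM/HBM shortage rewards.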
Arch-TK•3mo ago
Server DRAM? More like all DRAM.
nubinetwork•3mo ago
Not just server memory, desktop memory has gone up for the same reason... it's all going to AI. Forget building a new gaming pc, or buying a laptop, or even an arm SBC, because the supply is just gone.
flamesofphx•3mo ago
I wonder if the old movie The Lawnmower Man is going to become a reference for AI... They might need a dam...
KurSix•3mo ago
This shortage is the best thing that could have happened for R&D in model efficiency. The "who has more parameters" race is about to hit the physical wall of hardware availability. Now the real race for efficiency begins: quantization, distillation, Mixture-of-Experts, new architectures.

Hardware constraints are the single biggest driver of software innovation.
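As one concrete example of why Mixture-of-Experts helps: a gating layer routes each token to only a few experts, so most of the model's parameters are never touched for a given token. A toy top-k router in NumPy (hypothetical sizes, not any particular model's architecture):

```python
import numpy as np

def moe_route(x: np.ndarray, gate_w: np.ndarray, k: int = 2):
    """Pick the top-k experts per token and softmax over just their logits."""
    logits = x @ gate_w                          # (tokens, n_experts) gating scores
    top = np.argsort(logits, axis=-1)[:, -k:]    # indices of the k largest logits
    sel = np.take_along_axis(logits, top, axis=-1)
    # softmax over only the selected experts, so each token's gates sum to 1
    gates = np.exp(sel - sel.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)
    return top, gates

rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 16))    # 4 tokens, hidden size 16
gate_w = rng.standard_normal((16, 8))    # gating weights for 8 experts
experts, gates = moe_route(tokens, gate_w, k=2)
```

With 8 experts and k=2, only a quarter of the expert parameters are active per token; that's the lever that lets total parameter count grow without compute (and, per expert, memory bandwidth) growing with it.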

kingstnap•3mo ago
This is a silly take. From day zero, everyone and their mother has been trying every idea under the sun to get inference costs down.

There are a huge number of variants of attention schemes, seemingly thousands, with more posted on arXiv and blogs every single day. There are similarly a huge number of papers and talks on quantization, codebooks, number formats, everything: genuinely covering everything from analog compute to training lookup tables in FPGAs.

The AI model architects similarly are not sleeping on this. They genuinely take a bottom-up approach to the design of their models and make sure that every mm² of die area is being used.

And top-down of course has extreme influence, with people spinning up entire hardware companies like Groq and Cerebras to do AI as fast and efficiently as possible. Everyone wants to be the shovel seller.

The idea that anyone is sleeping on increasing inference efficiency when $$$$$$$$$ are being spent is ridiculous.

And genuinely, they have made massive strides. Some models cost almost nothing, like GPT-OSS 120B, which can be used to create slop at the speed of light.