In one of the last GPU booms I sold some ancient video card (recovered from a PC they were literally going to put in the trash) for $50.
And it wasn’t because it was special for running vintage games. The people who usually went for 1st-rate GPUs went for 2nd-rate, pushing the 2nd-rate buyers to 3rd-rate and creating a market for my 4th-rate GPU.
I would say that a claim about component cost has something to do with price.
But there have been plenty of articles over the last decade saying that it was done around 2015 or so.
Either way, without competition expect it to increase further.
The other way I look at this is that these companies have been accumulating an insane amount of wealth and value over the last 2-3 decades, are finally in a situation where they feel threatened, and are willing to spend to survive. They have never felt this kind of existential threat before. It's basically bidding wars on houses in San Francisco, but with all the wealthiest companies in the world.
https://www.tomshardware.com/pc-components/dram/openais-star...
> South Korean SK Hynix has exhausted all of its chip production for next year and plans to significantly increase investment, anticipating a prolonged "super cycle" of chips, spurred by the boom of artificial intelligence, it said on Wednesday after reporting a record quarterly profit.
https://en.ilsole24ore.com/art/korean-chip-race-sk-hynix-has...
> Adata chairman says AI datacenters are gobbling up hard drives, SSDs, and DRAM alike — insatiable upstream demand could soon lead to consumer shortages
https://www.tomshardware.com/tech-industry/big-tech/adata-ch...
I suspect they would rather face shortages than satisfy market demand.
Overproduction means unsold units, which is very bad: you pay a cost for every unsold unit.
Underproduction means internal processes are strained and customers are angry, but you get a higher price per unit... can you raise the price by more than you cut production?
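A rough way to frame that last question: with a constant-elasticity demand curve, cutting output raises the market-clearing price, and whether total revenue rises depends entirely on how inelastic demand is. A minimal sketch in Python (the demand parameters and the 20% cut are made-up numbers, purely for illustration):

```python
# Toy model: constant-elasticity demand Q = A * P^(-eps), so the market-clearing
# price for a given supply Q is P = (A / Q)^(1 / eps).
# Revenue is then R = P * Q = A^(1/eps) * Q^(1 - 1/eps), which rises when you
# cut output only if demand is inelastic (eps < 1).

def revenue(q, a=100.0, eps=0.5):
    """Revenue at output q under demand Q = a * P^(-eps)."""
    price = (a / q) ** (1.0 / eps)
    return price * q

for eps in (0.5, 1.5):                 # inelastic vs. elastic demand
    full = revenue(100, eps=eps)       # satisfy the whole market
    short = revenue(80, eps=eps)       # underproduce by 20%
    verdict = "pays" if short > full else "does not pay"
    print(f"eps={eps}: full={full:.0f}, short={short:.0f} -> underproducing {verdict}")
```

In the inelastic case the 20% cut raises revenue; in the elastic case it doesn't. So the answer hinges on whether memory buyers in a shortage behave inelastically, which the AI-datacenter demand quoted upthread arguably suggests they do.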
I think the Nintendo ecosystem has been a pretty good example of where intentional underproduction can backfire. Another example might be that the migration to SSDs was likely accelerated by the (forced) underproduction of spinning disks in 2011. We use SSDs for a lot of things that traditional magnetic media would be better at simply because SSD supply has been so abundant for so long.
You can train your customers to stick with you by bathing them in product availability. Overproduction can be a good thing. Inventory can be a good thing. We've allowed a certain management class to terrorize us into believing this stuff is always bad.
Doubtful. A shortage is normally a scary prospect for a vendor. It means that buyers want to pay more, but something is getting in the way of the seller accepting that higher price. Satisfying market demand is the only way to maximize profitability.
Why do you think companies would prefer to make less profit here?
Because if you make too much profit, you get regulated by government.
The shareware/unlock-code economy of the 90s was probably the closest you'd get to cutting out the middlemen, where you could download from some BBS or FTP server without the dev getting involved at all and then send them money to have them email you an unlock code, but it was a lot of manual work on the developer's part, and a lot of trust.
Stripe is way more expensive than regular payment processors. Convenient for sure, but definitely not cheap.
Absolutely prices should adjust appropriately… once… oh never mind
(You can have inflation while your currency goes up relative to all the others on the FX market, like what happened to the USD in the first half of 2022, or you can have massive inflation differences between countries sharing the same currency, as happened in the euro area between 2022 and today.)
Not to mention that "debasement" doesn't make sense anymore given that there basically aren't any currencies on the gold standard anymore. At best you could describe a pegged currency that was devalued as debased (with the base being the currency it was pegged to), but that doesn't apply to USD. "Debasement" is therefore just a pejorative way of saying "inflation" or "monetary expansion".
> "inflation" or "monetary expansion".
This is my second pet peeve on the topic: inflation and growth of the money supply are independent phenomena. (They are only correlated in countries with high-inflation regimes and, hyperinflation aside, the causation is essentially reversed: the money supply grows because of the inflation, with higher prices leading to an increase in loans.)
Or, for $300, you can buy an RTX 5060 that is better than the best GPU from just 6 years ago. It's even faster than the top supercomputer in the world in 2003, one that cost $500 million to build.
I find it hard to pity kids who can't afford the absolute latest and greatest when stuff that would have absolutely blown my mind as a kid is available for cheap.
RTX 5060 is slower than the RTX 2080 Ti, released September 2018. Digital Foundry found it to be 4% slower in 1080p, 13% slower in 1440p: https://www.youtube.com/watch?v=57Ob40dZ3JU
Do you have some concrete examples of where I can look?
> spot prices of DRAM, used in various applications, nearly tripled in September from a year earlier... improving profitability of non-HBM chips has helped fuel memory chipmakers' share price rally this year, with Samsung's stock up more than 80%, while SK Hynix and Micron shares have soared 170% and 140% respectively... industry is going through a classic shortage that usually lasts a year or two, and TechInsights is forecasting a chip industry downturn in 2027.
Micron has US memory semiconductor fab capacity coming online from 2027 through the 2040s, based on $150B of new construction.
Are some HBM chips idle due to lack of electrical power? https://www.datacenterdynamics.com/en/news/microsoft-has-ai-...
> Microsoft CEO Satya Nadella has said the company has AI GPUs sitting idle because it doesn’t have enough power to install them.
If the PC supply chain will be impacted by memory shortages until 2027, could Windows 10 security support be extended for 24 months to extend the life of millions of business PCs that cannot run Windows 11?
Yay, the public is on the hook for $150B of loans to be paid by inflationary pricing.
I guess you offered the news hoping prices will fall... In terms of real economic analysis there's a lot to say here, but let me point at only one of the many entry points of the rabbit hole:
"Microsoft CEO says the company doesn't have enough electricity to install all the AI GPUs in its inventory - 'you may actually have a bunch of chips sitting in inventory that I can’t plug in'
https://www.tomshardware.com/tech-industry/artificial-intell...
Microsoft and all other AI wannabes are hoarding GPUs, and thus RAM; they hope to sell them back to you at the price of a subscription, which doesn't change the fact of speculative hoarding and trust-like behavior against the public.
The hoarding and inflation economy we live in is a weapon against the public. At the moment there's no visible force that isn't laboring diligently on the enhancement of that weapon, so the timeline of change is likely to stretch somewhere between the far future and infinity... just hoping otherwise is futile.
If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines. In other words, you pay the price, they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as it's being done elsewhere in the economy.
As I said, this is just scratching the surface, there's a lot more which cannot fit in a single comment.
Edit: actually not. The parent comment was edited after mine, to include a link to MS inadvertently admitting to the hoarding of GPUs and RAM.
This makes me so angry.
The private companies told governments they want money and the governments replied "sure we'll just steal it from citizens and lie and sell it as a tax, no problem. We'll just go hard on the net zero excuse lol" ??
Where does it say it was funded by $150B of public loans?
>which doesn't change the fact of speculative hoarding
All investment resembles "speculative hoarding". You're pouring money into a project now with the expectation that it'll pay off in decades.
> and trust-like behavior against the public.
???
>If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines. In other words, you pay the price, they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as it's being done elsewhere in the economy.
Datacenters are actually associated with lower electricity costs in the US
https://www.economist.com/united-states/2025/10/30/the-data-...
let me repeat something you've already quoted
>> the public is on the hook for $150B of loans to be paid by inflationary pricing.
one more time: "to be paid by inflationary pricing"
> Datacenters are actually associated with lower electricity costs in the US.
"Associated" means these areas are getting preferential pricing to shift more of the cost to the public. Proves my point.
The actual truth, with numbers, just for 2024 and Virginia alone:
"Mike Jacobs, a senior energy manager at the Union of Concerned Scientists, last month released an analysis estimating that data centers had added billions of dollars to Americans’ electric bills across seven different states in recent years. In Virginia alone, for instance, Jacobs found that household electric bills had subsidized data center transmission costs to the tune of $1.9 billion in 2024."
https://www.commondreams.org/news/ai-data-center-backlash
Also:
"Over the last five years, commercial users including data centers and industrial users began drinking more deeply from the grid, with annual growth rising 2.6% and 2.1%, respectively. Meanwhile, residential use only grew by 0.7% annually."
https://techcrunch.com/2025/11/01/rising-energy-prices-put-a...
>>> the public is on the hook for $150B of loans to be paid by inflationary pricing.
That framing makes even less sense. Even if we grant that capital spending is inflationary, nobody thinks the public is "on the hook" or pays for it "by inflationary pricing". If I bought a box of eggs, it probably drives up the price of eggs by some minute amount in the aggregate, but nobody would characterize that as the public being "on the hook" for it, or as the public paying for it "by inflationary pricing". Same if I bought anything else supply constrained, like an apartment or a GPU. Seemingly the only difference between those and whatever Micron is doing is that you don't like Micron and/or the AI bubble, whereas you at least tolerate me buying eggs, apartments, or GPUs, so your whole spiel about "paid by inflationary pricing" is just a roundabout way of saying you don't like Micron/AI companies' spending. I also disagree with people dropping $30K on Hermès handbags, but I wouldn't characterize buying them as "the public is on the hook for $30K to be paid by inflationary pricing".
>The actual truth, with numbers, just for 2024 and Virginia alone:
"actual truth"? That settles it, then.
On a more substantive note, since you clearly haven't bothered to look into either article to examine their methodology, here are the relevant snippets for your convenience:
>Mike Jacobs, a senior energy manager at UCS, uncovered these additional costs by analyzing last year’s filings from utilities in seven PJM states and identifying 130 projects that will connect private data centers directly to the high-voltage transmission system. Over 95% of the projects identified passed all of their transmission connection costs onto local people’s electricity bills, totaling $4.3 billion in costs previously undistinguished from other, more typical expenses to upgrade and maintain the electricity grid.
and
>The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers.
>chart 2: https://www.economist.com/content-assets/images/20251101_USC...
Looking at the two different methodologies, The Economist's seems far more reasonable, because the UCS methodology is basically guaranteed to come up with a positive number. It just counts how much money was spent on connecting datacenters and assumes household users are paying the entire bill. It doesn't account for different rates/fees paid by retail/household users, or for the possibility that datacenters could be paying more than their "fair share" of costs through other means (e.g. they might require disproportionately less infrastructure to service, but pay the same transmission rates as everyone else).
"Eggs"? "Eggs" are the same as "apartment or GPU"? You display all of the comprehension abilities of an LLM... or the mainstream economist "teaching" it.
Semiconductor capex is huge compared to, eh, "eggs", and it must be paid off as part of pricing. An artificial jump in demand, as from hoarding, makes the capex artificially high and the pricing inflationary.
Also, hoarding semiconductors (private NPUs like TPUs, Trainiums, etc., plus stocking up on hard-to-obtain GPUs) reduces competition, and via renting, the respective services can extract the inflated capex plus high profits.
FYI: gpt-oss:120b is better at coding (in benchmarks and my own anecdotal testing) than gpt5-mini. More importantly, it's so much faster too. We need more of this kind of optimization. Note that gpt5-mini is estimated to be around 150 billion parameters.
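If you want to reproduce that kind of comparison yourself, here's a minimal sketch. It assumes gpt-oss:120b is served locally behind an OpenAI-compatible endpoint (for example Ollama's default http://localhost:11434/v1) and that an OPENAI_API_KEY is set for the hosted model; the model names, endpoint, and prompt are assumptions, not a definitive setup:

```python
# Hedged sketch: send the same coding prompt to a local gpt-oss:120b (assumed to
# be served via an OpenAI-compatible server such as Ollama) and to gpt-5-mini.
import time
from openai import OpenAI

PROMPT = "Write a Python function that parses an ISO 8601 duration string."

backends = [
    ("gpt-oss:120b", OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")),
    ("gpt-5-mini", OpenAI()),  # reads OPENAI_API_KEY from the environment
]

for model, client in backends:
    start = time.time()
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    print(f"--- {model} ({time.time() - start:.1f}s) ---")
    print(resp.choices[0].message.content[:400])
```

Wall-clock latency like this conflates local hardware with API overhead, so treat the speed comparison as anecdotal, much like the parent's.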
Who is the "we" in this sentence? The ultra-rich that don't want to pay white collar workers to build software?
The advantages of LLMs are tiny for software engineers (you might be more productive, but you don't get paid more) and the downsides range from bad to life-altering (you get to review AI slop all day, you lose your job).
This is already a fact and it's set in stone - making AI cheaper won't change anything in that regard. However, a cheaper AI will allow the laid-off software engineers to use models independently of those firing them, and even compete on an equal footing.
https://pcpartpicker.com/trends/price/memory/
StillBored•2h ago
RAM usage for a lot of workloads scales with core/thread count, and my general rule of thumb is that 1G/thread is not enough, 2G/thread will mostly work, and 4G/thread is probably too much, but your disk cache will be happy. Also, the same applies to VMs, so if you're hosting a VM and give it 16 threads, you probably want at least 16G for the VM. The 4G/thread then starts to look pretty reasonable.
Just building a lot of open-source projects with `make -j32`, you're going to be swapping if you only have 1G/thread. This rule then becomes super noticeable when you're on a machine with 512G of RAM and 300+ threads, because your builds will OOM.
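To make the arithmetic behind that rule of thumb concrete, here's a tiny sketch (the GB-per-thread thresholds are just the heuristics above, not measured values):

```python
# Rough sizing helper for the rule of thumb above:
# ~1 GB/thread is tight, ~2 GB/thread mostly works, ~4 GB/thread leaves plenty
# of headroom (the surplus mostly ends up as disk cache).

def ram_estimate_gb(threads, gb_per_thread):
    """RAM suggested for a build box or VM host with `threads` hardware threads."""
    return threads * gb_per_thread

for threads in (16, 32, 320):
    tight, workable, roomy = (ram_estimate_gb(threads, g) for g in (1, 2, 4))
    print(f"{threads:>3} threads: tight={tight}G, workable={workable}G, roomy={roomy}G")
```

At 320 threads, 1G/thread is only 320G, so a 512G box sits between "tight" and "workable", which lines up with the OOMing builds.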
embedding-shape•3h ago
Is the same doubling happening world-wide or is this US-specific, I guess is my question?
Edit: one data point, I last bought 128GB of RAM in March 2024 for ~€536; similar ones right now cost ~€500, but maybe the time range is too long.
embedding-shape•2h ago
Yeah, that was my hunch, that something like that was going on. Thanks for clarifying.
numpad0•1h ago
[1]: https://kakaku.com/item/K0001448114/pricehistory/ (archive: https://archive.is/CHLs2)