I got two sticks of 16GB DDR4 SODIMM for €65.98 back in February. The same two sticks in the same store now cost €186.
Props on building a PC with your kid, I have very fond memories of doing that with my dad. Have fun!
I'm now seeing $480 CAD for a single stick.
Bad time if you need to build a computer.
Looking at it optimistically, you're probably going to find DDR5 at bargain prices again in 2027.
When do you think prices will recede again?
And because everything needs memory, expect all electronics to be ~20%-80% more expensive in 2027 compared to today; naturally this includes the profit margin.
And naturally every regulation that companies don't like will supposedly be at fault for this (e.g. right to repair).
At least, that is wild speculation on my side.
People just have to wait. As prices are sky high, production capacity will likely increase. Some AI companies will go bust. Demand will plummet and we will buy RAM for pennies while the market consolidates.
https://investors.micron.com/news-releases/news-release-deta...
Helps that Apple's SoC has the RAM on the main die itself. They're probably immune from these price hikes, but a lot of the PC/Windows vendors would be hit, which would only make Apple's position even stronger.
If Apple is insulated it is likely because Apple signs big contracts for large supply and manufacturers would prefer to be insulated from short-term demand shocks and have some reliability that their fabs can keep running and producing profitable chips.
And in this article you can see a photo of the memory chips attached outside the Apple silicon itself: https://www.gizmochina.com/2020/11/19/apple-mac-mini-teardow...
Big manufacturers also order their DRAM in advance with contractually negotiated pricing. They're not paying these spot market prices for every computer they ship.
I guess I'm glad I bought when I did; I didn't realize how good a deal I was getting.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
On a PC you may have the bright idea to open a browser along with the game for walkthroughs/hints. Or Discord to chat with your friends while gaming.
Due to JavaScript bloat, your working set size goes from 16 to 48-64 GB in a jiffy.
You do have the option to open up Discord voice chats on PS5. Amazing what Discord could do when forced to actually write something efficient.
YouTube also exists as an app, and maybe you can trick the heavily gimped built-in browser into going there as well, although last I checked it wasn't trivial.
Maybe 6 once. Try not to leave it for weeks displaying the memes/cat photos channels…
Can’t use a PS2 controller to play a PS2 game on a PS2 without the PS2 console.
Whether this is still true, I don’t know. I do know that the PS5 with an optical drive costs $100 more than the digital edition. I also know that the drive does not cost $100, and I sincerely doubt the labor makes up the difference.
So maybe I talked myself out of my whole point.
The PS5 has 16GB of RAM. One can buy 16GB of RAM for ~$100 [1].
[1] https://pcpartpicker.com/product/9fgFf7/kingston-fury-beast-...
But since it's 16 GB, the comparison doesn't really make sense.
Gaming consoles are something people buy. Any parent or gamer has an idea what they cost.
People do not buy London buses themselves.
I wonder what you'd think if bus tires exploded in price and started costing 0.25 London buses per tire.
That's an analogy: a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the conclusions of the article more easily.
If LLMs' utility continues to scale with size (which seems likely as we begin training embodied AI on a massive influx of robotic sensor data) then it will continue to gobble up memory for the near future. We may need both increased production capacity _and_ a period of more efficient software development techniques as was the case when a new 512kb upgrade cost $1,000.
But yes we're going to need more fabs for sure
If the shortage of RAM is because of AI (so servers/data centers I presume?), wouldn't that mean the shortage should be localized to RDIMM rather than the much more common UDIMM that most gaming PCs use? But it seems to me like the pricing is going up more for UDIMM than RDIMM.
Blends also vary a bit between winter and summer; basically, in winter they can get away with putting in a bit more volatile compounds because it's colder.
Which, every modern ECU will do automatically based on output from the knock sensors.
In theory, your average Camry running on 87 is pulling spark timing to ride the edge of knock for best fuel efficiency by running lean, but how much? It was designed to be safe on even kinda shitty gas that dips below 87 octane at points, and the ECU is going to err on the side of caution.
That naturally aspirated 2AR-FE in a Camry does not have the ability to compress harder, so if you put 93 in it, it may only be able to "utilize" the extra knock resistance up to say 89 by advancing spark timing.
Meanwhile your average Golf TSI probably can. The VW GTI I have demonstrably gets better gas mileage on 93 octane, even though it is "rated" for 87 octane (and therefore has a lower mpg claim than it is capable of). This was an engine that was previously rated at 91 octane; it nerfs itself so hard on 87 that it is dramatically easier to stall, and the power figures are quoted at 91 octane anyway.
You are almost certainly spending more money on gas even if you eke out a percentage point or two extra mpg on higher octane fuels, as they are priced at higher margins and have lower scale.
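To make that concrete, here's a minimal cost-per-mile sketch in Python; every price and the mpg gain below are assumed placeholder numbers, not measurements:

  # Does a 1-2% mpg gain pay for premium? All numbers are assumptions.
  regular_price = 3.20   # $/gal for 87 octane (assumed)
  premium_price = 4.00   # $/gal for 93 octane (assumed markup)
  base_mpg = 32.0        # assumed mpg on 87
  mpg_gain = 0.02        # a generous 2% improvement on 93

  cost_87 = regular_price / base_mpg
  cost_93 = premium_price / (base_mpg * (1 + mpg_gain))
  print(f"87: ${cost_87:.4f}/mi, 93: ${cost_93:.4f}/mi")
  # 87: $0.1000/mi, 93: $0.1225/mi -- premium only wins if the mpg
  # gain roughly matches the price gap (~25% here).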
Wouldn't that mean that a shortage of DRAM chips should cause price differences in all of them? Not sure that'd explain why RDIMM prices aren't rising as sharply as UDIMM prices. That the fab and assembly lines have transitioned into making other stuff would explain the difference, though, as bradfa mentioned in their reply.
The manufacturers make the individual chips, not the modules (DIMMs). (EDIT: Some companies that make chips may also have business units that sell DIMMS, to be pedantic.)
The R in RDIMM means register, aka buffer. It's a separate chip that buffers the signals between the memory chips and the controller.
Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
It can be confusing. The key thing to remember is that the price is driven by the price of the chips. The companies that make DIMMs are buying chips in bulk and integrating them on to PCBs.
Quite a few unbuffered designs in the past had a "missing chip". If you ever wondered why a chip was missing on your stick, it's missing ECC. Don't know if it's still the case with DDR5 though.
Also, with DDR5 each stick is actually 2 channels, so ECC needs 2 extra dies.
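A minimal sketch of that die math, assuming x8-wide chips and side-band ECC (8 extra bits per 32-bit subchannel), which is my understanding of the common desktop DDR5 layout:

  # DDR5 DIMMs are two independent 32-bit subchannels; side-band ECC
  # adds 8 bits to each. Assumes x8 (8-bit-wide) DRAM chips.
  chip_width = 8
  subchannels = 2
  data_bits_per_sub = 32
  ecc_bits_per_sub = 8

  data_chips = subchannels * data_bits_per_sub // chip_width  # 8
  ecc_chips = subchannels * ecc_bits_per_sub // chip_width    # 2
  print(f"{data_chips} data chips + {ecc_chips} ECC chips per stick")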
Was reading a series of displeased posts about it. Can't seem to find it now.
If so, that's terrible news. It was already difficult enough to find ECC RAM for "workstation" class machines (i.e.: High end, non-server CPUs that support ECC such as AMD Threadripper).
That the manufacturers are scumbags is the more likely answer.
Anyway, that's the kind of market that governments always need to act upon and either supply directly or regulate intensively.
Maximizing profit is the only sane way to play a rigged game
But my guess is that this shortage is short-lived (mostly because of the threat above). There's no OPEC for tech.
That companies try their best to play the bad cards they have been dealt should be expected, not shocking. The money system simply cannot express the concept of surplus capital or abundance. Positive interest means capital is scarce, so capital must be made scarce even if there is abundance.
Before you come up with the argument that the interest rate is supposed to reflect a market property and therefore does not force itself upon the market, remember that I said that there is an artificial restriction in the money system that prevents the state of the real market to be expressed. The non-profit economy has never had a chance to exist, because our tools are too crude.
The non-profit economy includes resilient production with slight/minor overproduction.
Think about how stupid the idea of a guaranteed 0% yield bond is (aka cash). The government obligates itself to accept an infinite amount of debt if the real return on capital would ever fall negative. No wonder it has an incentive to inflate the value of the bond away.
I am hoping some of that Clayton Christensen disruption the tech theocracy keep preaching about comes along with some O(N) decrease in transformer/cDNN complexity that disrupts the massive server farms required for this AI boom/bubble thing.
Games were always going to go 3D sooner or later. The real pressure of the high-volume competitive market got us more and more capable chips, until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have managed.
Yes. That is my point. The customers willing to pay the high initial R&D costs opened up the potential for wider adoption. This is always the case.
Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.
No question that chip companies are happy to take consumers money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.
Many 3D games like Doom, Quake, Flight Unlimited, etc. ran purely on software rendering, since CPUs were already providing enough oomph for fairly useful 3D graphics in the mid 90s. CPU power was enough, but consoles/arcades showed that there was more to be gotten (nothing was hindering games at that point).
And already there, the capital investment for game consoles (Atari, NES, SNES, PS1, PS2, etc.) and arcade games (like the above-mentioned 3D games) was big enough to fund custom chipsets not used or purposed for anything else. (I also think that in the 80s/90s the barrier to entry for making competitive custom chips was a tad lower; just consider the Cambrian explosion of firms making x86 and later ARM chips during the 90s.)
Yes, there were vendors that focused on the high-end commercial customers, and yes, many alumni of those firms did contribute a ton of expertise towards what we have today.
But if you look at what companies survived and pushed the envelope in the longer run, it was almost always companies that competed in the consumer market, and it was only when those consumer chips needed even more advanced processing that we reached the point where the chips became capable of NNs.
In fact I'd say that had the likes of SGI prevailed, we would've had to wait longer for our GPU revolution. Flight simulators, etc. were often focused on "larger/detailed" worlds; PS2-era chips with higher polycounts and more memory would have been fine for simulator developers for a long time (since that level of detail in a military scenario would have been fine).
Leisure games have always craved fidelity on a more "human" level; to implement "hacks" for stuff like custom dynamic lighting models, then global illumination, subsurface scattering, etc., we've needed arbitrary programmability, since the raw power wasn't there (the most modern raytracing chips are _starting_ to approach that level without too-ugly hacks).
Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.
What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.
There's a lot more of us than them.
There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.
The iPhone isn't exactly a consumer computation device. From that perspective, it does less work at a higher cost.
(I remember the huge window in which phone companies desperately put out feature phones with sub-par touch screens, completely missing the value to consumers. The iPod Touch should've been warning enough... and should've been (one of) my signal(s) to buy Apple stock, I guess :-)
3Dfx was not the inventor of the GPU. There’s a long history of GPU development for corporate applications.
The iPhone wasn’t the first mobile phone. Early mobile phones were very expensive and targeted at businesses who wanted their executives in touch.
You’re still thinking from a consumer-centric view. Zoom out and those consumer companies were not the first to develop the products. You didn't even think about the actual originators of those types of products because you don’t see them as a consumer.
Arguably we don't. Most of the improvements these days seem to be on the GPGPU side with very little gains in raster performance this decade.
I have a flagship 7-8 year old GPU in one machine and a mid-level modern GPU in another.
It’s flat out wrong to claim “very little gains” during this time. The difference between those two GPUs is huge in games. The modern GPU also does it with far less power and noise.
I can’t understand this HN mentality that modern hardware isn’t fast or that we’re not seeing gains.
HN is strange. I have an old gaming build from 7-8 years ago and while it can do high end games on low settings and resolution, it doesn’t hold a candle to even a mid-range modern build.
“viable” is doing a lot of work in that claim. You can tolerate it at low res and settings and if you’re okay with a lot of frame rate dips, but nobody is going to mistake it for a modern build.
You’re also exaggerating how fast video cards became obsolete in the past. Many of us gamed just fine on systems that weren’t upgraded for 5-6 years at a time.
And back in the early 2000s, even bleeding edge current-year rigs would struggle with new games like Doom 3, Far Cry, Crysis, and so on. Hardware was advancing so rapidly that games were being built in anticipation of upcoming hardware, so you had this scenario where high end systems bought in one year would struggle with games released that year, let alone systems from 5-6 years prior.
Obviously if you're referencing CRPGs and the like, then yeah - absolutely anything could run them. The same remains even more true today. Baldur's Gate 3's minimum requirement is a GTX 970, a card more than 11 years old. Imagine a 1989 computer trying to run Baldur's Gate 2!
A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a Time Machine were available, we could have brought like 4 loaded MacBook Pros back and replaced a $1M HP 3PAR ssd array :)
Hard disagree. A $600 Mac Mini with 16GB of RAM runs everything insanely faster than even my $5000 company-purchased developer laptops from 10 years ago. And yes, even when I run Slack, Visual Studio Code, Spotify, and a gazillion Chrome tabs.
The HN rhetoric about modern computing being slow is getting strangely disconnected from the real world. Cheap computers are super fast like they've never been before, even with modern software.
People ran multiple browser windows, a 3D video game, irc (chat application), teamspeak/ventrilo (voice chat) and winamp (music) all at once back in the early 2000s. This is something an 8 year old phone can do these days.
Bullshit. It was cramped and I wasn't able to do half of what I was wanting to actually do. Maybe it was plenty for your usecases, but such a small amount of memory was weak for my needs in the late 90s and 2000s. 64MB desktops struggled to handle the photo manipulations I wanted to do with scanned images. Trying to do something like edit video on a home PC was near impossible with that limited amount of memory. I was so happy when we managed to get a 512MB machine a few years later, it made a lot of my home multimedia work a lot better.
Besides, you just said you only needed 512MB, which is still nothing these days.
I didn't say I "only needed 512MB", only that things were a lot better once we got a 512MB machine. Things continued to get massively better as I upgraded to a 1GB machine, an 8GB machine, etc.
> I'm talking in general
Isn't doing some light picture editing/organizing, playing back multimedia, etc. pretty dang general computer use these days? Or what, is "general" computer usage entirely limited to 80 column text manipulation? You'd have a hard time even just keeping my displays drawn with 64MB of memory at the resolutions and bit depths and multiple desktops that are common.
I play around with retro computers (especially the early/mid 90s, for that nostalgia) and I'm constantly reminded of how little memory we really had to play with back then, and these are pretty much fully loaded home desktop machines. Have a Word document open and you're trying to play back an MP3 and have a couple browser windows open? Oof, good luck! You want to stream a video? I hope you like 10 FPS at 320x240! Opening one photo from my camera today will use half that memory before it's even hit the framebuffer.
That an 11-year-old PC can keep up today (with or without an upgrade) is evidence that systems are keeping up with software bloat just fine. :)
Compute is cheaper than ever. The ceiling is just higher for what you can buy.
Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.
This is like the guy I know who complains that pickup trucks are unfairly priced because a Ford F-150 has an MSRP of $80,000. It doesn't matter how many times you point out that the $80K price tag only applies to the luxury flagship model, he anchors his idea of how much a pickup truck costs to the highest number he can see.
Computing is cheaper than ever. The power level is increasing rapidly, too. The massive AI investments and datacenter advancements are pulling hardware development forward at an incredible rate and we're winning across the board as consumers. You don't have to buy that top of the line GPU nor do you have to max out the RAM on your computer.
Some times I think people with this mentality would be happier if the top of the line GPU models were never released. If nVidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away even though we're not actually better off with fewer options.
So no one makes a $25k model.
-the whole reason why the GPU is $2000 is because of said AI bubble sucking up wafers at TSMC or elsewhere, with a soupçon of Jensen's perceived monopoly status...
-for a good part of the year, you could not actually buy said $2000 GPU (I assume you are referring to the 5090) also because of said AI bubble
(granted, while Jensen does not want to sell me his GPU, I would like to point out that Tim Cook has no problem taking my money).
on that point, I can go and buy a Ford F-150 tomorrow. Apparently, per the article, I would have problems buying bog-standard DDR5 DIMMs to build my computer.
If the result was that games were made and optimised for mid-range cards, maybe regular folks actually would be better off.
Low end is ryzen integrated graphics now, xx60 is mid range at best. Maybe even xx50 if those still exist.
"It's mid range if it exists" doesn't make sense.
Also you're missing that they're talking about 3070, a card from 2020 (5 years ago), 2 generations behind this year's 50xx series. The 30xx matters more than the xx70 here. It was an upper midrange card when it came out, and it's solidly midrange for Nvidia's product lineup today. You can have cheaper and decent just fine (integrated Ryzens like you mentioned are fine for 1080p gaming on most titles).
If we're talking performance. Not price.
A GTX 1080 came out in the first half of 2016. It had 8 GB of VRAM and cost $599 with a TDP of 180W.
A GTX 1080 Ti came out in 2017 and had 11 GB of VRAM at $699.
In 2025 you can get the RTX 5070 with 12 GB of VRAM. They say the price is $549, but good luck finding them at that price.
And the thing with VRAM is that if you run out of it then performance drops off a cliff. Nothing can make up for it without getting a higher VRAM model.
I did one Google search for "rtx 5070 newegg usa" and they have MSI Ventus GeForce RTX 5070 12G down from $559 to $499 for Black Friday, and ASUS Prime RTX 5070 12GB for $543.
https://www.newegg.com/msi-geforce-rtx-5070-12g-ventus-2x-oc...
https://www.newegg.com/asus-prime-rtx5070-12g-geforce-rtx-50...
This is missing the forest for the trees quite badly. The $2000 GPUs are what would previously have been $600-700, and the $200-400 GPUs are now $600-700. Consumers got the shit end of the deal when crypto caused GPU prices to spike, and now consumers are getting another shitty deal with RAM prices. And even if you want mid-range stuff, it's harder and harder to buy because of how fucked the market is.
It would be like if, in your example, companies literally only sold loaded F-150s and stopped selling budget models at all. There isn't even budget stock to buy.
You still do. There is no "AI movement" you need to participate in. You can grab a copy of SICP and a banged up ten year old thinkpad and compute away, your brain will thank you. It's like when people complain that culture is unaffordable because the newest Marvel movie tickets cost 50 bucks, go to the library or standardebooks.org, the entire Western canon is free
Living on the edge from 4 years ago is basically free.
The move to cloud computing and now AI mean that we're back in the mainframe days.
Unforeseen things like the pandemic hurt profits.
Letting things go this unmanaged with a three-year runway for AI demand seems a little hard to understand. In this case, not anticipating demand seems to create more profit.
Most DRAM is already purchased through contracts with manufacturers.
Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
Like most commodities, the price you see on places like Newegg follows the "spot price", meaning the price to purchase DRAM for shipment immediately. The big players don't buy their RAM through these channels, they arrange contracts with manufacturers.
The contracts with manufacturers will see higher prices in the future, but they're playing the long game and will try to delay or smooth out purchasing to minimize exposure to this spike.
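As a toy illustration of that smoothing effect (every price and volume share below is made up):

  # Why big OEMs feel spikes less than retail buyers: most volume is
  # bought at a pre-negotiated contract price, only the rest at spot.
  contract_price = 3.0    # $/GB locked in earlier (hypothetical)
  spot_price = 9.0        # $/GB after the spike (hypothetical)
  contract_share = 0.85   # fraction of volume covered by contracts

  blended = contract_share * contract_price + (1 - contract_share) * spot_price
  print(f"blended cost: ${blended:.2f}/GB vs ${spot_price:.2f}/GB spot")
  # blended cost: $3.90/GB vs $9.00/GB spot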
> Additionally, we're likely to see Chinese fab'd DRAM now, which they've been attempting since the '70s but never been competitive at.
Companies like Samsung and SK Hynix have DRAM fabs in China already. This has been true for decades. You may have Chinese fab'd DRAM in the computer you're using right now.
Are you referring to complete home-grown DRAM designs? That, too, was already in the works.
Yes, via CXMT, as discussed by Asianometry here: https://www.youtube.com/watch?v=mt-eDtFqKvk
As I mentioned, various groups within China have been working on China-native DRAM since the '70s. What's new are the margins and market demand to allow them to be profitable with DRAM which is still several years behind the competition.
> Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
I don't agree with this sentence. Why wouldn't the same apply to oil and gas contracts? If you look at the size and duration of oil and gas contracts for major energy importers, they often run 10 years or more. Some of the contracts in Japan and Korea are so large that a heavy industrial/chemical customer will take an equity stake in the extraction site. Except silicon, power, and water (and a tiny amount of plastic/paper for packaging), what else does a fab need that only produces DRAM? If true, then power is far and away the most variable input cost.
Various chemicals too, https://haz-map.com/Processes/97
Because oil & gas suppliers only ever sell one product, and memory fabs can dynamically switch product mix in response to supply & demand to optimize profits. The same sand, power and water can make DDR4, HBM or DDR5.
I'm not sure I follow; a varying range necessarily implies varying ratios (e.g. a product missing from the range means its ratio is zero).
Even when in theory you can obtain some higher quality products, the composition of the crude can make it too complex and expensive to practically obtain them.
You don't want to refine gasoline from heavy crude, especially in winter when demand is lower. For gasoline or kerosene you want to start from lighter crude. Same with many undesired components (either from the crude or resulting from the refining methods), the more you have, the more complex the refining, and the resulting ratio of products you obtain varies.
So in practice what you get out of the refining process absolutely depends on the characteristics of the crude, and many other things like market demand or the capability of your refinery.
Same as with silicon. The process to make the wafer results in different quality if you want to make low tech or cutting edge semiconductor products.
What do you mean "Break contracts"? I thought the conversation was about Futures contracts, you don't break them. You sell your contract or you take/give delivery (or cash settle).
Not all gas is sold by futures; you can have a contract for, say, delivery of 20 million cubic metres of gas a year and a penalty if that isn't met. Some people actually want the gas for gas-related purposes rather than as a financial phantom.
Same for DRAM - Dell actually wants the chips to put in computers, an economic abstraction doesn't help much when you need to ship real computers to get paid, and many customers aren't in the market for a laptop future (Framework pre-orders notwithstanding).
Borrowing costs can be wildly variable and are the main cost of making silicon. All the "inputs" over the lifecycle of a fab are so completely dwarfed by the initial capital costs that you can pretty much ignore them in any economic analysis. The cost of making chips is the cost of borrowing money to pay for capital costs, and the depreciation of the value of that capital.
Even if you pay them all 500k per year, that's "only" about a billion a year in payroll.
The New York fab plan costs something like $20 billion to build now, with $100 billion planned over 20 years.
Also, maybe the calculus is different right now in the US, but it used to be that semiconductor workers were expected to have PhDs coming out of their ears yet were not actually paid very well, with salaries in Taiwanese fabs being around the $50-60k mark and lower-paid workers being more like $20k or less. Presumably US fabs will be automated to an even greater extent due to labour costs.
So it's very possible that servicing debt on the capital outlay is substantially more expensive than the payroll.
If only.
20 years ago, fabs were being built for 90nm-class technology. Chips made on such an old node are so cheap today that they can't pay even a fraction of a percent of the plant's capital costs per year, so all of its capital has to have been depreciated a long time ago.
The oldest process node in high-volume production for memory is currently 1α, which started production in January 2021. It is no longer capable of making high-end products and is definitely legacy, and also has to have essentially depreciated all of the capital costs. The time a high-end fab stays high-end and can command premium prices, and during which it has to depreciate all the capital is ~3-5 years. After that either you push the plant to produce legacy/low price and low margin items, or you rebuild it with new tools with costs >$10B.
Also, even if fabs did last 20-30 years, the capital costs would dominate.
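Putting the rough figures from this thread together (these are the thread's estimates, not actual financials):

  # Capital costs vs payroll, using the numbers floated upthread.
  fab_capex = 20e9          # ~$20B to build (upthread estimate)
  useful_life_years = 5     # ~3-5 years at the leading edge
  annual_payroll = 1e9      # ~2000 staff x $500k, from upthread

  annual_depreciation = fab_capex / useful_life_years  # $4B/year
  print(f"depreciation ${annual_depreciation/1e9:.0f}B/yr "
        f"vs payroll ${annual_payroll/1e9:.0f}B/yr")
  # Straight-line depreciation alone is ~4x the payroll, before any
  # interest on the borrowed capital.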
> And wouldn't "wildly variable borrowing costs" also affect oil and gas who need to finance the research phase and construction of the plant?
I don't understand. Nothing else costs anywhere near as much capital to produce as silicon chips. Thanks to the inexorable force of Moore's second law, fabs are machines that turn capital investment into salable product; nothing like them has ever existed before.
A Japanese factory that made epoxy resin for chip packaging was destroyed, and the price of SIMMs skyrocketed (due to lack of availability).
I remember being very upset that I wasn't going to be able to upgrade to 4MB.
The memory business is a pure commodity and brutally cyclic. Big profit => build a fab => wait 2 years => oh shit, everyone else did it => dump units at below cost. Repeat.
Commander Data's specifications in the Star Trek TNG episode The Measure of a Man from 1989: 800 quadrillion bits of storage, computing at 60 trillion operations per second.
100 petabytes. That's a big machine. A very big machine. But supercomputers now have memories measured in petabytes.
They never used "bits" again in any Star Trek script. It was kiloquads and gigaquads from then on.
Then I did some googling and it turns out that a single 5090 GPU has a peak FP32 performance of over 100 TFLOPS!
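The conversions, for anyone who wants to check (the 5090 figure is the approximate peak from the spec sheet):

  # Data's specs from the episode vs one 2025 consumer GPU.
  data_storage_bits = 800e15    # "800 quadrillion bits"
  data_ops_per_sec = 60e12      # "60 trillion operations per second"

  storage_petabytes = data_storage_bits / 8 / 1e15
  print(f"{storage_petabytes:.0f} PB of storage")     # 100 PB

  rtx5090_fp32 = 100e12         # ~100 TFLOPS peak FP32 (approximate)
  print(f"5090 vs Data: {rtx5090_fp32 / data_ops_per_sec:.1f}x")  # ~1.7x
  # Caveat: FP32 FLOPS and the episode's "operations" aren't
  # really comparable units.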
> Memory: 16 GB GDDR6 SDRAM
So unless the RAM price jumps to 4x the price of a PS5, getting a PS5 is not the most cost efficient way to get to 64 GB of RAM.
In comparison, PS3 has been used to build cheap clusters[2].
I have a gaming PC, it runs Linux because (speaking as a Microsoft sysadmin with 10 years under my belt) I hate what Windows has become, but on commodity hardware it’s not quite there for me. Thought I’d play the PlayStation backlog while I wait for the Steam Machine.
Next is probably CPUs. Even if AI doesn't use them that much, manufacturers will shift production to something more profitable, then gouge prices so that only enterprises will pay for them.
What's next? Electricity?
Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
That can be a bigger problem for civilization.
If you make $4k/mo and rent is $3k, it's pretty silly to state that it's a meaningful thing for someone to scrimp and invest $100/mo into a brokerage account.
They definitely should do this, but it's not going to have any meaningful impact on their life for decades at best. Save for a decade to get $12k in your brokerage account; say it doubles to $24k. If you then decide you can take a generous 5% withdrawal rate, you are talking $1,200/yr against rent that is now probably $3,500/mo or more. Plus you're killing your compounding.
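For what it's worth, here's that compounding arithmetic with an assumed 7% nominal return (the "doubling" above implies a rosier one):

  # $100/month for 10 years at an assumed 7% nominal annual return.
  monthly = 100.0
  months = 120
  annual_return = 0.07          # assumption, not a guarantee

  balance = 0.0
  for _ in range(months):
      balance = balance * (1 + annual_return / 12) + monthly
  withdrawal = balance * 0.05   # the "generous 5%" rate
  print(f"balance ${balance:,.0f}, 5% draw ${withdrawal:,.0f}/yr")
  # balance ~$17,300, 5% draw ~$865/yr -- against $42,000/yr rent.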
It's good to have savings so emergencies don't sink you - but it's really an annoying talking point I hear a lot lately. Eye-rolling when you're telling someone who's struggling this sort of thing.
It really only makes a major impact if you can dump large amounts of cash into an account early in life - or if you run into a windfall.
like under a bridge or something? Pardon the hyperbole, but you would have to assume people with no disposable income are idiots in order to suggest that solution.
There are some exceptions, like rural doctors making more than city doctors due to high demand. But the less "physical" your job is, the rarer these exceptions become.
For software devs, you can move out of silicon valley. Maybe to Texas. And now those 1.5 million dollar homes are only 700,000 dollars. But your salary will reflect that.
5% of that would be $8100.
Is $48k / year a typical income?
Yeah, obviously if I can sock away $12,000 a year for 10 years I'll have money. Just be aware that I was in a bunch of funds from 2010-2020, and were it not for what happened at the end of that decade I wouldn't have made any additional money at all. In fact, I would have lost a decent chunk of money - not just to fees, but to inflation.
Also, where are you guaranteed 6% for a decade? T-bills or something?
The centabillionaire who invests their fortune and receives $50M a day from the market and cares about nothing more than keeping the gravy train going is the problem. They head to Washington and splash their free money hose around in exchange for political support to keep that hose pumping at all costs. That's why politicians were so lock-step about how it was super important for the US to sell our industry to China and drop capital gains tax below income tax and open the Double Irish Sandwich and now they're getting rid of social programs and instituting 50 year mortgages and so on.
The fact that the guy with the firehose can pump the firehose by stepping on the guy investing $1k month is the core of the problem. Until you are at escape velocity -- your net worth is enough to cover your lifestyle from investment returns alone -- you lose by default.
What is your source on that? Moore's Law is Dead directly contradicts your claims by saying that OpenAI has purchased unfinished wafers to squeeze the market.
Tom’s Hardware: “Samsung raises memory chip prices by up to 60% since September as AI data-center buildout strangles supply,” Nov 2025. https://www.tomshardware.com/tech-industry/samsung-raises-me...
Note the consistent "up to 60% since September" figure in the above recent reports. That's for one module capacity, with others being up 30% to 50% - and it certainly isn't the 200% or more we're apparently seeing now in the retail market. That's pure panic hoarding, which is actually a very common overreaction to a sudden price spike.
Things being too cheap allows money to pool at the bottom in little people's hands in the forms of things like "their homes" and "their computers" and "their cars".
You don't really want billions in computing hardware (say) being stashed down there in inefficient, illiquid physical form, you want it in a datacentre where it can be leveraged, traded, used as security, etc. If it has to be physically held down there, ideally it should be expensive, leased and have a short lifespan. The higher echelons seem apparently to think they can drive economic activity by cycling money at a higher level amongst themselves rather than looping in actual people.
This exact price jump seems largely like a shock rather than a slow squeeze, but I think we're seeing some kind of reversal of the unique 20th-century pattern of "life gets better/cheaper/easier every generation".
To me the #1 most important factor in maintaining a prosperous and modern society is common access to tools by the masses, and computing hardware is just the latest set of tools.
Yes, that's the point. People fixing things themselves doesn't make the line go up, therefore it will be made harder.
And I assume some of them read these threads, so my advice to them would be to remember that the bunker air vents will probably be the main weak point.
I remember when there was a flood in Thailand in 2011 and the prices of hard disks went through the roof.
https://www.forbes.com/sites/tomcoughlin/2011/10/17/thailand...
Abundance isn't even the right framing. What most people actually want and need is a certain amount of resources - after which their needs are satiated and they move onto other endeavors. It's the elites that want abundance - i.e. infinite growth forever. The history of early agriculture is marked by hunter-gatherers outgrowing their natural limits, transitioning to farming, and then people figuring out that it's really fucking easy to just steal what others grow. Abundance came from making farmers overproduce to feed an unproductive elite. Subsistence farming gave way to farming practices that overtaxed the soil or risked crop failure.
The history of technology had, up until recently, bucked this trend. Computers got better and cheaper every 18 months because we had the time and money to exploit electricity and lithography to produce smaller computers that used less energy. This is abundance from innovation. The problem is, most people don't want abundance; the most gluttonous need for computational power can be satisfied with a $5000 gaming rig. So the tech industry has been dealing with declining demand, first with personal computers and then with smartphones.
AI fixes this problem, by being an endless demand for more and more compute with the economic returns to show for it. When AI people were talking about abundance, they were primarily telling their shareholders: We will build a machine that will make us kings of the new economy, and your equity shares will grant you seats in the new nobility. In this new economy, labor doesn't matter. We can automate away the entire working and middle classes, up to and including letting the new nobles hunt them down from helicopters for sport.
Ok, that's hyperbole. But assuming the AI bubble doesn't pop, I will agree that affordable CPUs are next on the chopping block. If that happens, modular / open computing is dead. The least restrictive computing environment normal people can afford will be a Macbook, solely because Apple has so much market power from iPhones that they can afford to keep the Mac around for vanity. We will get the dystopia RMS warned about, not from despotic control over computing, but from the fact that nobody will be able to afford to own their own computer anymore. Because abundance is very, very expensive.
More money for shareholders. A $5 trillion Nvidia? More like a quadrillion for Nvidia's market cap.
That'll come with the bubble bursting and the mass sell off.
Yes. My electricity prices jumped 50% in 3 years.
How much is due to long overdue infrastructure upgrades and greed by providers, vs the cost of energy?
Also, consumer prices _have_ risen (mine included), but it's not clear that this is only because of AI. While EV charging is not at the scale of all data centers combined, it seems to be growing even faster than datacenter consumption, and is expected to eclipse the latter around 2030. Maybe sooner, due to missing solar incentives.
Also, to rant on: according to [1], an average Gemini query costs about 0.01 cents (Figure 2; say 6000 queries per kWh at 60 cents/kWh, which is probably more than industrial consumers pay). The same paper says other providers are not off by that much. I dare say that, at least for me, I definitely save a lot of time and effort with these queries compared to what I'd traditionally have to do (go to the library, manually find sources on the web, etc.), so arguably, responsibly used, AI is really quite environmentally friendly.
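Re-deriving that figure from the assumptions stated above (the 60 cents/kWh rate is this comment's assumption, not the paper's):

  # 0.01 cents/query from 6000 queries/kWh at 60 cents/kWh.
  queries_per_kwh = 6000        # assumed throughput from the comment
  price_cents_per_kwh = 60      # assumed electricity price
  print(f"{price_cents_per_kwh / queries_per_kwh:.3f} cents/query")
  # 0.010 cents/query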
Finally: Large data centers and their load is actually a bit fungible, so they can be used to stabilize the grid, as described in [2].
I would think it would be best if there were more transparency on where the costs come from and how they can be externalized fairly. To give one instance, Tesla could easily [3] change their software to monitor global grid status and adjust charging rates. Did it happen? Not that I know of. That could have a huge effect on grid stability. With PowerShare, I understand that vehicles can also send energy back to power the house - hence, also offload the grid.
[1] https://services.google.com/fh/files/misc/measuring_the_envi...
[2] https://www.linkedin.com/feed/update/urn:li:activity:7358514...
[3] that's most likely a wild exaggeration
This only makes sense if you ignore profits. We've been paying the bills since before this was "overdue"; for instance, I am still paying a storm recovery surcharge on my electric bill from before I ever moved to this state. At the point where a "temporary infrastructure surcharge for repairs" becomes a line item on their profit statement, that's where I start to get real annoyed.
Our electric company has 287,000 customers and a market cap of >$800,000,000.
What percentage of that eight-tenths of a billion in market cap came from nickel-and-diming me?
* Note: "nickel and dime" was established as "an insignificant amount of money" in the 1890s, when sirloin (20% fat) was $0.20 a pound; that's $13.50 now (local). Chuck was $0.10, and $19 now. So a nickel and dime has somewhere between 68 and 190 times less buying power now. Also that means that, y'know, my surcharges being $15-$30 a month are historically "nickels and dimes".
https://babel.hathitrust.org/cgi/pt?id=uiug.30112019293742&s...
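The ratios in that note, worked out (prices as given above):

  # Buying-power ratios from the footnote's beef prices.
  sirloin_then, sirloin_now = 0.20, 13.50   # $/lb, 1890s vs local today
  chuck_then, chuck_now = 0.10, 19.00

  print(f"sirloin: {sirloin_now / sirloin_then:.0f}x")  # ~68x
  print(f"chuck:   {chuck_now / chuck_then:.0f}x")      # 190x
  # So a ~$0.15 "nickel and dime" then maps to roughly $10-$29 now,
  # putting $15-$30/month surcharges in the same historical band.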
I mean part of me thinks it's a necessary evil because we relied too much on Russian gas in the first place. But that's because we extracted most of our own gas already (https://en.wikipedia.org/wiki/Groningen_gas_field), which that article lists as one of the factors in the Dutch welfare state being a thing - it and smaller fields out at sea contributed over 400 billion to the Dutch economy since the 1950's.
It may have been a bit self-deprecating, but I think your “rant” is a more than justified question that really should be expanded well beyond just this matter. It’s related to a clear fraud that has been perpetrated upon the people of the western world in particular for many decades and generations now, in many different ways. We have been told for decades and generations: “we have to plunder and debase your money and give it to the rich that caused the {insert disaster caused by the ruling class} misery, and we have to do it without any kind of consequences for the perpetrators, and no, you don’t get any kind of ownership or investment, and we have to do it now or the world will end.”
In the hands of the owners of the AI, as a direct consequence of the economic system. It was never going to play out any other way.
Humanity will have to adopt new human-focused modes of living and organizing society, or else. And climate change is coming along to make sure the owning class can't ignore this fact any longer.
But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Because I grew up under communism and I lived its miserable failures: the non-profit system didn't even manage to feed, clothe, or warm/cool us.
> new human-focused modes of living and organizing society
Oh, these sound sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconian laws and regulations?
Capitalism increasingly fails to provide well-being to the majority of the global population. It's obvious we need to come up with something else, even if it's not clear yet what shape that will take.
If we can't find an alternative that works, we can also just wind down humanity, and not much of value to the universe will be lost :)
You don't need to go full communist to make things better.
There are shades of grey here. Capitalism is a system with many inherent problems. Exploring alternatives is not the same thing as being a Stalinist
It's like they lack the most basic understanding of economics and have never read any history. I mean, communism has failed everywhere it was tried, and there were so many A/B tests that plainly show each system's results: North vs South Korea, Eastern Europe before vs after 1990, USA vs USSR, Argentina during the last hundred years, Venezuela before and after Chavez, etc.
Or they push socialism under new names ("democratic") as if it's a new thing, not just a watered down form of communism, with authoritarian communism being the logical end game of socialism - because "at some point you run out of other people's money" and you need force to keep fleecing them. Just like it happened in Venezuela...
You seem well aware of authoritarian communism, but generally unaware of libertarian socialism. They are distinct, and the latter has a decades long history of despising the former as much as you do, though for being based on the same primary issue as capitalism: coercive hierarchy.
It all starts with assuming the freedom of human beings, and that the only way to organize a system has nothing to do with efficiency or profit, and everything to do with maintaining that human freedom. It must be based on human freedom, and the concept that no one knows how to run your life better than you do. That no one deserves to be able to force you to do things. Whole systems arise from that fact, that have a long basis in history. As I've said elsewhere, you don't end up with men on the moon or dollar stores, but you do get people who are in control of their own lives.
We've seen here in EU socialist policies cripple our economy to the point we're lagging the USA and we can't even defend ourselves from the blood-hungry psychopath in the east.
> libertarian socialism
Is this a real thing? Any examples of it being implemented anywhere? Because it sounds like an oxymoron to me. Socialism means regulations and confiscation from the productive members of society (taxation). Those must be enforced with the threat of force, so less liberty there...
> no one deserves to be able to force you to do things
So I shouldn't be forced to pay my taxes? How would then a socialist government implement its expensive social policies?!
Let me catch you up on 150 years of capitalist alternative, theory, critique, and analysis, that answers every single snarky little "gotcha" remark you have: https://theanarchistlibrary.org/library/the-anarchist-faq-ed...
I'm not going to spoon-feed you anymore because you're clearly not interested in a conversation, and are quite happy with the status quo. I hope for your sake you don't end up on the wrong side of capitalism.
> But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Libertarian socialism, anarchocommunism, any system where human freedom is the basis, and not coercion or hierarchy. This stuff is not new or radical, it's just not favored by people with lots of money to lose.
> Oh, these sounds sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconic laws and regulations?
Lucky for you, no. The complete opposite. Freedom of association is the entire foundation of it. We all get to associate with whomever we want, when and for as long as we want. Someone being a condescending prick in your local comment section? You get to ignore them! No commissars or cops or Party. Someone wants to go play hierarchical capitalism with his friends? As long as he's not messing with other people or contravening their rights, they get to do whatever they want.
Will any of these systems result in 99 cent stores, fast food restaurants, or people on the moon? Almost definitely not. But those are all irrelevant to creating a sustainable environment designed for human beings, and not profit.
The lack of innovation (or even reading of basic history...) in what is possible in terms of organizing human societies is frankly sad, especially among tech workers. Most people are too influenced by capitalism (intentionally so) to believe that how things are now is the only way they can be. There is so little scope for innovation and change, and that starts with the owning class who have no interest in it changing.
There is a saying: you can be communist under capitalism but you can't be capitalist under communism. And it's true: there are plenty of communes, coops and non-profits being run in the US right now.
Can you describe how that would work the other way around under "libertarian socialism"?
> sustainable environment designed for human beings, and not profit
Profit is also for humans, not space aliens. Without profit, how do you convince people to do the necessary jobs nobody wants to do, like sanitation or war?
> people on the moon? Almost definitely not.
Then the country and the system resulting in people on the moon will come and take over your own "designed for human beings" because they will innovate and advance technologically while you will be playing hippy in the park.
An abundance of intelligence on Earth with all its spoils: new medicine, energy, materials, technologies and new understandings and breakthroughs - these seem quite significant to me.
Super-intelligence is a completely different can of worms. But I'm not optimistic about super-intelligence either. It seems super naive to me to assume that the spoils of super-intelligence will be shared with the people who no longer can bring anything to the table. You aren't worth anything to the super-rich unless you can do something for them which the super-intelligence can't do.
And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on my phone - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
You still need to be rich to partake. Most business ventures will still require capital even in the age of super-intelligence. Super-intelligence will make labor worthless (or very cheap) it won't make property worthless.
> And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
There are plenty of examples of rich people hoarding their wealth. Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth. There is little reason why super-intelligence will not lead to a resource curse where the resource is human intelligence or even human labor.
> Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on a website - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
Those rich people didn't share with you out of the goodness of their heart but because it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
Again, you can invest, today, in AI stocks and ETFs, with just $100 and a Robinhood account. No need to be rich.
> Super-intelligence will make labor worthless (or very cheap) it won't make property worthless.
If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
> Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth.
Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
> it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services? Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
And then? I'll compensate the loss of thousands of dollars I don't earn anymore every month with the profits of a $100 investment in some ETF?
> If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
Property has inherent value. A house I can live in. A farm can feed me. A golf course I can play golf on. These things have value even if nobody can buy them off me (because they don't have anything I want). Supply and demand determine only the _price_ not the _value_ of goods and services.
> Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
None of this has any connection to anything I've written. I'm talking about the concept of a resource curse. Countries rich in natural resources (oil, diamonds, ...) where the population is poor as dirt because the ruling class has no incentive to share any of the profits. The same can happen with AI if we don't do anything about it.
> How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services?
Other rich people can buy their products and services. They don't need you to buy their products and services because you don't bring anything to the table because all you have is labor and labor isn't worth anything (or at least not enough to survive off it). Put differently: Why do you think rich people would like to buy your labor if using AI/robots is cheaper? What reason would they have to do that?
> Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
You're proving my point: Elon still lives in a world where labor is worth something. Because Elon lives in a world where labor is worth something, it is in his interest that there are many people capable of providing that labor to him. This means it is in his interest that the general population has access to food and water, is well educated, ...
If Elon were to live in a world where labor is done by AI/robots there would be little reason for him to care. Yes, he couldn't sell his cars to the average person anymore, but he wouldn't want to anyway. He could still sell his cars to Altman in exchange for an LLM that strokes his ego or whatever rich people want.
The point is: Because rich and powerful people still have to pay for labor, their incentives are at least somewhat aligned with the incentives of the average person.
Probably most of it at least, because under your supposition that the AGI will replace labor we'll get incredibly cheap products and services as a result.
> Property has inherent value.
You weren't talking about inherent value when you wrote "Super-intelligence will make labor worthless (or very cheap) it won't make property worthless." which is what I replied to.
> None of this has any connection to anything I've written. I'm talking about the concept of a resource curse.
And my point was that the wealth of a nation does not come from its resources but from its entrepreneurs. Resources are a curse usually when monopolized and administered (looted) by corrupt governments, not when exploited by private entities. AIs controlled by governments would scare me indeed.
> Other rich people can buy their products and services.
> He could still sell his cars to Altman
Are you joking?! How many cars do you think Altman can buy?! Do you really think the rich people can be an actual market?! How many rich people do you think there are out there?! Are you talking about middle class by any chance?
> Why do you think rich people would like to buy your labor if using AI/robots is cheaper?
Because labor evolves too, just like it evolved when automation, IT and outsourcing came around. Yes, I can't sell my dirt digging services in the age of digging machines but I can learn to drive one and sell my services as a driver. Maybe I can't sell coding in the age of AI but I can sell my ability to understand, verify and control complex systems with code written by AIs.
And so on, you get the idea. Adaptation, creativity and innovation is the name of the game.
> You're proving my point
> The point is: Because rich and powerful people still have to pay for labor their incentives are at least somewhat aligned with the incentives of the average person
Not at all. My point was that Elon and rich people are interested in you as a customer, not for your labor. That is the old mindset, the one we need to evolve from. See yourself as selling and buying products and services, not your labor, and the world will be full of opportunities. "The rich" won't seem like a separate class from you, but like regular people you can interact with and profit from (while mutually benefiting).
No, we will get cheap _labor_, not necessarily cheap _products_.
> You weren't talking about inherent value when you wrote "Super-intelligence will make labor worthless (or very cheap) it won't make property worthless." which is what I replied to.
I was talking about value, not price.
> AIs controlled by governments would scare me indeed.
What is the difference?
> Are you joking?! How many cars do you think Altman can buy?!
Why would Elon need to sell more cars? And for what exactly? You have nothing Elon wants.
> Maybe I can't sell coding in the age of AI but I can sell my ability to understand, verify and control complex systems with code written by AIs.
Unless the super-intelligence is better than you here too. Why wouldn't it be?
> Adaptation, creativity and innovation is the name of the game.
It is the name of the game until super-intelligence comes along which will be better at all of this than you. That's exactly the scary thing about super-intelligence.
> My point was that Elon and rich people are interested in you as a customer, not for your labor.
This is the same thing. I can only be a customer if I can bring something to the table that Elon wants from me. That thing is money. I can only bring money to the table if someone that has money needs something I can provide. That thing is human labor. If super-intelligence removes the economic value of human labor, I can no longer earn money and consequently Elon will not be interested in me as a customer.
> See yourself as selling and buying products and services, not your labor, and the world will be full of opportunities.
Where exactly is the difference between me "selling a service" and me selling "labor"?
> "The rich" won't seem like a separate class from you, but regular people you can interact and profit from (while mutually benefiting).
It doesn't matter whether or not you see the rich as a separate class. What matters is simply the following:
People who own a lot of stuff, don't sell their labor and/or buy a lot of labor will profit if labor becomes cheap. People who don't own a lot of stuff, sell their labor and don't buy a lot of labor face an existential threat if labor becomes cheap.
We know that none of the goods you listed would be available to the masses unless there was profit to be gained from them. That's the point.
I have a hard time believing that a large group, motivated and mutually benefiting from the advancement of something, would produce worse outcomes than a few people doing the same. We just have never had an economic system that could offer that, so you assume the greedy motivations of a few are the only path towards progress.
Please propose it yourself.
> you assume the greedy motivations of a few are the only path towards progress
No. I assume the greedy motivations of the many are the best path towards progress. Every attempt to replace this has failed miserably. Ignoring human nature in ideologies never works.
If you want to look at what historically has happened when the rich have had a sudden rapid increase in intelligence and labor, we have examples.
After the end of the Punic Wars, the influx of slave labor and the diminution of the economic power of ordinary Roman citizens led to: an accelerating concentration of wealth, civil war, and an empire where the value of human life was so low that people were murdered in public for entertainment.
Yet those things did not happen in communist countries (or happened far less in socialist ones) during the same time period, even though the market was there too. That is why the EU's socialist countries consume high-tech products and services from the USA and not the other way around.
With a hundred bucks and a Robinhood account, you too can be part of this greedy, evil and mysterious "owners of AI" class and (maybe) some day enjoy the promised spoils.
Oh, the wonders of Capitalism, the economic system offering unequal abundance to everyone who cares to take part... Where are the other, much-touted systems, those masters at spreading misery equally?
next?
https://www.cnbc.com/2025/11/14/data-centers-are-concentrate...
AI will lead to abundance. For those that own stuff and no longer have to pay for other people to work for them.
Why are you saying that? Anybody working for a living (but saving money) can invest in AI stocks or ETFs and partake in that potential abundance. Robinhood accounts are free for all.
Investing in AI companies is just about the last piece of advice I'd give someone who's struggling financially.
The billionaires will largely be fine. They hedge their bets and have plenty of spare assets on the side. Little guy investors? Not so much. They'll go from worrying about their retirement plan to worrying about becoming homeless.
> (but saving money)
These two are already difficult or impossible for many people. Especially in the US, a big chunk of people have been living paycheck to paycheck for a long time now, often taking multiple jobs just to make ends meet.
And then to gamble it on a speculative market whose valuations do not correlate with performance (see e.g. Tesla: compare its sales and market share against its market value relative to other car manufacturers). That's bad advice. And as an individual investor you'd capture only a fraction of what the big shareholders earn. Stock investing is a secondary market.
That doesn't mean they are poor, just poor with their money. Saving is a skill that needs to be learned.
> That's bad advice.
No. *Not* investing, when the S&P 500 index has returned about 6% annually after inflation over the last 100 years, is bad advice. Not hedging against the arrival of an AGI that you think can replace you is bad advice.
Yes, some people in a society can drill their own wells to have water. Or you come up with the unheard-of idea of public utilities, so people can simply open a tap and enjoy. In some regions, they can even drink it. Personally, having grown up in a lucky place like that, I have a hard time imagining ever living somewhere that required me to buy bottled water.
Yes, you can demand that each member of society learn about ETFs. Personally, I enjoy every part of life where complexity is being dealt with for me; I wouldn't survive a day without that collaboration.
We have a choice to design society, like we have a choice to design computer systems.
It's not utopian, it's downright dystopian. Redistribution means forceful confiscation (through taxation) from the most productive members of society. In practice this means punishing work, innovation and creation while rewarding laziness and low productivity. This will logically lead to less productive societies that fall behind and are either bought out or conquered by more successful, more aggressive societies. We've seen this scenario unfolding in the EU.
> I enjoy every part of life where complexity is being dealt with for me
Me too. But I want private companies dealing with that complexity, because market competition controls them and keeps them honest, unlike governments, which are monopolies happy to hand out free benefits and entitlements to buy their next election.
I also want participation in such schemes to be voluntary, not compulsory, since this keeps people responsible, aware and educated. Compulsory schemes are widely hated and rejected, even when they are otherwise a net positive.
> public utilities
My water utility is a state-granted monopoly charging outrageous prices. Same for my electricity provider. I would love to quit them, but any competition was outlawed, of course.
> you can demand that each member of society learn about ETFs
If you have time, go to an online calculator and compute what the amount deducted from your salary every month for your pension would be worth today if it had instead been invested in an S&P 500 index ETF, then compare that to your projected state pension. It was eye-opening to me.
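For a rough sense of the numbers, here's a minimal back-of-the-envelope sketch in Python. The €300/month contribution and 40-year career are made-up placeholders (plug in your own), and the ~6% figure is the inflation-adjusted historical S&P 500 return mentioned above:

    # Back-of-the-envelope: what monthly pension contributions would be worth
    # if invested in an S&P 500 index ETF instead.
    # monthly_contribution and years are hypothetical placeholders.
    monthly_contribution = 300.0  # EUR deducted from salary each month (assumption)
    years = 40                    # length of a working career (assumption)
    annual_real_return = 0.06     # ~6% inflation-adjusted historical return

    monthly_rate = (1 + annual_real_return) ** (1 / 12) - 1
    months = years * 12

    # Future value of an annuity: each deposit compounds for its remaining months.
    future_value = monthly_contribution * (((1 + monthly_rate) ** months - 1) / monthly_rate)

    print(f"Total paid in: {monthly_contribution * months:,.0f}")  # 144,000
    print(f"Real value at retirement: {future_value:,.0f}")        # ~572,000

Even with these modest placeholder numbers, the gap between the compounded amount and a projected state pension is what makes the comparison eye-opening.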
(*) my definition of reality includes not only observable "facts" that we may agree on, but also our experience of it; our perception, judgments, values, etc.
How long do you estimate this period of supply constraint will last? Will manufacturers continue to be greedy, or will they grow less greedy as supply improves in response to the price signals that today's high prices are sending?
The AI bubble has also pushed up secondhand prices. I work in ewaste recycling. Two weeks ago, a petabyte's worth of hard drives showed up. After erasing, testing, listing, and selling them, I'll have a very merry Christmas.
That and water. Electricity: Google made a post yesterday about scaling k8s to 135,000 nodes, mentioning how each node has multiple GPUs drawing up to 2,700 watts.
Water, well, this is a personal beef, but Microsoft built a datacenter that used potable/drinking water for backup cooling, consuming millions of liters during a warm summer. They treat the water and dump it back into the river. This was in 2021; I can imagine it's only gotten worse since: https://www.aquatechtrade.com/news/industrial-water/microsof...
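For scale, a quick sketch of what the electricity figures above imply, assuming all 135,000 nodes draw the full 2,700 W (the stated maximum) and ignoring cooling, networking, and other datacenter overhead:

    # Rough peak power for the cluster described in Google's post.
    # Assumes every node draws its 2,700 W maximum; real average draw is lower,
    # and cooling/networking/datacenter overhead is ignored.
    nodes = 135_000
    watts_per_node = 2_700

    total_watts = nodes * watts_per_node
    print(f"Peak IT load: {total_watts / 1e6:.1f} MW")                  # 364.5 MW
    print(f"Energy per day at peak: {total_watts * 24 / 1e9:.2f} GWh")  # 8.75 GWh

That worst-case figure is on the order of a mid-sized power plant's output, for a single cluster.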
Why do you think this is a lot of water? What are the alternatives to pulling from the local water utility, and are those alternatives preferable?
AI/LLM companies will pay TSMC more than Apple is willing to pay to keep subsidizing this neat little box.
ebay.com/itm/256168320806
(no association, just trying to help, I am still using DDR4)

I'd bet that a good chunk of the apparently sudden demand spike could be last month's Microsoft Windows 10 end-of-support finally happening, pushing companies and individuals to replace many years' worth of older laptops and desktops all at once.
I'm joking, but only kind of. It's not a domain that I do a lot of, but I haven't touched Qt in so long that it would basically be starting from scratch if I tried to write an app with it; I could write an Electron app in like an hour.
> Despite server-grade RDIMM memory and HBM being the main attractions for hardware manufacturers building AI servers, the entire memory industry, including DDR5, is being affected by price increases. The problem for consumers is that memory manufacturers are shifting production prioritization toward datacenter-focused memory types and producing less consumer-focused DDR5 memory as a result.
But I'm sure the hysteria around that isn't helping prices come back down either.
Felt like I overpaid at the time too. Wow
AI: RAM
Thanks for taking away years of affordable computing from people. Time is more valuable; there's no getting it back.
As someone currently fighting to shave megabytes off a C++ engine, it hurts my soul to see a simple chat app (Electron) consume 800MB just to idle. We spent the last decade using Moore's Law to subsidize lazy garbage collection, five layers of virtualization, and shipping entire web browsers as application runtimes. The computer is fast, but the software is drowning it.
Safari is still leaking memory too; I've had a single tab use 50GB, regardless of which website it was.
I bought right after this price curve hit, like the day after. I went into Microcenter for a new PC with 64GB of DDR5. The day before, their kits were ~$189. The day I bought, they were $360. Now the same kit on Newegg is $530.
It's been 2 weeks.
It will probably take a while, but is the general public going to be priced out of computers eventually?