OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
553•klaussilveira•10h ago•157 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
876•xnx•15h ago•532 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
79•matheusalmeida•1d ago•18 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
8•helloplanets•4d ago•3 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
13•videotopia•3d ago•0 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
191•isitcontent•10h ago•24 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
190•dmpetrov•10h ago•84 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
303•vecti•12h ago•133 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
347•aktau•16h ago•169 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
347•ostacke•16h ago•90 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
75•quibono•4d ago•16 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
444•todsacerdoti•18h ago•226 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
242•eljojo•13h ago•148 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
46•kmm•4d ago•3 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
17•romes•4d ago•2 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
379•lstoll•16h ago•258 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
225•i5heu•13h ago•171 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
103•SerCe•6h ago•84 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
162•limoce•3d ago•85 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
131•vmatsiiako•15h ago•56 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
41•gfortaine•8h ago•11 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
63•phreda4•9h ago•11 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
20•gmays•5h ago•3 comments

Show HN: ARM64 Android Dev Kit

https://github.com/denuoweb/ARM64-ADK
14•denuoweb•1d ago•2 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
262•surprisetalk•3d ago•35 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1035•cdrnsf•19h ago•428 comments

Zlob.h 100% POSIX and glibc compatible globbing lib that is faster and better

https://github.com/dmtrKovalenko/zlob
6•neogoose•2h ago•3 comments

FORTH? Really!?

https://rescrv.net/w/2026/02/06/associative
56•rescrv•18h ago•19 comments

Show HN: Smooth CLI – Token-efficient browser for AI agents

https://docs.smooth.sh/cli/overview
85•antves•1d ago•63 comments

WebView performance significantly slower than PWA

https://issues.chromium.org/issues/40817676
20•denysonique•6h ago•3 comments

1GB Raspberry Pi 5, and memory-driven price rises

https://www.raspberrypi.com/news/1gb-raspberry-pi-5-now-available-at-45-and-memory-driven-price-rises/
170•shrx•2mo ago

Comments

zbendefy•2mo ago
What has changed in the memory landscape/AI workloads in recent months compared to summer or spring?
jsheard•2mo ago
Apparently OpenAI locked down 40% of the global DRAM supply for their Stargate project, which then caused everyone else to start panic-buying, and now we're here: https://pcpartpicker.com/trends/price/memory/
giancarlostoro•2mo ago
I don't even get this trend. Wouldn't OpenAI be buying ECC RAM only anyway? Who in their right mind runs this much infrastructure on non-ECC RAM? It makes no sense to me. Same with GPUs: they aren't buying your 5090s. People's perception is wild to me.
jsheard•2mo ago
OpenAI bought out Samsung's and SK Hynix's DRAM wafers in advance, so they'll prioritize producing whatever OpenAI wants to deploy, whether that's DDR/LPDDR/GDDR/HBM, with or without ECC. That means far fewer wafers for everything else, so even if you want a different spec you're still shit out of luck.
nirui•2mo ago
You forgot to mention that everyone else also raised their prices because, you know, who doesn't like free money?

Last year I bought two 8GB DDR3L RAM sticks made by Gloway for around $8 each; now the same stick is priced around $22, roughly 2.75x what I paid.

SSD makers are also increasing their prices, but that started one or two years ago, and they did it again recently (of course).

It looks like I won't be buying any new computers or parts until prices return to normal.

wqaatwt•2mo ago
> you know, who doesn't like free money?

Yes, but otherwise you'd get huge shortages and would be unlikely to be able to buy it at all. Also, a significant proportion of the surplus currently going to manufacturers would instead go to various scalpers and resellers.

raddan•2mo ago
I seriously doubt that single-bit errors at the scale of OpenAI's workloads really matter very much, particularly for a domain that is already noisy.
PunchyHamster•2mo ago
Until they hit your program memory. We just had a really interesting incident where one of the Ceph nodes didn't fail outright but started acting erratically, bringing the whole cluster to a crawl, once a failing RAM module developed some uncorrectable errors.

And that was only caught because we had ECC. If not for that we'd have been replacing drives, because the metrics made it look like one of the OSDs was slowing to a crawl, which is usually a sign of a dying drive.

Of course, the chance of that is pretty damn small, but their scale is also pretty damn big.

close04•2mo ago
Random bit flips are their best path to AGI.
drum55•2mo ago
At the chip level there’s no difference as far as I’m aware, you just have 9 bits per byte rather than 8 bits per byte physically on the module. More chips but not different chips.
cesarb•2mo ago
> you just have 9 bits per byte rather than 8 bits per byte physically on the module. More chips but not different chips.

For those who aren't well versed in the construction of memory modules: take a look at your DDR4 memory module, you'll see 8 identical chips per side if it's a non-ECC module, and 9 identical chips per side if it's an ECC module. That's because, for every byte, each bit is stored in a separate chip; the address and command buses are connected in parallel to all of them, while each chip gets a separate data line on the memory bus. For non-ECC memory modules, the data line which would be used for the parity/ECC bit is simply not connected, while on ECC memory modules, it's connected to the 9th chip.

(For DDR5, things are a bit different, since each memory module is split in two halves, with each half having 4 or 5 chips per side, but the principle is the same.)
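To make the 8-vs-9-chip arithmetic concrete, here is a rough Python sketch of a SECDED (single-error-correct, double-error-detect) extended Hamming code over 64 data bits plus 8 check bits, the same 72-bit ratio the extra chip on an ECC DIMM provides. Treat it as an illustration of the principle only; real memory controllers do this in hardware and may use different codings (or chipkill-style schemes).

    # Toy SECDED Hamming code: 64 data bits + 8 check bits = 72 bits,
    # the same ratio as an ECC DIMM's 9 chips vs 8. Illustrative only.

    PARITY_POSITIONS = (1, 2, 4, 8, 16, 32, 64)   # powers of two within 1..71

    def encode(data_bits):
        """data_bits: 64 ints (0/1) -> 72-bit codeword (index 0 = overall parity)."""
        assert len(data_bits) == 64
        word = [0] * 72
        it = iter(data_bits)
        for pos in range(1, 72):
            if pos & (pos - 1):                    # not a power of two: data bit
                word[pos] = next(it)
        for p in PARITY_POSITIONS:                 # even parity over covered bits
            word[p] = sum(word[i] for i in range(1, 72) if i & p) % 2
        word[0] = sum(word) % 2                    # overall parity -> SECDED
        return word

    def decode(word):
        """Return (64 data bits, status), correcting any single-bit error."""
        syndrome = 0
        for p in PARITY_POSITIONS:
            if sum(word[i] for i in range(1, 72) if i & p) % 2:
                syndrome |= p
        overall_ok = sum(word) % 2 == 0
        word = word[:]
        if syndrome and not overall_ok:            # single-bit error: fix it
            word[syndrome] ^= 1
            status = f"corrected bit at position {syndrome}"
        elif syndrome and overall_ok:
            status = "double-bit error detected (uncorrectable)"
        elif not syndrome and not overall_ok:
            status = "overall parity bit was flipped"
        else:
            status = "no error"
        data = [word[pos] for pos in range(1, 72) if pos & (pos - 1)]
        return data, status

    if __name__ == "__main__":
        import random
        data = [random.randint(0, 1) for _ in range(64)]
        noisy = encode(data)
        noisy[37] ^= 1                             # simulate a single bit flip
        fixed, status = decode(noisy)
        print(status, fixed == data)               # -> corrected ... True

With a bare parity bit per byte you could only detect a single flipped bit; the extra structure above is what lets a controller locate and silently correct it, bumping an error counter instead of crashing.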

MangoToupe•2mo ago
On the flip side, LLMs are so inconsistent you might argue ECC is a complete waste of money. But OpenAI wasting money is hardly anything new.
kvemkon•2mo ago
Using digital chips instead of some novel analog approach is an even greater waste.

> China's AI Analog Chip Claimed to Be 3000X Faster Than Nvidia's A100 GPU (04.11.2023)

https://news.ycombinator.com/item?id=38144619

> Q.ANT’s photonic chips – which compute using light instead of electricity – promise to deliver a 30-fold increase in energy efficiency and a 50-fold boost in computing speed, offering transformative potential for AI-driven data centers and HPC environments. (24.02.2025)

https://qant.com/press-releases/q-ant-and-ims-chips-launch-p...

MisterTea•2mo ago
ECC modules use the same chips as non ECC modules so it eats into the consumer market too.
officialchicken•2mo ago
Good point! But they are slightly more energy-hungry. At these scales I wonder if Stargate could get by with one fewer nuclear reactor simply by switching to non-ECC RAM.
Majromax•2mo ago
Penny-wise and pound foolish. Non-ECC RAM might save on the small amount of RAM power, but if a bit-flip causes a failed computation then an entire forwards/backwards step – possibly involving several nodes – might need to be redone.
coldtea•2mo ago
>but if a bit-flip causes a failed computation then an entire forwards/backwards step – possibly involving several nodes – might need to be redone.

Which for the most part would be an irrelevant cost of doing business compared to the huge savings from non-ECC, and given how inconsequential it is if some ChatGPT computation fails...

hylaride•2mo ago
Linus Torvalds was recently on Linus Tech Tips to build a new computer and he insisted on ECC RAM. Torvalds is convinced that memory errors are a much greater problem for stability than commonly reported, and that he has spent an inordinate amount of time chasing phantom bugs caused by them.

https://www.youtube.com/watch?v=mfv0V1SxbNA

coldtea•2mo ago
ECC RAM's utility is overblown. Major companies often use off-the-shelf, non-enterprise parts for huge server installations, including regular RAM. The rare bit flip is hardly a major concern at their scale, and for their specific purposes.
Glemkloksdjf•2mo ago
Do you have a source for this?

I would not want to rerun a whole job just because of bit flips, and bit flips become a lot more relevant the more servers you need.

wtallis•2mo ago
Most server CPUs require RDIMMs, and while non-ECC RDIMMs exist, they are not a high-volume product and are intended for workstations rather than servers. The used parts market would look very different if there were lots of large-scale server deployments using non-ECC memory modules.
crote•2mo ago
ECC memory is a bit like RAID: A consumer-level RAM stick will (traditionally) have 8 8-bit-wide chips operating basically in RAID-0 to provide 64-bit-wide access, whereas enterprise-level RAM sticks will operate with 9 8-bit-wide chips in something closer to RAID-4 or -5.

But they are all exactly the same chips. The ECC magic happens in the memory controller, not the RAM stick. Anyone buying ECC RAM for servers is buying on the same market as you building a new desktop computer.

embedding-shape•2mo ago
> Anyone buying ECC RAM for servers is buying on the same market as you building a new desktop computer.

Even when the sticks are completely incompatible with each other? I think servers tend to use RDIMMs while desktops use UDIMMs. Personally I'm not seeing as steep an increase in (B2B) RDIMM prices compared to the same stores selling UDIMMs (B2C), but I'm also looking at different stores tailored towards different types of users.

StrLght•2mo ago
The expensive part is DRAM chips. They drive prices for sticks.
kvemkon•2mo ago
> enterprise-level RAM sticks will operate with 9 8-bit-wide chips

Since DDR5 has 2 independent subchannels, 2 additional chips are needed.

KeplerBoy•2mo ago
The 5090 uses the same chip as the workstation RTX 6000.

Of course, OpenAI isn't buying those either, but rather B200 DGX systems; still, it's all the same process at TSMC.

throw46106186•2mo ago
It's kind of depressing to see that it takes just one asshole to screw up the entire electronics market. If you read this, Sam, FU.
b00g13bored•2mo ago
The proles will get dumb screens tethered to their sanctioned models; and we will be grateful!
MisterTea•2mo ago
I got one of the Newegg circulars in my email advertising a sweet little uATX AMD server board and got to thinking that my home FreeBSD server could use a CPU bump and more memory. As soon as I saw how much 128GB of DDR5 ECC would cost, my jaw dropped and I noped the fuck out. The cheapest 32GB modules are around $300, with others upwards of $500. I thought I was going to gift myself an early Christmas present. Depressing indeed.
buildbot•2mo ago
Indeed, it makes mini computers with soldered RAM end up looking quite cheap by comparison. HP will currently sell you 128GB AMD or Nvidia boxes for $1.7-2.8k depending on your flavor of choice. Not ECC though.
PunchyHamster•2mo ago
...while supplies last. Which won't be long when people do exactly that (hey, that mini PC is now cheaper than building a similar setup).
buildbot•2mo ago
Indeed…
JonChesterfield•2mo ago
The Strix Halo mini workstation has 128GB of ECC. Good machine. Source: https://h20195.www2.hp.com/v2/GetPDF.aspx/c09086887
buildbot•2mo ago
Necro reply - I don't think they actually shipped it with ECC... If you look up a Z1 Mini G1a repair video: https://youtu.be/1i_PfH05ekw?si=MpQ0Uc9QVwhgsFzi&t=267 it has 8 chips, not 10. It needs 10 for true ECC.
JonChesterfield•1mo ago
I should check this next time I take it apart, thanks for the heads up :)
buildbot•1mo ago
I'd be really curious to see - that would possibly sell me on one! Linux should also report EDAC being active in dmesg if it has ECC.
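Besides dmesg, the EDAC sysfs tree is a reasonable place to look, assuming the kernel has an EDAC driver loaded for that platform. A rough sketch, using the standard /sys/devices/system/edac layout; absence of entries isn't conclusive proof of non-ECC, since the driver may simply be missing:

    # Rough check for ECC reporting via the Linux EDAC sysfs tree.
    from pathlib import Path

    EDAC_MC = Path("/sys/devices/system/edac/mc")

    def edac_status():
        if not EDAC_MC.is_dir():
            return "no EDAC memory controllers registered (no ECC, or driver missing)"
        report = []
        for mc in sorted(EDAC_MC.glob("mc[0-9]*")):
            ce = (mc / "ce_count").read_text().strip()   # corrected error count
            ue = (mc / "ue_count").read_text().strip()   # uncorrected error count
            report.append(f"{mc.name}: corrected={ce} uncorrected={ue}")
        return "\n".join(report) or "EDAC present, but no controllers found"

    if __name__ == "__main__":
        print(edac_status())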
alias_neo•2mo ago
Exactly this.

I'd been planning to upgrade my desktop as a Christmas present for myself.

Now that I have the cash and was looking at buying my PCPartPicker list, the cost of the 64GB DDR5-6000 RAM I planned to buy has gone from £300-400 to £700-800+, a difference of almost the full price of the 9070 XT I just bought to go in the computer.

I guess I'll stick with my outdated AM4/X370 setup and make the best of the GPU upgrade until RAM prices stop being a complete joke.

baq•2mo ago
Literally every market is like that. If you've got market-cap amounts of money and place a market buy order for all of it, you'll quickly learn what slippage is.
KeplerBoy•2mo ago
That really isn't unprecedented. We need high RAM prices for manufacturers to expand fabs; supply then overshoots demand as the AI bubble contracts to some extent, and then we'll have cheap RAM once again. Classic cycle.
overfeed•2mo ago
> We need high RAM prices for manufacturers to expand fab

Manufacturers aren't dumb; they lost a lot of money in the last cycle and aren't playing that game anymore. No additional capacity is planned; OEMs are simply redirecting existing capacity towards high-margin products (HBM) instead of chasing fragile demand.

Glemkloksdjf•2mo ago
I understand hating on people like Musk who destroy human lives, but what is Sam Altman doing?

Is it because of the copyright status of images, or just because he bought RAM?

wqaatwt•2mo ago
Inefficiently (from society's perspective) allocating massive amounts of resources? Why he specifically is being singled out, I'm not that certain.
Glemkloksdjf•2mo ago
People should be happy that commercial entities invest that much money into compute, especially on HN.

This will leapfrog cancer research, materials research, etc.

AviationAtom•2mo ago
That DDR5-4800 2x16GB price trend is crazy. It tripled from August/September until now.
Oxodao•2mo ago
Even DDR4. I just checked: I bought a non-ECC 1x32GB stick for my homelab on August 25th, priced at 78€ on Amazon. The same offer is now at 229€. Yeah, I guess I'll wait before upgrading to 64GB then.
jsheard•2mo ago
I don't think DDR4 is even being manufactured anymore, so the rush is clearing out that inventory for good.
duskwuff•2mo ago
It is still being manufactured. Older memory standards continue to be manufactured long after they stop being used in computers, e.g. for use in embedded devices.
AviationAtom•2mo ago
It reminds me very much of the crypto mining craze, when there was a run on GPUs and one couldn't be had for any less than 5x its MSRP. I know that eventually passed, and so too will this, but it still sucks if you had been planning to purchase RAM or anything that needs it.
m4rtink•2mo ago
What will happen once the bubble pops and OpenAI can't pay for all the useless stuff they ordered?
defrost•2mo ago
Ideally, the consumer market gets flooded with surplus server-grade hardware at or below cost, flowing out in going-out-of-business fire sales.
PunchyHamster•2mo ago
There's not much use for the 100GB+ AI boards or server RAM for consumers, though homelab guys will be thrilled.

Enterprise-wise, used servers have kinda always been cheap (at least compared to MSRP, or even the after-discount price), just because there are enough companies that want the feel-good of having a warranty on their equipment and yeet it after 5 years.

zozbot234•2mo ago
Nowadays old-gen server hardware can be a viable alternative to a new HEDT or workstation, which would typically use top-of-the-line consumer parts. The price and performance are both broadly comparable.
hollerith•2mo ago
Isn't the typical server much noisier than, e.g., a high-end desktop (HEDT) with Noctua fans?
zozbot234•2mo ago
Depends how big the fans are. Tiny 1U rack-mountable hardware = lots of noise; huge fans = near silent with better heat removal capacity.
renewiltord•2mo ago
No, it's up to you to cool it. I use an Epyc-based system as a home server and you can't hear it. At a previous employer we built a cluster out of these and just water-cooled them. Very easy.

This is a chassis and fan problem, not a CPU problem. Some devices do need their own cooling if your case is not a rack mount; e.g. if you have a Mellanox chip, those run hot unless you cool them specifically. In a rackmount use case that happens anyway.

embedding-shape•2mo ago
> Apparently OpenAI locked down 40% of the global DRAM supply for their Stargate project

That sounds like a lot, and almost unbelievable, but the scale of all of this kind of sits in that space, so what do I know.

Nonetheless, where are you getting this specific number and story from? I've seen it echoed before, but no one has been able to trace it to any sort of reliable source that doesn't boil down to "secret insider writing on Substack".

jsheard•2mo ago
Samsung directly announced that OpenAI expects to procure up to 900,000 DRAM wafers every month. That number being 40% of global supply comes from third-party analysis, but the market is going to notice nearly a million wafers a month being diverted, however you slice it. That's a shitload of silicon.

https://news.samsung.com/samsung-and-openai-announce-strateg...

https://www.tomshardware.com/pc-components/dram/openais-star...
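For a rough sense of scale, taking the two figures above at face value (900,000 wafers a month and a ~40% share), the implied global DRAM output works out as follows; the share estimate is third-party analysis, so treat the result as indicative only:

    # Back-of-envelope arithmetic using only the figures quoted above.
    openai_wafers_per_month = 900_000    # from the Samsung announcement
    claimed_share_of_supply = 0.40       # from third-party analysis

    implied_global_supply = openai_wafers_per_month / claimed_share_of_supply
    print(f"{implied_global_supply:,.0f} wafers/month")   # -> 2,250,000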

ac29•2mo ago
> Samsung directly announced that OpenAI expects to procure up to 900,000 DRAM wafers every month

The article says: "OpenAI’s memory demand projected to reach up to 900,000 DRAM wafers per month", but not by when, or what current demand is. If this is based on OpenAI's >$1T of announced capex over the next 5 years, it's not clear that money will ever actually materialize.

IshKebab•2mo ago
Oof, the RAM in my computers is apparently worth more than I paid for the machines themselves...
ThrowawayTestr•2mo ago
DRAM prices have skyrocketed recently
re-thc•2mo ago
The clear winners of AI are memory makers.
orphea•2mo ago
And Nvidia.
jsheard•2mo ago
And TSMC (and ASML).

It's shovels all the way down.

jetbalsa•2mo ago
I bet the nerds making the PCBs, the jellybean parts, and the connectors are making a mint as well.
re-thc•2mo ago
Those 2 are close to monopolies anyway. Wouldn't matter either way.
okokwhatever•2mo ago
Stock price does not say the same :'(
pvdebbe•2mo ago
Oh, you think it's undervalued?
re-thc•2mo ago
If you believe the US government will reopen the door to China sales, then yes. Highly rumored to be a thing soon.
okokwhatever•2mo ago
Source?
PunchyHamster•2mo ago
You can't just build a new fab in a year and capitalise on the spike, and most big investors know it.
re-thc•2mo ago
They "can". It happened during COVID and they got trapped by it so they're not taking the bite anymore.
re-thc•2mo ago
Nvidia just got hit by Broadcom/Google on TPUs, and there's also AMD at its back. Not so simple.
bigyabai•2mo ago
If Google and AMD are the biggest threats to CUDA's monopoly, I'd argue Nvidia has nothing to worry about.
linguae•2mo ago
It’s sad to see the one area of life that has long resisted inflation (computing) now succumb to inflationary forces. Other than emergency situations such as COVID-19, I’m used to seeing prices going down over time for computers and their components. It’s one of the rare bright spots when everything else is escalating in price, and now that’s disappearing.
wqaatwt•2mo ago
> time for computers and their components

It seems to have been the opposite for some components, like GPUs, for years (well before the AI boom).

georgemcbay•2mo ago
Speaking as someone who used to buy them regularly to support a PC gaming hobby stretching back to the original glQuake -- GPUs were on average very reasonably priced prior to the crypto boom that preceded the AI boom.

So it's technically not AI "ruining everything" here, but there was a nice, long before-time of reasonable pricing.

ifwinterco•2mo ago
It was always subject to inflationary forces from money printing like everything else; it was just the one place where natural deflation from improving technology was temporarily enough to offset them.
MrBuddyCasino•2mo ago
Memory price fluctuations due to market demand and monetary inflation - the increase in quantity of fiat money, diluting its value - are two separate and unrelated things.
coldtea•2mo ago
And both are at play here - it's not just RAM, so you can't ascribe it all to AI.
PunchyHamster•2mo ago
It's not inflation though? It's just a rise in demand.
coldtea•2mo ago
Seriously doubt it.
smallmancontrov•2mo ago
When Sam Altman buys 40% of global DRAM wafer production, that looks like a demand increase to the market.
BrianGragg•2mo ago
Not to mention the other companies panic-buying another ~20% (a guess).
coldtea•2mo ago
I've seen prices for memory, SSDs, Thunderbolt hubs, and Thunderbolt/high-end USB cables flatline or get worse over the last 3 or so years.
jack_tripper•2mo ago
Most in-demand electronics got worse post-2020 and haven't recovered.

That's why I just buy something when I need it or when I think the price is reasonable, because nowadays, if I wait for something to get cheaper like I used to do in the 90s-00s, chances are it's gonna get even more expensive as time passes, not cheaper.

The days when you would wait 6-12 months and get the same thing for 50% off, or a new thing with 50% more performance for the same price, are over now that there's only one major semiconductor fab making everything, 3 RAM makers, 3 flash makers, 2 GPU vendors, and 2 CPU vendors controlling all supply, and I'm competing with datacenters for it.

bpye•2mo ago
> 2 GPU vendors

Intel Arc GPUs exist, I have a B580 in my desktop and it works well enough.

coldtea•2mo ago
Parent means "2 vendors of GPUs that actualy matter".
Glemkloksdjf•2mo ago
There was the issue of hard disk prices for years after the floods in Thailand in 2011.

GPU prices were horrendous when crypto happened (it settled into a persistent issue, but it was still because of crypto).

DDR4 prices jumped because manufacturers started focusing on DDR5, even before the current news.

I could probably find more examples, but hey.

Damogran6•2mo ago
In general, your dollar buys a _crazy_ amount of compute... but over the last 30 years or so, RAM has spiked several times (the Taiwan plant fire) and suffered from several market-driven spikes (DDRx shortages, Apple's crazy pricing structure).
dagmx•2mo ago
Firstly, this is not due to inflation. The price increase is explicitly (per the article even) due to increased market demand that is causing raised prices.

Secondly, computing has always been subject to inflation. It cannot escape inflation. You may not notice it, perhaps due to the increase in performance, but the cost of parts definitely has risen in the same tiers if you look over a long enough period to avoid pricing amortization.

Workaccount2•2mo ago
Inflation simply refers to the rate at which prices are increasing. It's agnostic as to the origin (any single cause or combination of demand increases, supply shortages, money printing, price fixing, etc.).
dagmx•2mo ago
Only if you phrase it devoid of any context.

And if the definition was that loose to begin with, then the original comment is even more incorrect, since there have been multiple rounds of demand/scarcity-led pricing increases.

rahimnathwani•2mo ago
Inflation isn’t just "prices increasing". It’s the sustained, broad-based rise in the overall price level. Your comment treats any price increase as inflation, but economists draw a pretty clear line here: a relative price change (say, eggs getting more expensive because of a supply shock) isn’t the same thing as inflation. You can have sector-specific increases (as in this case, with RAM) that are independent of changes in the general price level.
sshine•2mo ago
> the cost of parts definitely has risen in the same tiers if you look over a long enough period

This is especially apparent if you’re a hardware manufacturer and have to buy the same components periodically, since the performance increase that consumers see doesn’t appear.

bigbadfeline•2mo ago
> if you... buy the same components periodically... the performance increase that consumers see doesn’t appear.

Good point, and that should properly be called inflation in the semiconductor sector. We always have general inflation, but different sectors of the economy exhibit different rates of inflation depending on the driving forces and their strength.

As of today, tariffs are the major driver of inflation, and semiconductors are hit hard because the only high-volume, reasonable quality/price country has been practically excluded from the US market by export bans and prohibitively high tariffs - that's China, of course.

The other producers are in a near-monopoly situation and are also acting like a cartel, without shame or fear of the law... which isn't there to begin with.

nullsmack•2mo ago
One area of computing that has dramatically risen in price in the past 10-15 years or so is GPUs. Cards that fit into the midrange today sell for prices that would've been considered ultra-high-end just a few GPU generations ago. Today's high-end prices for just a graphics card are higher than what I've paid for entire computers in the past. It's ridiculous.
adgjlsfhk1•2mo ago
IMO this is wrong. What's changing is how high-end having a discrete GPU is. As integrated graphics have become increasingly powerful, the discrete GPU has shifted to being more of a luxury item.
martythemaniak•2mo ago
Those price increases seem pretty reasonable given the shitty situation. I bought a Jetson 8GB a few weeks ago for $350 CAD from Amazon; I just checked that same listing and it's now $430.
MisterTea•2mo ago
What surprises me the most is the 1GB option is even viable, though I can imagine this will be for IoT users who shove Pis into things doing embedded stuff, where a kernel with a few userspace programs, along with maybe a container, does all the work.
pjerem•2mo ago
Probably, but I fail to see what use case doesn't need more than 1GB yet can't already be handled by a Pi 3B or 4.
Mashimo•2mo ago
I have a Pi 3B in a 3D printer, and compiling the software, or even simply running apt upgrade, feels like it takes forever. Most day-to-day operations work just fine though.

At work we have a display with a Pi 3 (not the B) connected, just showing websites in rotation. Websites with even a simple animation are laggy, and startup takes a few minutes.

Neither of these use cases needs more than 1 GB of RAM, but I want the speed of a 4 or 5.

dwedge•2mo ago
The 4 and 5 are pretty laggy too. An improvement, but slow.
geerlingguy•2mo ago
Usually it's just "same thing but faster". CPU is 2-3x faster, and even boot speed is faster, so it can be handy to not have to wait so long to run updates, compile something, reboot, etc.
shrx•2mo ago
Well to be honest, I'm doing just fine with my 1 GB Pi3B home server. Sure, another gigabyte wouldn't hurt, but I'm able to run influxd, zigbee2mqtt, telegraf, grafana, homeassistant (containerized), mpd and navidrome on it without issues.
overfeed•2mo ago
> What surprises me the most is the 1GB option is even viable...

There are plenty of non-IoT use cases that are viable with 1GB of general-purpose compute. Hell, I rented an obscenely cheap 512MB VPS until recently, and only abandoned it because its ancient kernel version was a security risk.

Most of my RPi tasks are not memory-bound

segmondy•2mo ago
You would be surprised to find out some of us are doing fine with a 512MB Pi Zero W.
MarkusWandel•2mo ago
Nothing wrong with this. Some applications really are compute-bound and don't need much RAM, such as a homemade surveillance camera system I have, presently running on a couple of Raspi 4s. Suppose I wanted to upgrade to a Raspi 5 - why spend extra money on RAM that's not needed? These things run headless, with the only GUI exposed via a web server.
aynyc•2mo ago
I don't really blame them, but my question is: if RAM prices go down, will RPi drop its prices? My experience with other companies is no.
GlacierFox•2mo ago
These scenarios end up being tests of what people will pay. If people are buying your product at a ridiculous price, why drop it?
AlexandrB•2mo ago
They will if they have competitors who undercut them. Otherwise, no.
Workaccount2•2mo ago
Price is an optimization problem: if you raise prices and profits increase, your product was likely too cheap. If you raise prices and profits decrease ("lol, I'm not paying $XYZ for an RPi when the clone is $ABC"), you are charging too much.

There are myriad other factors that go into this, especially general inflation, which will likely fill the price gap by the time memory costs come down anyway.

dwedge•2mo ago
In my opinion, RPis have been living off their name and first-to-market status for a long time now, with exaggerated low-power claims, and there may come a point where your "too high" scenario happens.

I know I'm comparing apples to oranges here (new to used), but I started buying used 1L PCs instead (Lenovo ThinkCentres) for about $20 more than the cost of an RPi 5 - but with the benefit of actually coming with the cooling and storage needed to run, being upgradable, and running Intel.

The number of times I've had a Pi just self-destruct on me is ridiculous. They are known for melting SD cards, and just this week I had one blow its power regulator over USB power and still get hot enough within 2 minutes to burn me when I touched it. They are considered cheap commodity computing, and they aren't cheap enough for that any more.

LIV2•2mo ago
Of course not
pogue•2mo ago
RPi Locator is a great service if you're looking to buy a Pi you can afford.

https://rpilocator.com/

Tepix•2mo ago
The article mentions "the $10 Raspberry Pi Zero". I feel this is rewriting history. The Raspberry Pi Zero was $5 when it was released back in 2015. It was mostly out of stock, but I did manage to get one unit at that price eventually.

Nowadays you can no longer get the Raspberry Pi Zero for less than 12€ or so. I consider the $5 Raspberry Pi Zero to be among the best values ever on the market, and nothing else has come close.

ta9000•2mo ago
$5 in 2015 is worth $7 in 2025 dollars. Combine that with higher memory prices and overall increases in supply chain costs/tariffs, and I really don’t see $10 as being that bad.
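As a rough check on that figure, assuming somewhere around 35% cumulative US CPI inflation between 2015 and 2025 (a ballpark assumption, not an official number), the arithmetic comes out close to the quoted $7:

    # Approximate inflation adjustment for the $5 launch price; the cumulative
    # inflation factor is an assumed ballpark, not an official CPI figure.
    launch_price_2015 = 5.00
    assumed_cumulative_inflation = 0.35

    in_2025_dollars = launch_price_2015 * (1 + assumed_cumulative_inflation)
    print(f"${in_2025_dollars:.2f}")   # ~$6.75, close to the $7 quoted above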
darqis•2mo ago
Starting to hate OpenAI. Them and their trillion-dollar deals with data centers and GPU manufacturers.
sentrysapper•2mo ago
Starting to? Brother where hast thou been?
Audiophilip•2mo ago
What do you think - when will RAM prices come back down again? Years? Months?
marethyu•2mo ago
$45 USD is equivalent to about $63 CAD. This is crazy considering that I bought the 4GB one last year for $70 CAD.
hxorr•2mo ago
On the bright side, hopefully rising memory prices will give Microsoft and its ilk the kick up the pants they need to reduce memory usage in Windows et al
stuaxo•2mo ago
I guess RAM compression is going to be back in fashion for a while.