
Will I ever own a zettaflop?

https://geohot.github.io//blog/jekyll/update/2026/01/26/own-a-zettaflop.html
42•surprisetalk•3d ago

Comments

androiddrew•1h ago
Not with the price of silicon being what it is
ge96•1h ago
Where are we at with the rat brain CPUs?
Avicebron•1h ago
We keep losing people to the sewers... some in the organization are speculating they might be building a human brain CPU to retaliate. Progress is slow.
supermdguy•1h ago
If all LLM advancements stopped today, but compute + energy got to the price where the $30 million zettaflop was possible, I wonder what outcomes would be possible. Would 1000 Claudes be able to coordinate in meaningful ways? How much human intervention would be needed?
echelon•1h ago
There's no way we're not living in a historical simulation.

This is all just such a crazy coincidence.

Everything is coming together so quickly.

trhway•57m ago
Look at the history of technology, and before that at biological history: how long it took to get from single cells to multicellular life versus, say, how long it took to get from the lizard brain to the human brain. Things naturally go exponential (my thinking on why: https://news.ycombinator.com/item?id=9418811), at least until they hit some wall, and so far hitting walls has mostly only stimulated even more advanced development.

There is an issue of the "non-uniformity of the spread of the future" with fast development, though: the faster the development, the stronger the non-uniformity and the tensions it creates. Strong non-uniformity and the resulting tensions tend to resolve catastrophically on their own at some point if not solved or smoothed some other way first.

throw0101d•1h ago
Somewhat related: why the creators of the Zettabyte File System (ZFS) decided to make it 128-bit (writing in 2004):

> Some customers already have datasets on the order of a petabyte, or 2^50 bytes. Thus the 64-bit capacity limit of 2^64 bytes is only 14 doublings away. Moore's Law for storage predicts that capacity will continue to double every 9-12 months, which means we'll start to hit the 64-bit limit in about a decade. Storage systems tend to live for several decades, so it would be foolish to create a new one without anticipating the needs that will surely arise within its projected lifetime.

* https://web.archive.org/web/20061112032835/http://blogs.sun....

And some math on what that means 'physically':

> Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)]. A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.

> To operate at the 10^31 bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc^2, the rest energy of 136 billion kg is 1.2x10^28 J. The mass of the oceans is about 1.4x10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4x10^6 J/kg x 1.4x10^21 kg = 3.4x10^27 J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans.

* Ibid.
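
A quick sanity check of the quoted arithmetic, as a rough sketch; every constant below is taken from the quote itself rather than an independent source:

    # Sanity check of the quoted ZFS numbers; all constants come from the quote above.
    doublings = 64 - 50            # 2^50 (petabyte) datasets to the 2^64 limit: 14 doublings
    # At 9-12 months per doubling, that's ~10-14 years, i.e. "about a decade".
    bits = 2 ** 140                # fully populated 128-bit pool, in bits
    mass_kg = bits / 1e31          # Lloyd's bound: ~1e31 bits per kg
    energy_j = mass_kg * (3e8) ** 2        # E = mc^2
    oceans_j = 2.4e6 * 1.4e21      # J/kg to boil water x mass of the oceans
    print(f"{mass_kg:.2e} kg")     # ~1.4e11 kg (quote: 136 billion kg)
    print(f"{energy_j:.2e} J")     # ~1.25e28 J (quote: 1.2x10^28 J, with exact c)
    print(f"{oceans_j:.2e} J")     # ~3.4e27 J, so the conclusion holds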

popol12•37m ago
Very interesting. Could someone please do the same computation for filling 64-bit storage?
Dylan16807•9m ago
You want someone to put "3.4*10^27 / 2^64" into a calculator? About 200 million joules, using all the same assumptions: 50 kWh. Though that leaves the question of how the energy requirements change when we're not going for extreme density.

If we instead consider a million 18 TB hard drives, and estimate they each need 8 watts for 20 hours to fill up, it's 160 MWh on modern hardware.
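
The same two estimates as a sketch (reusing the 3.4x10^27 J figure from the ZFS quote; the 18 TB / 8 W / 20 hour drive numbers are the parent comment's assumptions, not measured values):

    # 64-bit pool at the quoted physical limit, scaled down from the 128-bit figure.
    print(3.4e27 / 2 ** 64 / 3.6e6, "kWh")  # ~51 kWh (~1.8e8 J)
    # A million 18 TB drives (~2^64 bytes total), each drawing 8 W for 20 hours.
    print(1e6 * 8 * 20 / 1e6, "MWh")        # 160 MWh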

tbrownaw•2m ago
16 million terablocks, or 8 billion terabytes.

Or a third of a billion 24 TB drives, which is one of the larger sizes currently available.

Some random search results say the global hard drive market is around an eighth of a billion units, but of course much of that will be smaller sizes.

So that should be physically realizable today (well, with today's commercial technology), with only a few years of global production.

jandrewrogers•22m ago
Single data sets surpassed 2^64 bytes over a decade ago. This creates fun challenges since just the metadata structures can't fit in the RAM of the largest machines we build today.
jasonwatkinspdx•54s ago
Virtualization has pushed back the need for a while, but we are going to have to look at pointers larger than 64 bits at some point. It's also not just about the raw size of datasets: we get a lot of utility out of various memory-mapping tricks, so we consume more address space than the strict minimum required by the dataset. Also, if we move up to 128 bits, a lot more security mitigations become possible.
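
A toy illustration of that point (a sketch, not from the thread; the 16 GiB size and the /tmp path are arbitrary choices): mapping a large sparse file consumes far more virtual address space than physical memory or disk, which is how working sets end up needing more address bits than their raw size suggests.

    import mmap, os

    path = "/tmp/sparse.bin"
    with open(path, "wb") as f:
        f.truncate(1 << 34)             # 16 GiB sparse file: no blocks allocated yet
    with open(path, "r+b") as f:
        m = mmap.mmap(f.fileno(), 0)    # 16 GiB of virtual address space, ~0 RAM used
        m[0] = 1                        # touching a byte faults in a single page
        m.close()
    os.remove(path)
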
jmyeet•1h ago
I'm a big believer that humanity's future is in space, in a Dyson Swarm. There are simply too many advantages. It's estimated that humanity currently uses ~2x10^13 Watts of power, and about 1.7x10^17 Watts of solar energy hits the Earth, yet the Earth's cross-section intercepts less than a billionth of the Sun's total energy output. A Dyson Swarm would give us access to ~10^25 Watts of power. With our current population, that would give every person on Earth living space about equivalent to Africa and access to more energy than our entire civilization currently uses by orders of magnitude.

I bring this up to present an alternate view of the future that a lot of thought has gone into: the Matrioshka Brain. This is basically a Dyson Swarm but the entire thing operates as one giant computer. Some of the heat from inner layers is captured by outer layers for greater efficiency. That's the Matrioshka part.

How much computing power would this be?

It's hard to say, but estimates range from 10^40 to 10^50 FLOPS (e.g. [1]). At 10^45 FLOPS, that would give each person on Earth access to roughly 100 trillion zettaflops.

[1]: https://www.reddit.com/r/IsaacArthur/comments/1nzbhxj/matrio...
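
The per-capita figure checks out, give or take rounding (a sketch using the comment's own 10^45 FLOPS estimate and a population of ~8 billion):

    # Matrioshka Brain compute per person, using the estimates above.
    per_person = 1e45 / 8e9                  # ~1.25e35 FLOPS each
    print(per_person / 1e21, "zettaFLOPS")   # ~1.25e14, i.e. ~100 trillion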

ninkendo•33m ago
It makes me wonder about what it would take to actually create one.

You’d need self-replicating machines to build it, naturally. You’d need some ability for them to mine from asteroids and process the materials right there on the spot. And they’d need to be able to build both the processor “swarmlets” (probably some stamped-out solar/engine/CPU package) and more builders, so that the growth can be exponential. Oh, and the ability to turn solar energy into thrust somehow using only fuel you can get from the mined asteroids. Maybe a prerequisite is finding a solar system that has a huge and extremely uranium-rich asteroid belt.

You would need a CPU design that can be built with the kind of fidelity that a self-replicating machine in space under constant solar radiation can achieve. But if you can get the scale high enough, maybe you can just brute-force it and make machines on the computational scale of a Pentium 3, but there are 10^40 of them, so who cares. Maybe there’s a novel way of designing a durable computing machine out of hydrocarbons we have yet to discover.

The machines would have to self-replicate, and you’d need to store the instructions somewhere hardened, built out of materials commonly found in asteroids. Maybe hydrocarbons. Hell, may as well use RNA. These things need to be as good as humans at building stuff, so really this is just creating artificial “life” that has DNA and is made of cells that build the proteins needed to create the machine. Maybe they reproduce by spreading little DNA seeds that can attach to an asteroid with the right chemistry, and we just spew them into the cosmos at a candidate star and hope the process gets kickstarted. Hell, we could make it spew its own DNA at the next stars over as soon as it’s done. We’d have a whole galaxy computing for us; all we’d need is the right DNA instructions, the right capsule for them, and a way to launch them.

Maybe another civilization has already done this…

userbinator•11m ago
Dyson Swarm sounds like the name of an aggressive cleaning machine.
JackYoustra•1h ago
nit: it's a zettaflops, not a zettaflop
nl•58m ago
Firstly, True Names is an awesome read, and the real origin of cyberpunk. I much prefer it to Neuromancer or Diamond Age.

Secondly, I recently tried to work out which year's Top500 list[1] leader I could reasonably match for around US$5000. It's surprisingly difficult to work out, mostly because the list uses 64-bit flops and few other systems quote that number.

[1] https://top500.org/lists/top500/2025/11/

arthurjj•30m ago
I just want to thank the submitter. This is the type of internet that I really miss: a very smart person who's a good writer, proud of their interests and obsessions.
ipnon•3m ago
Yes, to paraphrase Jobs, I'm only interested in the intersection of Technology Avenue and Liberal Arts Street.
Sprotch•28m ago
And when it comes, people will use it for porn, memes, and to argue with each other in bad faith
mememememememo•2m ago
Saw zettaflop in the title. Knew it would be that guy!

Microsoft's PhotoDNA technology keeps flagging my face picture

https://www.elevenforum.com/t/microsoft-photodna-scanning-problem-it-is-comical-now.45961/
25•darkzek•45m ago•3 comments

Many African families spend fortunes burying their dead

https://davidoks.blog/p/how-funerals-keep-africa-poor
130•powera•3h ago•100 comments

Native Instant Space Switching on macOS

https://arhan.sh/blog/native-instant-space-switching-on-macos/
313•PaulHoule•6h ago•153 comments

How NASA Built Artemis II’s Fault-Tolerant Computer

https://cacm.acm.org/news/how-nasa-built-artemis-iis-fault-tolerant-computer/
75•speckx•10h ago•21 comments

Charcuterie – Visual similarity Unicode explorer

https://charcuterie.elastiq.ch/
123•rickcarlino•5h ago•19 comments

PicoZ80 – Drop-In Z80 Replacement

https://eaw.app/picoz80/
148•rickcarlino•7h ago•22 comments

Reverse engineering Gemini's SynthID detection

https://github.com/aloshdenny/reverse-SynthID
111•_tk_•5h ago•42 comments

Robots Eat Cars

https://telemetry.endeff.com/p/robots-eat-cars
44•JMill•2d ago•32 comments

Will I ever own a zettaflop?

https://geohot.github.io//blog/jekyll/update/2026/01/26/own-a-zettaflop.html
42•surprisetalk•3d ago•21 comments

Unfolder for Mac – A 3D model unfolding tool for creating papercraft

https://www.unfolder.app/
149•codazoda•8h ago•33 comments

Moving from WordPress to Jekyll (and static site generators in general)

https://www.demandsphere.com/blog/rebuilding-demandsphere-with-jekyll-and-claude-code/
40•rgrieselhuber•4h ago•17 comments

Generative Art over the Years

https://blog.veitheller.de/Generative_art_over_the_years.html
4•evakhoury•2d ago•0 comments

Hegel, a universal property-based testing protocol and family of PBT libraries

https://hegel.dev
83•PaulHoule•7h ago•30 comments

Old laptops in a colo as low cost servers

https://colaptop.pages.dev/
153•argentum47•7h ago•84 comments

Research-Driven Agents: When an agent reads before it codes

https://blog.skypilot.co/research-driven-agents/
130•hopechong•8h ago•43 comments

Top laptops to use with FreeBSD

https://freebsdfoundation.github.io/freebsd-laptop-testing/
281•fork-bomber•16h ago•161 comments

BunnyCDN has been silently losing our production files for 15 months

https://old.reddit.com/r/webdev/comments/1sglytg/bunnycdn_has_been_silently_losing_our_production/
111•speckx•3h ago•25 comments

How Close Is Too Close? Applying Fluid Dynamics Research Methods to PC Cooling

https://www.lttlabs.com/articles/2026/04/04/how-close-is-too-close-applying-fundamental-fluid-dyn...
10•LabsLucas•4d ago•2 comments

Reallocating $100/Month Claude Code Spend to Zed and OpenRouter

https://braw.dev/blog/2026-04-06-reallocating-100-month-claude-spend/
293•kisamoto•16h ago•203 comments

Microsoft is employing dark patterns to goad users into paying for storage?

https://lzon.ca/posts/other/microsoft-user-abuse/
217•jpmitchell•4h ago•119 comments

Show HN: I built a Cargo-like build tool for C/C++

https://github.com/randerson112/craft
118•randerson_112•9h ago•109 comments

Introduction to Nintendo DS Programming

https://www.patater.com/files/projects/manual/manual.html
217•medbar•1d ago•50 comments

Instant 1.0, a backend for AI-coded apps

https://www.instantdb.com/essays/architecture
92•stopachka•7h ago•54 comments

The Training Example Lie Bracket

https://pbement.com/posts/lie_brackets/
12•pb1729•3h ago•8 comments

EFF is leaving X

https://www.eff.org/deeplinks/2026/04/eff-leaving-x
1114•gregsadetsky•8h ago•935 comments

Vibe-Coded Ext4 for OpenBSD

https://lwn.net/Articles/1064541/
5•signa11•1h ago•2 comments

Show HN: Druids – Build your own software factory

https://github.com/fulcrumresearch/druids
25•etherio•1d ago•3 comments

A WebGPU implementation of Augmented Vertex Block Descent

https://github.com/jure/webphysics
121•juretriglav•13h ago•15 comments

Wit, unker, Git: The lost medieval pronouns of English intimacy

https://www.bbc.com/future/article/20260408-the-extinct-english-words-for-just-the-two-of-us
185•eigenspace•15h ago•120 comments

Progressive encoding and decoding of 'repeated' protobuffer fields

https://schilk.co/blog/protobuffer-repeat-append/
15•quarkz02•4d ago•2 comments