frontpage.

Start all of your commands with a comma

https://rhodesmill.org/brandon/2009/commands-with-comma/
193•theblazehen•2d ago•56 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
679•klaussilveira•14h ago•203 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
954•xnx•20h ago•552 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
125•matheusalmeida•2d ago•33 comments

Jeffrey Snover: "Welcome to the Room"

https://www.jsnover.com/blog/2026/02/01/welcome-to-the-room/
25•kaonwarb•3d ago•21 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
62•videotopia•4d ago•2 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
235•isitcontent•15h ago•25 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
39•jesperordrup•5h ago•17 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
227•dmpetrov•15h ago•121 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
332•vecti•17h ago•145 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
499•todsacerdoti•22h ago•243 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
384•ostacke•21h ago•96 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
360•aktau•21h ago•183 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
292•eljojo•17h ago•182 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
21•speckx•3d ago•10 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
413•lstoll•21h ago•279 comments

ga68, the GNU Algol 68 Compiler – FOSDEM 2026 [video]

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
6•matt_d•3d ago•1 comment

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
20•bikenaga•3d ago•10 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
66•kmm•5d ago•9 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
93•quibono•4d ago•22 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
260•i5heu•17h ago•202 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
33•romes•4d ago•3 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
38•gmays•10h ago•13 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1073•cdrnsf•1d ago•459 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
60•gfortaine•12h ago•26 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
291•surprisetalk•3d ago•43 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
150•vmatsiiako•19h ago•71 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
8•1vuio0pswjnm7•1h ago•0 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
154•SerCe•10h ago•144 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
187•limoce•3d ago•102 comments

Debunking NIST's calculation of the Kyber-512 security level (2023)

https://blog.cr.yp.to/20231003-countcorrectly.html
59•RA2lover•7mo ago

Comments

perching_aix•7mo ago
That's pretty messed up, guess that's another sobering fact for the pile. I'd have expected that serious security stuff always involves mechanized math proofs every step of the way, making such silly mischaracterizations impossible. Not a fun thing to learn that this is not what happens.
I_dream_of_Geni•7mo ago
Not only messed up, but I am guessing that there are either politics involved (personal gain, friends of friends, etc.), or somebody paid somebody to push Kyber over NTRU. Which is difficult or impossible to prove, ESPECIALLY if that "person" is a senator or "other". (Since I failed civics, I have no idea what forces are involved in something like this, but it all sounds really fishy.)
drob518•7mo ago
Senators can’t even spell crypto.
kragen•7mo ago
Historically the NSA has sabotaged public cryptography standards so that it could crack them, while adversaries hopefully couldn't. It pays its employees to do this. It seems plausible that that's what's going on here, but even if so, whether that's because they know of a fatal weakness in NTRU they fear adversaries will exploit, or know of one in Kyber that they hope to exploit themselves, is anybody's guess.
bigfatkitten•7mo ago
NSA makes public their own policy for national security systems.

https://media.defense.gov/2025/May/30/2003728741/-1/-1/0/CSA...

If the U.S. Government is willing to bet the SECRET-and-above farm on particular cryptography standards and implementations, it’s probably safe for you to use them too.

pxeger1•7mo ago
If NSA and only NSA can crack a particular system, they probably wouldn't mind using it for their own secrets.

And anyway why is there any reason to believe they really do use the system they say they use?

bigfatkitten•7mo ago
> If NSA and only NSA can crack a particular system, they probably wouldn't mind using it for their own secrets.

How do you think they could assess that they, and only they, will ever be able to exploit a particular cryptographic vulnerability at any time over the next few decades?

They can’t, they would be well aware of that, and they are extremely risk averse.

> And anyway why is there any reason to believe they really do use the system they say they use?

Because these systems exist widely throughout government today.

https://www.nsa.gov/Resources/Commercial-Solutions-for-Class...

https://www.disa.mil/-/media/files/disa/fact-sheets/dmcc-s-f...

kragen•7mo ago
What they've been doing consistently for the last 50 years counts for more than what they say today.
bigfatkitten•7mo ago
They haven’t been using commercial cryptography to protect classified information for 50 years.

The fact they are now is a relatively recent development, and it’s significant because they now have their own skin in the game whereas they previously did not.

jandrewrogers•7mo ago
FWIW, the US government actively develops and maintains a suite of classified cryptography algorithms[0] which are completely separate from the suite of algorithms they publish publicly. The reason for the existence of Suite A algorithms has never really been explained. I’ve heard rumors that it contains capabilities not known in public cryptographic algorithms, but that’s speculation.

[0] https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography

bigfatkitten•7mo ago
They do, and there are a lot of situations in which those algorithms are not usable, such as on mobile devices, hence the introduction of Suite B and now CNSA.
matthewdgreen•7mo ago
This has been discussed before on HN when it was first published. I don’t remember the resolution years later, but it was discussed on the PQC mailing lists. The missing context here is that many people had submissions and academic rivalries can be very bitter.
wahern•7mo ago
This stood out to me:

> For comparison, Bitcoin mining did only about 2^111 bit operations in 2022. ("Only"!)

Anyone have a source for this? Google results suggest that in 2022 Bitcoin miners reached ~209 quintillion hashes (209 exahashes) per second. I don't know how many bit operations SHA-1 takes, but dividing 2^111 by 209 * 10^18 * 86400 * 365 gives 393891, which doesn't sound unreasonable for the number of bit operations per SHA-1 hash.
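
A quick way to re-check that division, using only the figures above (2^111 total bit operations and ~209 EH/s sustained for a year) as rough inputs:

    # Sanity check of the bit-operations-per-hash estimate above.
    total_bit_ops = 2**111                      # figure quoted from the article
    hash_rate = 209e18                          # ~209 exahashes/second (2022 estimate above)
    hashes_per_year = hash_rate * 86400 * 365
    print(round(total_bit_ops / hashes_per_year))  # ~393891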

Basically, it's fascinating that global compute is reaching those kinds of numbers. Even more fascinating is that it's just Bitcoin mining, so global total computations must be some multiple of that (3x? 10x? 100x?). These are numbers once considered (still considered?) unfathomable, let alone a quantity applicable to human endeavor. And that's 2022. Today the Bitcoin hash rate is 4.5x greater.

kragen•7mo ago
It's hard to know what to count as "global compute". How many bit operations do we count for the clock propagation across your CPU? Two for each clock cycle per buffer? Even though the bit operation is just identity, or do we omit that? Does it change if you use inverting buffers, since NOT is a nontrivial operation? Did you know that in CMOS a normal buffer is made out of two inverters? Can you do twice as many bit operations just by using buffers that are half as big, so that you have to use twice as many? How about DRAM destructive read and refresh cycles? Do you count the bit operations in the TLB CAM and the caches to test if entries are already present? Then going to a higher associativity, like from two-way to four-way, doubles the bit operation count.

For power consumption I think the answer to all of these is "yes", except for the one where you split the clock buffers in half.

How about DNA replication in bacterial cells? Is that two bit operations per base? My pot of yogurt is 4 kg of mostly Lactobacillus casei, with a genome of about 2 million base pairs, 4 megabits, and a generation time of about 30 minutes, 2 kilobits per second of reproductive copying per bacterium, plus presumably a much higher transcription rate into mRNA. Each bacterium is about 5 cubic microns, so there are about 10¹⁴ bacteria in the pot, so about 10¹⁷ bit operations per second for reproduction, and maybe 10¹⁹ for mRNA, wildly guessing. That would make the pot of yogurt millions of times more computationally powerful than my CPU, though only for a few hours. Fortunately, the bacteria are more energy-efficient than AMD, or the yogurt would be exploding.
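
Redoing that order-of-magnitude arithmetic with the figures given above (genome size, generation time, and cell count are all taken from the comment; the mRNA number stays a wild guess):

    # Order-of-magnitude check of the yogurt estimate.
    genome_bits = 2e6 * 2        # ~2 million base pairs, 2 bits per base
    generation_s = 30 * 60       # ~30 minute generation time
    bits_per_sec = genome_bits / generation_s   # ~2.2 kbit/s per bacterium
    n_bacteria = 1e14            # ~5 cubic-micron cells in a 4 kg pot, per the estimate above
    print(f"{bits_per_sec:.0f} bit/s per cell, {bits_per_sec * n_bacteria:.0e} bit ops/s total")
    # -> 2222 bit/s per cell, 2e+17 bit ops/s total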

But none of those operations can be used directly for cracking a key, because they aren't programmable. What the paper says is sensible, because it's comparing two things that are very much alike. Even though you can't use Bitcoin mining ASICs for key cracking, you can build very similar key cracking ASICs for a very similar cost and energy consumption. But things get very vague when you start trying to quantify all compute.

fc417fc802•7mo ago
Presumably "global compute" in this context refers to activities of similar complexity carried out with digital electronic devices that produce a similarly useful output. Obviously bitcoin is some fraction of global compute; it's interesting to wonder what the (approximate) total might be.
pbsd•7mo ago
This circuit [1] puts it at <=135k bit operations. Bitcoin uses SHA-256, not SHA-1.

[1] https://nigelsmart.github.io/MPC-Circuits/sha256.txt

nullc•7mo ago
Bitcoin's proof of work uses SHA-256(SHA-256(x)). Combining that with your figures reduces the differences to well within the minutiae of how you count bit operations and exactly which tradeoffs the circuits make.
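
Putting the two numbers from this subthread side by side (the ~394k figure is the per-hash estimate from the hash-rate division upthread; the <=135k circuit bound is doubled for the two SHA-256 invocations in the proof of work):

    # Compare the implied bit ops per PoW hash with 2x the SHA-256 circuit bound.
    implied_per_pow_hash = 393891    # from the hash-rate division upthread
    circuit_bound = 135_000          # <=135k bit operations per SHA-256 (linked circuit)
    print(implied_per_pow_hash / (2 * circuit_bound))   # ~1.46, i.e. the same ballpark
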
omoikane•7mo ago
Previously:

https://news.ycombinator.com/item?id=37756656 - Debunking NIST's calculation of the Kyber-512 security level (2023-10-03, 201 comments)

ggm•7mo ago
You would expect a decent rebuttal if wrong, and an acknowledgement if correct.

I'm not aware of either. I'd love to know if NIST has formally accepted their arithmetic flaw. It's possible they did, and believe the security level is still north of what's needed, so they support Kyber-512 regardless.

zzo38computer•7mo ago
I can see that they describe many problems with what NIST is doing. One question is: is someone bribing (or otherwise coercing) them? If so, is that why they are being deceptive, and why they would not respond to (or explain) some things?

If a system has parameters, another issue is whether a different parameter set requires a different implementation. There are some reasons why a separate implementation might be desirable anyway in some cases, but sometimes it would be possible to change the parameters at run time.

Another consideration is patents; they should not recommend patented or secret algorithms. Cryptanalysis will be difficult if the specification is not freely available to anyone who wants to read it, and implementation can be a problem if patent licensing is required. Wikipedia says that NTRU is patented but "Security Innovation exempted open-source projects from having to get a patent license"; that might be good enough.

Wikipedia also says that Kyber is a key encapsulation mechanism but NTRU is a public-key cryptosystem, so they would not be the same kind of thing anyway. However, you could also use a public-key cryptosystem as a key encapsulation mechanism if you have another method of making up a key securely at random. But Wikipedia says "it is easier to design and analyze a secure KEM than to design a secure public-key encryption scheme as a basis" (I do not know the details of the quoted part well enough to judge it, but the unquoted part seems obvious to me).

Another alternative might be using multiple algorithms with independent keys (to be secure, the keys will have to be independent; however, you might have to be careful that they really will be independent), e.g. by using Kyber first and then encrypting the result with NTRU. But, that depends on what your requirements are.
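
A common variant of that multiple-algorithms idea combines the two schemes' shared secrets through a hash rather than nesting the encryptions. A minimal sketch, assuming you already ran two independent KEMs and hold their shared secrets and ciphertexts as bytes (a production combiner would use a proper KDF and also bind in the public keys):

    import hashlib

    # Hedged sketch: derive one session key from two independent KEM runs
    # (e.g. Kyber and NTRU), so the result stays secret if either scheme holds.
    def combine_shared_secrets(ss_kyber, ct_kyber, ss_ntru, ct_ntru):
        h = hashlib.sha3_256()
        for part in (ss_kyber, ct_kyber, ss_ntru, ct_ntru):
            h.update(len(part).to_bytes(4, "big"))  # length-prefix each field
            h.update(part)
        return h.digest()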

As another comment (https://news.ycombinator.com/item?id=37756656) mentioned, they may have different requirements than yours, such as hardware, so that is another issue.

None of that is an excuse for what NIST seems to be doing, though (according to the article); these are additional concerns beyond those.

rurban•7mo ago
NIST is basically NSA and CIA. I wouldn't trust them a single bit.