> For comparison, Bitcoin mining did only about 2^111 bit operations in 2022. ("Only"!)
Anyone have a source for this? Google results suggest that in 2022 Bitcoin miners reached ~209 quintillion hashes (209 exahashes) per second. I don't know how many bit operations a Bitcoin hash (double SHA-256, not SHA-1) takes, but dividing 2^111 by 209 * 10^18 * 86400 * 365 gives about 393,891, which doesn't sound unreasonable as the number of bit operations per hash.
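That division is easy to check directly; the 209 EH/s hash rate is the estimate from the search results above, not an authoritative figure:

```python
# Sanity check: 2^111 bit operations spread over one year at 209 EH/s.
total_bit_ops = 2**111
hashes_per_second = 209 * 10**18
hashes_per_year = hashes_per_second * 86400 * 365
bit_ops_per_hash = total_bit_ops / hashes_per_year
print(f"{bit_ops_per_hash:.0f}")  # roughly 3.9 * 10^5 bit ops per hash
```

A few hundred thousand bit operations per double-SHA-256 hash is plausible given that each SHA-256 compression does 64 rounds of 32-bit arithmetic.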
Basically, it's fascinating that global compute is reaching those kinds of numbers. Even more fascinating is that it's just Bitcoin mining, so global total computations must be some multiple of that (3x? 10x? 100x?). These are numbers once considered (still considered?) unfathomable, let alone a quantity applicable to human endeavor. And that's 2022. Today the Bitcoin hash rate is 4.5x greater.
For power consumption I think the answer to all of these is "yes", except for the one where you split the clock buffers in half.
How about DNA replication in bacterial cells? Is that two bit operations per base? My pot of yogurt is 4 kg of mostly Lactobacillus casei, with a genome of about 2 million base pairs, 4 megabits, and a generation time of about 30 minutes, 2 kilobits per second of reproductive copying per bacterium, plus presumably a much higher transcription rate into mRNA. Each bacterium is about 5 cubic microns, so there are about 10¹⁴ bacteria in the pot, so about 10¹⁷ bit operations per second for reproduction, and maybe 10¹⁹ for mRNA, wildly guessing. That would make the pot of yogurt millions of times more computationally powerful than my CPU, though only for a few hours. Fortunately, the bacteria are more energy-efficient than AMD, or the yogurt would be exploding.
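The yogurt estimate can be reproduced step by step; every input here is one of the wild guesses from the paragraph above (2 bits per base, 30-minute generations, ~10^14 cells), not a measured value:

```python
# Back-of-envelope check of the yogurt-as-computer numbers.
genome_bits = 2 * 2_000_000              # 2 Mbp genome, 2 bits per base pair
generation_s = 30 * 60                   # 30-minute generation time
copy_rate = genome_bits / generation_s   # ~2.2 kbit/s of copying per bacterium

bacteria = 10**14                        # order-of-magnitude guess for the pot
total_bit_ops_per_s = copy_rate * bacteria
print(f"{total_bit_ops_per_s:.1e}")      # ~2.2e+17, i.e. ~10^17 bit ops/s
```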
But none of those operations can be used directly for cracking a key, because they aren't programmable. What the paper says is sensible, because it's comparing two things that are very much alike. Even though you can't use Bitcoin mining ASICs for key cracking, you can build very similar key cracking ASICs for a very similar cost and energy consumption. But things get very vague when you start trying to quantify all compute.
https://news.ycombinator.com/item?id=37756656 - Debunking NIST's calculation of the Kyber-512 security level (2023-10-03, 201 comments)
I'm not aware of either. I'd love to know whether NIST has formally acknowledged the arithmetic flaw. It's possible they have, and still believe Kyber-512 clears the required security level regardless.
If a system has parameters, another issue is whether different parameter choices require a different implementation. There are some reasons why a separate implementation might be desirable anyways in some cases, but sometimes it would be possible to change the parameters at run time.
Another consideration is patents; they should not recommend patented or secret algorithms. Cryptanalysis will be difficult if the specification is not freely available to anyone who wants to read it, and implementation can be a problem if patent licensing is required. Wikipedia says that NTRU is patented but "Security Innovation exempted open-source projects from having to get a patent license"; that might be good enough.
Wikipedia also says that Kyber is a key encapsulation mechanism but NTRU is a public key cryptosystem, so they would not be the same kind of thing, anyways. However, you could also use a public key cryptosystem as a key encapsulation mechanism if you have another method of making up a key securely at random. But, Wikipedia says "it is easier to design and analyze a secure KEM than to design a secure public-key encryption scheme as a basis" (I do not know the details of the quoted part to judge this, but the unquoted part seems obvious to me).
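The "PKE used as a KEM" idea is just: pick a random message, encrypt it under the public key, and hash it into the shared key. A minimal sketch, using textbook RSA with tiny fixed primes purely as a structural stand-in (it is insecure; a real design would use an IND-CCA-secure scheme like NTRU, plus the Fujisaki-Okamoto-style hardening that schemes like Kyber layer on top):

```python
import os, hashlib

# Toy RSA key: p=61, q=53, n=3233, e=17, d=2753. Illustrative only.
N, E, D = 3233, 17, 2753

def pke_encrypt(m_int):
    return pow(m_int, E, N)

def pke_decrypt(c_int):
    return pow(c_int, D, N)

def encaps():
    m = int.from_bytes(os.urandom(2), "big") % N   # random message
    ct = pke_encrypt(m)
    K = hashlib.sha256(f"{m}|{ct}".encode()).digest()  # shared key
    return ct, K

def decaps(ct):
    m = pke_decrypt(ct)
    return hashlib.sha256(f"{m}|{ct}".encode()).digest()
```

Both sides end up with the same 32-byte key: `ct, K = encaps()` on one side, `decaps(ct)` on the other.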
Another alternative might be using multiple algorithms with independent keys (to be secure, the keys will have to be independent; however, you might have to be careful that they really will be independent), e.g. by using Kyber first and then encrypting the result with NTRU. But, that depends on what your requirements are.
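Besides the cascade (encrypt-then-encrypt) construction described above, another common way to combine two algorithms with independent keys is to run both KEMs separately and feed both shared secrets through a KDF, so the combined key stays secret as long as either input does. A sketch, with illustrative names and context string:

```python
import hashlib, hmac

def combine_secrets(k_kyber: bytes, k_ntru: bytes) -> bytes:
    # Concatenate-then-KDF combiner: the output is unpredictable as
    # long as at least one input secret is, assuming HMAC-SHA256
    # behaves as a PRF. The context label domain-separates this use.
    return hmac.new(b"hybrid-kem-demo", k_kyber + k_ntru,
                    hashlib.sha256).digest()
```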
As another comment (https://news.ycombinator.com/item?id=37756656) mentioned, they may have different requirements than yours, such as hardware, so that is another issue.
None of that is an excuse for what NIST seems to be doing, though (according to the article); these are concerns in addition to those.
https://media.defense.gov/2025/May/30/2003728741/-1/-1/0/CSA...
If the U.S. Government is willing to bet the SECRET-and-above farm on particular cryptography standards and implementations, it’s probably safe for you to use them too.
pxeger1•7mo ago
And anyway why is there any reason to believe they really do use the system they say they use?
bigfatkitten•7mo ago
How do you think they could assess that they, and only they, will ever be able to exploit a particular cryptographic vulnerability at any time over the next few decades?
They can’t, they would be well aware of that, and they are extremely risk averse.
> And anyway why is there any reason to believe they really do use the system they say they use?
Because these systems exist widely throughout government today.
https://www.nsa.gov/Resources/Commercial-Solutions-for-Class...
https://www.disa.mil/-/media/files/disa/fact-sheets/dmcc-s-f...
bigfatkitten•7mo ago
The fact they are now is a relatively recent development, and it’s significant because they now have their own skin in the game whereas they previously did not.
jandrewrogers•7mo ago
[0] https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography