A relevant quote: "this is your daily reminder that 'How large is the biggest number it can factorize' is NOT a good measure of progress in quantum computing. If you're still stuck in this mindset, you'll be up for a rude awakening."
Related, from Dan Bernstein: https://blog.cr.yp.to/20250118-flight.html#moon
A relevant quote: "Humans faced with disaster tend to optimistically imagine ways that the disaster will be avoided. Given the reality of more and more user data being encrypted with RSA and ECC, the world will be a better place if every effort to build a quantum computer runs into some insurmountable physical obstacle"
And a reminder that in the world of non-QC computing, right from its very roots, the ability of computers improved in mind-bogglingly large steps every year.
QC records, on the other hand, apart from the odd statistic about how many qubits a machine has, have largely made no strides toward solving real-world-sized problems (with the exception of those that use QCs purely as analog computers to model quantum behavior).
Also, in the world of QC, right from its very roots, the ability of QCs has improved in mind-bogglingly large steps every year. You just can't see it if you only look at the wrong metric, i.e., factorization records.
It's a bit like saying "classical computing technology hasn't improved for 50 years; it's only recently that we've finally started to have programs that can write other programs."
That no significant factorization milestones have moved is a serious black eye for this field. Even worse, that no one has ever been able to truly run Shor's algorithm on even trivial numbers is a shocking indictment of the whole field.
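To make that complaint concrete, here is a minimal sketch (my own illustration, not from anyone in this thread) of the classical shell of Shor's algorithm, with the quantum order-finding subroutine replaced by brute force. The well-known demonstrations of factoring 15 or 21 used "compiled" circuits that exploit prior knowledge of the answer, which is why critics say the full algorithm has never truly been run end to end:

```python
# Toy sketch: the classical scaffolding of Shor's algorithm. The one step
# a quantum computer is supposed to perform (finding the order r of a
# modulo N) is simulated here by brute force.
from math import gcd
from random import randrange

def find_order_classically(a, N):
    """Stand-in for the quantum subroutine: smallest r with a^r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_shell(N, max_tries=20):
    """Reduce factoring N to order finding, as Shor's algorithm does."""
    if N % 2 == 0:
        return 2, N // 2
    for _ in range(max_tries):
        a = randrange(2, N - 1)
        g = gcd(a, N)
        if g > 1:                          # lucky guess already shares a factor
            return g, N // g
        r = find_order_classically(a, N)   # the step a QC is supposed to do
        if r % 2 == 1:
            continue                       # odd order: pick another a
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                       # trivial square root: pick another a
        p, q = gcd(y - 1, N), gcd(y + 1, N)
        if p > 1 and p * q == N:
            return p, q
    return None

print(shor_classical_shell(15))   # e.g. (3, 5)
```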
With QC, the risk (and I'm not saying this is going to happen, only that it's a risk that can't be overlooked) is that we transition from "QC can only factorize 15" to "RSA-2048 is broken" so suddenly that the industry has no time to adapt.
Imagine if you had crummy, unreliable transistors. You couldn't build any computing machine out of them.
Indeed, in the real world progress looked like:
* Useless devices (1947)
* Very limited devices (hearing aids)
* Hand-selected, lab devices with a few hundred transistors, computing things as stunts (1955)
* The IBM 1401, a practical transistorized computer (1959), once devices got reliable enough and ancillary technologies like packaging improved.
In other words, there was a pattern of many years of seemingly negligible progress and then a sudden step once the foundational component reached a critical point. I think that's the point the person you're talking to is making.
And then just a couple of years later we had the reliability to move to integrated circuits for logic.
If you looked at the "transistorized factorization record," it would have been static for several years before making a couple of jumps of several orders of magnitude each.
I think the problem is that “objective indicators pointing to the cliff” is pretty handwavy. Could there be a widely agreed-upon function of qubit fidelity, error rate, coherence time, and interconnections that measures, even coarsely, how far we are from the cliff? It seems like the cliff has been ten years away for a very long time, so you might forgive an outsider for believing there has been a lot of motion without progress.
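For what it's worth, here's a toy sketch of what such a function might look like: my own back-of-the-envelope version, not any agreed-upon industry metric. It uses the standard surface-code scaling heuristic (logical error rate ~ A*(p/p_th)^((d+1)/2), roughly 2*d^2 physical qubits per logical qubit, threshold p_th around 1%; the constants are ballpark assumptions) to turn a machine's physical qubit count and error rate into an estimate of usable logical qubits, which you can compare against the scale usually quoted for breaking RSA-2048 (thousands of logical qubits, ~20 million noisy physical qubits in the Gidney-Ekerå 2019 estimate):

```python
# Toy "distance to the cliff" metric: how many logical qubits could this
# machine support at a target logical error rate?
P_THRESHOLD = 1e-2   # surface-code threshold, order of magnitude
A = 0.1              # prefactor in the scaling heuristic (rough assumption)

def code_distance_needed(p_phys, p_logical_target):
    """Smallest odd code distance d whose heuristic logical error rate
    falls below the target."""
    d = 3
    while A * (p_phys / P_THRESHOLD) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

def logical_qubits_available(n_phys, p_phys, p_logical_target=1e-12):
    """Coarse estimate of usable logical qubits on a given machine."""
    if p_phys >= P_THRESHOLD:
        return 0                      # above threshold, error correction doesn't help
    d = code_distance_needed(p_phys, p_logical_target)
    return n_phys // (2 * d * d)      # ~2*d^2 physical qubits per logical qubit

# A present-day-scale machine vs. one at the physical qubit count usually
# quoted for RSA-2048.
print(logical_qubits_available(1_000, 1e-3))        # 1
print(logical_qubits_available(20_000_000, 1e-3))   # ~22,000
```

Nobody agrees on the constants, which is part of the questioner's point, but even a crude function like this makes "how far from the cliff" a number you can track over time instead of a feeling.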