frontpage.

Willow quantum chip demonstrates verifiable quantum advantage on hardware

https://blog.google/technology/research/quantum-echoes-willow-verifiable-quantum-advantage/
212•AbhishekParmar•2h ago•116 comments

HP SitePrint

https://www.hp.com/us-en/printers/site-print/layout-robot.html
44•gjvc•49m ago•15 comments

Cryptographic Issues in Cloudflare's Circl FourQ Implementation (CVE-2025-8556)

https://www.botanica.software/blog/cryptographic-issues-in-cloudflares-circl-fourq-implementation
106•botanica_labs•3h ago•50 comments

Meta is axing 600 roles across its AI division

https://www.theverge.com/news/804253/meta-ai-research-layoffs-fair-superintelligence
131•Lionga•1h ago•67 comments

Look, Another AI Browser

https://manuelmoreale.com/thoughts/look-another-ai-browser
31•v3am•52m ago•6 comments

Introducing Galaxy XR, the first Android XR headset

https://blog.google/products/android/samsung-galaxy-xr/
63•thelastgallon•1h ago•65 comments

Linux Capabilities Revisited

https://dfir.ch/posts/linux_capabilities/
110•Harvesterify•4h ago•19 comments

Bild AI (YC W25) Is Hiring a Founding AI Engineer

https://www.ycombinator.com/companies/bild-ai/jobs/m2ilR5L-founding-engineer-applied-ai
1•rooppal•1h ago

JMAP for Calendars, Contacts and Files Now in Stalwart

https://stalw.art/blog/jmap-collaboration/
10•StalwartLabs•41m ago•0 comments

MinIO stops distributing free Docker images

https://github.com/minio/minio/issues/21647#issuecomment-3418675115
520•LexSiga•11h ago•317 comments

Designing software for things that rot

https://drobinin.com/posts/designing-software-for-things-that-rot/
106•valzevul•19h ago•20 comments

AI assistants misrepresent news content 45% of the time

https://www.bbc.co.uk/mediacentre/2025/new-ebu-research-ai-assistants-news-content
286•sohkamyung•4h ago•219 comments

Scripts I wrote that I use all the time

https://evanhahn.com/scripts-i-wrote-that-i-use-all-the-time/
42•speckx•3h ago•8 comments

SourceFS: A 2h+ Android build becomes a 15m task with a virtual filesystem

https://www.source.dev/journal/sourcefs
84•cdesai•5h ago•27 comments

The security paradox of local LLMs

https://quesma.com/blog/local-llms-security-paradox/
82•jakozaur•5h ago•49 comments

Show HN: Create interactive diagrams with pop-up content

https://vexlio.com/features/interactive-diagrams-with-popups/
4•ttd•3h ago•0 comments

Internet's biggest annoyance: Cookie laws should target browsers, not websites

https://nednex.com/en/the-internets-biggest-annoyance-why-cookie-laws-should-target-browsers-not-...
403•SweetSoftPillow•5h ago•429 comments

Die shots of as many CPUs and other interesting chips as possible

https://commons.wikimedia.org/wiki/User:Birdman86
158•uticus•4d ago•30 comments

The Logarithmic Time Perception Hypothesis

http://www.kafalas.com/Logtime.html
22•rzk•2h ago•8 comments

Chess engines didn't replace Magnus Carlsen, and AI won't replace you

https://coding-with-ai.dev/posts/use-ai-like-magnus-carlsen/
11•codeclimber•1h ago•20 comments

Cyborgs vs. rooms, two visions for the future of computing

https://interconnected.org/home/2025/10/13/dichotomy
3•surprisetalk•3d ago•0 comments

Patina: a Rust implementation of UEFI firmware

https://github.com/OpenDevicePartnership/patina
95•hasheddan•1w ago•15 comments

French ex-president Sarkozy begins jail sentence

https://www.bbc.com/news/articles/cvgkm2j0xelo
305•begueradj•12h ago•392 comments

Farming Hard Drives (2012)

https://www.backblaze.com/blog/backblaze_drive_farming/
20•floriangosse•6d ago•11 comments

Go subtleties

https://harrisoncramer.me/15-go-sublteties-you-may-not-already-know/
176•darccio•1w ago•131 comments

Count-Min Sketches in JS – Frequencies, but without the data

https://www.instantdb.com/essays/count_min_sketch
6•stopachka•1h ago•1 comments

Evaluating the Infinity Cache in AMD Strix Halo

https://chipsandcheese.com/p/evaluating-the-infinity-cache-in
130•zdw•13h ago•53 comments

Power over Ethernet (PoE) basics and beyond

https://www.edn.com/poe-basics-and-beyond-what-every-engineer-should-know/
226•voxadam•6d ago•183 comments

Show HN: Cadence – A Guitar Theory App

https://cadenceguitar.com/
149•apizon•1w ago•47 comments

Knocker, a knock based access control system for your homelab

https://github.com/FarisZR/knocker
58•xlmnxp•9h ago•101 comments

Willow quantum chip demonstrates verifiable quantum advantage on hardware

https://blog.google/technology/research/quantum-echoes-willow-verifiable-quantum-advantage/
211•AbhishekParmar•2h ago

Comments

Bootvis•2h ago
This seems like an actually useful computation to do, unlike earlier results. Is that a reasonable reading of this article?
portaouflop•12m ago
No it’s still completely useless for the real world. Also not actually verifiable
einsteinx2•2h ago
> demonstrates the first-ever algorithm to achieve verifiable quantum advantage on hardware.

Am I crazy or have I heard this same announcement from Google and others like 5 times at this point?

ortusdux•2h ago
I'd classify this one as different as it accompanies a publication in Nature - https://www.nature.com/articles/s41586-025-09526-6
joshuaissac•2h ago
It's the third one I am seeing from Google specifically.
auxiliarymoose•1h ago
My understanding is that this one is "verifiable", which means you get a reproducible result (i.e. a consistent result comes out of a computation that would take much longer to do classically).

Non-verifiable computations include things like pulling from a hard-to-compute probability distribution (i.e. random number generator) where it is faster, but the result is inherently not the same each time.

corranh•2h ago
Main caveat is that it’s verifiable (by them) but repeatable by others (in principle).
portaouflop•11m ago
So actually it’s neither verifiable nor repeatable under any real-world definition of the words.
CGMthrowaway•2h ago
Related papers

The idea: Quantum Computation of Molecular Structure Using Data from Challenging-To-Classically-Simulate Nuclear Magnetic Resonance Experiments https://journals.aps.org/prxquantum/abstract/10.1103/PRXQuan...

Verifying the result by another quantum computer (it hasn't been yet): Observation of constructive interference at the edge of quantum ergodicity https://www.nature.com/articles/s41586-025-09526-6

markx2•2h ago
> In physics, a quantum (pl.: quanta) is the minimum amount of any physical entity (physical property) involved in an interaction.

Not a big leap then.

chuckadams•50m ago
I see what you did there. Shame a lot of other people didn't.
Imnimo•2h ago
As with any quantum computing news, I will wait for Scott Aaronson to tell me what to think about this.
lisper•2h ago
Why wait? Just go read the paper:

https://www.nature.com/articles/s41586-025-09526-6

In the last sentence of the abstract you will find:

"These results ... indicate a viable path to practical quantum advantage."

And in the conclusions:

"Although the random circuits used in the dynamic learning demonstration remain a toy model for Hamiltonians that are of practical relevance, the scheme is readily applicable to real physical systems."

So the press release is a little over-hyped. But this is real progress nonetheless (assuming the results actually hold up).

[UPDATE] It should be noted that this is still a very long way away from cracking RSA. That requires quantum error correction, which this work doesn't address at all. This work is in a completely different regime of quantum computing, looking for practical applications that use a quantum computer to simulate a physical quantum system faster than a classical computer can. The hardware improvements that produced progress in this area might be applicable to QEC some day, this is not direct progress towards implementing Shor's algorithm at all. So your crypto is still safe for the time being.

ransom1538•2h ago
SO... BTC goes to zero?
deliriumchn•2h ago
no, not really. PQC has already been under discussion in pretty much every relevant crypto context for a couple of years, and there are multiple PQC algorithms ready to protect important data in banking etc. as well
cyberpunk•1h ago
I don’t really understand the threat to banking. Let’s say you crack the encryption key used in my bank between a java payment processing system and a database server. You can’t just inject transactions or something. Is the threat that internal network traffic could be read? Transactions all go to clearing houses anyway. Is it to protect browser->webapp style banking? those all use ec by now anyway, and even if they don’t how do you mitm this traffic?

Where is the exact threat?

conradev•1h ago
The big threat is passively breaking TLS, so it’s browser traffic. Or, any internet traffic?
cyberpunk•1h ago
Okay, but breaking that TLS (device->bank) would allow you to intercept the session keys and then decrypt the conversation. Alright, so now you can read I logged in and booked a transaction to my landlord or whatever. What else can you do? OTP/2FA code prevents you from re-using my credentials. Has it been demonstrated at all that someone who intercepts a session key is able to somehow inject into a conversation? It seems highly unlikely to me with TCP over the internet.

So we are all in a collective flap that someone can see my bank transactions? These are pretty much public knowledge to governments/central banks/clearing houses anyway -- doesn't seem like all that big a deal to me.

(I work on payment processing systems for a large bank)

bawolff•1h ago
> Has it been demonstrated at all that someone who intercepts a session key is able to somehow inject into a conversation? It seems highly unlikely to me with TCP over the internet.

If you can read the TLS session in general, you can capture the TLS session ticket and then use that to make a subsequent connection. This is easier, as you don't have to inject packets live or make inconvenient packets disappear.

cyberpunk•52m ago
It seems like detecting a re-use like this should be reasonably easy, it would not look like normal traffic and we could flag this to our surveillance systems for additional checks on these transactions. In a post quantum world, this seems like something that would be everywhere anyway (and presumably, we would be using some other algo by then too).

Somehow, I'm not all that scared. Perhaps I'm naive.. :}

bawolff•1h ago
> those all use ec by now anyway

As far as I am aware, elliptic curve cryptography is also vulnerable to quantum attacks.

The threat is generally both passive eavesdropping to decrypt later and also active MITM attacks. Both of course require the attacker to be in a position to eavesdrop.

> Let’s say you crack the encryption key used in my bank between a java payment processing system and a database server.

Well if you are sitting in the right place on the network then you can.

> how do you mitm this traffic?

Depends on the scenario. If you are government or ISP then its easy. Otherwise it might be difficult. Typical real life scenarios are when the victim is using wifi and the attacker is in the physical vicinity.

Like all things crypto, it always depends on context: what information are you trying to protect, and who are you trying to protect it from?

All that said, people are already experimenting with PQC so it might mostly be moot by the time a quantum computer comes around. On the other hand people are still using md5 so legacy will bite.

cyberpunk•1h ago
> Well if you are sitting in the right place on the network then you can.

Not really. This would be caught, if not instantly then when a batch goes for clearing or reconciliation, and an investigation would be started immediately.

There are safeguards against this kind of thing that can't really be defeated by breaking some crypto. We have to protect against malicious employees etc. as well.

One cannot simply insert bank transactions like this; the flows involved are extremely complicated.

chuckadams•1h ago
Flooding the system with forged messages that overwhelm the clearinghouse having to verify them sounds like a good way to bring down a banking system.
cyberpunk•48m ago
Sure, if a bank gets compromised you could in theory DOS a clearing house, but I'd be completely amazed if it succeeded. Those kind of anomalous spikes would be detected quickly. Not even imagining that each bank probably has dedicated instances inside each clearing house.

These are fairly robust systems. You'd likely have a much better impact DoSing the banks themselves.

chuckadams•46m ago
Yah, I suspect the banks pay a handsome sum to smarter people than you and me, and they've gamed this out already.
cyberpunk•43m ago
I build such systems ;)
pclmulqdq•2h ago
No, we're still not much closer to that event.
LarsDu88•2h ago
If quantum computers crack digital cryptography, traditional bank accounts go to zero too, because regular ol' databases also use cryptographic techniques for communication.
wcoenen•1h ago
If all else fails, banks can generate terabytes of random one-time pad bytes, and then physically transport those on tape to other banks to set up provably secure communication channels that still go over the internet.

It would be a pain to manage but it would be safe from quantum computing.

SAI_Peregrinus•1h ago
They could also use pre-shared keys with symmetric cryptography. AES-256-GCM is secure against quantum attack, no need to bother with one-time pads.
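The one-time-pad scheme described in the two comments above boils down to XOR with a truly random, never-reused pad that was pre-shared out of band. A toy sketch for illustration only (not production crypto; the message and pad length are made up):

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # One-time pad: XOR each byte with the corresponding pad byte.
    # The pad must be truly random, at least as long as the message,
    # and never reused -- otherwise all security guarantees collapse.
    assert len(pad) >= len(data)
    return bytes(b ^ k for b, k in zip(data, pad))

# Encryption and decryption are the same XOR operation.
pad = secrets.token_bytes(64)   # pre-shared physically, e.g. on tape
msg = b"settlement batch: 100 EUR"
ciphertext = otp_xor(msg, pad)
assert otp_xor(ciphertext, pad) == msg
```

This is information-theoretically secure regardless of quantum computers, which is why the pad-on-tape idea works in principle; the pre-shared symmetric key approach (AES-256-GCM) trades that absolute guarantee for vastly easier key management.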
OsrsNeedsf2P•1h ago
Let's say I give you a function you can call to crack any RSA key. How are you hacking banks?
bilsbie•1h ago
I don’t see why bitcoin wouldn’t update its software in such a case. The majority of minors just need to agree. But why wouldn’t they if the alternative is going to zero?
jonathanlydall•1h ago
Sir Alexander Dane: MINERS, not MINORS.
FergusArgyll•1h ago
That actually confused me. I thought he meant "the majority of the minority", while I was pretty sure it's just a simple majority
jacquesm•1h ago
"Ahhhh... now you tell me" (Formerly Prince Andrew, at some point).
andrewstuart2•1h ago
I'll tell you right now, no way my kids would agree until they're at least adults. They don't even know what asymmetric cryptography is.
LPisGood•1h ago
I’m confused, are your kids major Bitcoin miners?
andrewstuart2•1h ago
Not major miners, but minor miners (if you count Minecraft).
jdiff•1h ago
GGP used the term "minors," GP is running with the typo.
andrewla•1h ago
How could updating the software possibly make a difference here? If the encryption is cracked, then who is to say who owns which Bitcoin? As soon as I try to transfer any coin that I own, I expose my public key, your "Quantum Computer" cracks it, and you offer a competing transaction with a higher fee to send the Bitcoin to your slush fund.

No amount of software fixes can update this. In theory once an attack becomes feasible on the horizon they could update to post-quantum encryption and offer the ability to transfer from old-style addresses to new-style addresses, but this would be a herculean effort for everyone involved and would require all holders (not miners) to actively update their wallets. Basically infeasible.

Fortunately this will never actually happen. It's way more likely that ECDSA is broken by mundane means (better stochastic approaches most likely) than quantum computing being a factor.

iwontberude•1h ago
As you alluded to, network can have two parallel chains where wallets can be upgraded by users asynchronously before PQC is “needed” (a long way away still) which will leave some wallets vulnerable and others safe. It’s not that herculean as most wallets (not most BTC) are in exchanges. The whales will be sufficiently motivated to switch and everyone else it will happen in the background.

A nice benefit is that it solves the problem with Satoshi’s (of course not a real person or owner) wallet: Satoshi’s wallet becomes the de facto quantum advantage prize. That’s a lot of scratch for a research lab.

jwpapi•1h ago
Not even needed. You can just copy the network state at a specific moment in time and encrypt it with a new algorithm that will be used from then on
strbean•53m ago
The problem is that the owner needs to claim their wallet and migrate it to the new encryption. Just freezing the state at a specific moment doesn't help; to claim the wallet in the new system I just need the private key for the old wallet (as that's the sole way to prove ownership). In our hypothetical post-quantum scenario, anyone with a quantum computer can get the private key and migrate the wallet, becoming the de-facto new owner.

I think this is all overhyped though. It seems likely we will have plenty of warning to migrate prior to achieving big enough quantum computers to steal wallets. Per wikipedia:

> The latest quantum resource estimates for breaking a curve with a 256-bit modulus (128-bit security level) are 2330 qubits and 126 billion Toffoli gates.

IIRC this is speculated to be the reason ECDSA was selected for Bitcoin in the first place.

jjmarr•1h ago
> this would be a herculean effort for everyone involved and would require all holders (not miners) to actively update their wallets. Basically infeasible.

Any rational economic actor would participate in a post-quantum hard fork because the alternative is losing all their money.

If this was a company with a $2 trillion market cap there'd be no question they'd move heaven-and-earth to prevent the stock from going to zero.

Y2K only cost $500 billion[1] adjusted for inflation and that required updating essentially every computer on Earth.

[1]https://en.wikipedia.org/wiki/Year_2000_problem#Cost

orblivion•1h ago
Firstly I'd want to see them hash the whole blockchain (not just the last block) with the post-quantum algo to make sure history is intact.

But as far as moving balances - it's up to the owners. It would start with anybody holding a balance high enough to make it worth the amount of money it would take to crack a single key. That cracking price will go down, and the value of BTC may go up. People can move over time as they see fit.

bloppe•12m ago
> would require all holders (not miners) to actively update their wallets. Basically infeasible.

It doesn't require all holders to update their wallets. Some people would fail to do so and lose their money. That doesn't mean the rest of the network can't do anything to save themselves. Most people use hosted wallets like Coinbase these days anyway, and Coinbase would certainly be on top of things.

Also, you don't need to break ECDSA to break BTC. You could also do it by breaking mining. The block header has a 32-bit nonce at the very end. My brain is too smooth to know how realistic this actually is, but perhaps someone could use a QC to perform the final step of SHA-256 on all 2^32 possible values of the nonce at once, giving them an insurmountable advantage in mining. If only a single party has that advantage, it breaks the Nash equilibrium.

But if multiple parties have that advantage, I suppose BTC could survive until someone breaks ECDSA. All those mining ASICs would become worthless, though.

jacquesm•1h ago
> The majority of minors just need to agree.

That's an uncomfortably apt typo.

udev4096•48m ago
The problem is all the lost BTC wallets, which are speculated to hold a lot of coins and are also one of the biggest reasons for the current BTC price, and which obviously cannot be upgraded to PQ. There is currently a radical proposal to essentially make all those lost wallets worthless unless they migrate [1]

[1] - https://github.com/jlopp/bips/blob/quantum_migration/bip-pos...

shwaj•16m ago
I’m not sure there’s a better alternative.
logtrees•20m ago
No, I don't think so. By the time quantum supremacy is really achieved for a "Q-Day" that could affect them or things like them, the existing blockchains which have already been getting hardened will have gotten even harder. Quantum computing could be used to further harden them, as well, rather than compromise them. Supposing that Q-Day brought any temporary hurdles to Bitcoin or Ethereum or related blockchains, well...due to their underlying nature resulting in justified Permanence, we would be able to simply reconstitute and redeploy them for their functionalities because they've already been sufficiently imbued with value and institutional interest as well. These are quantum-resistant hardenings.

So I do not think these tools or economic substrate layers are going anywhere. They are very valuable for the particular kinds of applications that can be built with them and also as additional productive layers to the credit and liquidity markets nationally, internationally, and also globally/universally.

So there is a lot of institutional interest, including governance interest, in using them to build better systems. Bitcoin on its own would be reduced in such justification but because of Ethereum's function as an engine which can drive utility, the two together are a formidable and quantum-resistant platform that can scale into the hundreds of trillions of dollars and in Ethereum's case...certainly beyond $1Q in time.

I'm very bullish on the underlying technology, even beyond tokenomics for any particular project. The underlying technologies are powerful protocols that facilitate the development and deployment of non-zero-sum systems at scale. With Q-Day not expected until the end of the 2020s or beginning of the 2030s, that is a considerable amount of time (in the tech world) to lay the groundwork for further hardening and discussions around this.

tux3•2h ago
Quantum advantage papers have a history of overpromising, this one looks interesting, but it would still seem wise to wait for a second opinion.
AndrewStephens•1h ago
> "These results ... indicate a viable path to practical quantum advantage"

I'll add this to my list of useful phrases.

Q: Hey AndrewStephens, you promised that task would be completed two days ago. Can you finish it today?

A: Results indicate a viable path to success.

iwontberude•1h ago
Charlie Brown, Lucy, football
keeda•1h ago
An MBA, an engineer and a quantum computing physicist check into a hotel. Middle of the night, a small fire starts up on their floor.

The MBA wakes up, sees the fire, sees a fire extinguisher in the corner of the room, empties the fire extinguisher to put out the fire, then goes back to sleep.

The engineer wakes up, sees the fire, sees the fire extinguisher, estimates the extent of the fire, determines the exact amount of foam required to put it out including a reasonable tolerance, and dispenses exactly that amount to put out the fire, and then satisified that there is enough left in case of another fire, goes back to sleep.

The quantum computing physicist wakes up, sees the fire, observes the fire extinguisher, determines that there is a viable path to practical fire extinguishment, and goes back to sleep.

toasted-subs•1h ago
A consistent theme of quantum computing is setting up the problem so the hardware performs nicely, to get a good news article and more funding.

I'm pretty reluctant to make any negative comments about these kinds of posts because it will prevent actually achieving the desired outcome.

bawolff•1h ago
Quantum computing hardware is still in its infancy.

The problem is not with these papers (or at least not with ones like this one) but with how they are reported. If quantum computing is going to succeed it needs to take the baby steps before it can take the big steps, and at the current rate the big leaps are probably decades away. There is nothing wrong with that; it's a hard problem and it's going to take time. But then the press comes in and reports that quantum computing is going to run a marathon tomorrow, which is obviously not true and confuses everyone.

toasted-subs•28m ago
Therein lies the problem. "Hey, can I have a few billion dollars for my baby?" doesn't really work out too well for investors or industry.

The current situation with "AI" took off because people learned their lessons from the last round of funding cuts, the "AI winter".

wnevets•2h ago
I'm waiting for Peter Gutmann[1] to tell me what to think about this.

[1] https://eprint.iacr.org/2025/1237

getnormality•2h ago
I will wait for a HN commenter to tell me what Scott Aaronson thinks about this.
thedrexster•1h ago
this is my approach, as well, lol
guywithahat•1h ago
As with most news, I'll be waiting for Scott Adams to tell me what to think about this
amiga386•1h ago
The text adventure guy, or the cartoonist who went batshit?
supernetworks_•1h ago
https://arxiv.org/abs/2509.07255

This paper on verifiable advantage is a lot more compelling. With Scott Aaronson and Quantinuum among other great researchers

nashashmi•2h ago
Before the mega monopolies took over, corps used to partner with universities to conduct this kind of research. Now we have bloated salaries, rich corporations, and expensive research while having under experienced graduates. These costs will get forwarded to the consumer. The future won’t have a lot of things that we have come to expect.
Rover222•2h ago
Nihilism is too trendy right now
steego•1h ago
Nihilism is one response to disillusionment.

Another response is to come to terms with a possibly meaningless and Sisyphean reality and to keep pushing the boulder (that you care about) up the hill anyway.

I’m glad the poster is concerned and/or disillusioned about the hype, hyperbole and deception associated with this type of research.

It suggests he still cares.

reaperducer•2h ago
> Before the mega monopolies took over, corps used to partner with universities to conduct this kind of research. Now we have bloated salaries, rich corporations, and expensive research while having under experienced graduates. These costs will get forwarded to the consumer. The future won’t have a lot of things that we have come to expect.

I don't disagree, but these days I'm happy to see any advanced research at all.

Granted, too often I see the world through HN-colored glasses, but it seems like so many technological achievements are variations on getting people addicted to something in order to show them ads.

Did Bellcore or Xerox PARC do a lot of university partnerships? I was into other things in those days.

Rebelgecko•1h ago
Funnily enough I remember reading a comment a week or two ago decrying the death of corporate research labs like Bell Labs and Xerox PARC
ibejoeb•1h ago
I don't think it's accurate to attribute some kind of altruism to these research universities. Have a look at some of those pay packages or the literal hedge funds that they operate. And they're mostly exempt from taxation.
auxiliarymoose•1h ago
From the article:

> in partnership with The University of California, Berkeley, we ran the Quantum Echoes algorithm on our Willow chip...

And the author affiliations in the Nature paper include:

Princeton University; UC Berkeley; University of Massachusetts, Amherst; Caltech; Harvard; UC Santa Barbara; University of Connecticut; MIT; UC Riverside; Dartmouth College; Max Planck Institute.

This is very much in partnership with universities and they clearly state that too.

nashashmi•1h ago
Thanks. I could not find any mention of it. This is good.
onlyrealcuzzo•58m ago
> Before the mega monopolies took over, corps used to partner with universities to conduct this kind of research.

Ah, yes, it's Google's fault, not the destruction of the Department of Education and the weaponization of funding.

josefritzishere•2h ago
Quantum is the new AI. It's just the new hype cycle of doom.
jezzamon•2h ago
The barrier to entry seems a little higher than AI so it's at least a little limited in scope
fooker•2h ago
Can someone explain if this is still the RCS problem or a similar one?

My impression was that every problem a quantum computer solves in practice right now is basically reducible from 'simulate a quantum computer'

seanhunter•2h ago
This is not the RCS problem or indeed anything from number theory.

The announcement is about an algorithm which they are calling Quantum Echoes, where you set up the experiment, perturb one of the qbits and observe the “echoes” through the rest of the system.

They use it to replicate a classical chemistry experiment done using nuclear magnetic resonance (NMR). They say they are able to reproduce the results of that conventional experiment and gather additional data that is unavailable via conventional means.

qnleigh•1h ago
This is quite different from their previous random circuit sampling (RCS) experiments that have made headlines a few times in the past. The key difference from an applied standpoint is that the output of RCS is a random bitstring which is different every time you run the algorithm. These bitstrings are not reproducible, and also not particularly interesting, except for the fact that only a quantum computer can generate them efficiently.

The new experiment generates the same result every time you run it (after a small amount of averaging). It also involves running a much more structured circuit (as opposed to a random circuit), so all-in-all, the result is much more 'under control.'

As a cherry on top, the output has some connection to molecular spectroscopy. It still isn't that useful at this scale, but it is much more like the kind of thing you would hope to use a quantum computer for someday (and certainly more useful than generating random bitstrings).

elAhmo•2h ago
Definitely not a quantum expert, but I have a feeling that news like this has been appearing for more than a decade, without anything usable.
terminalbraid•1h ago
It's what happens when companies are driven by profit rather than by making accurate scientific statements, on which reputation is built and further research funding is predicated.

Hyperbolic claims like this are for shareholders who aren't qualified to judge for themselves, because they're interested in future money and not actual understanding. This is what happens when you delegate science to corporations.

JohnHaugeland•2h ago
the big problem with quantum advantage is that quantum computing is inherently error-prone and stochastic, but then they compare to classical methods that are exact

let a classical computer use an error prone stochastic method and it still blows the doors off of qc

this is a false comparison

jasonthorsness•2h ago
They get the same result when they run it a second time and it matches the classical result; this is their key achievement (in addition to the speed).
krastanov•2h ago
Stochasticity (randomness) is pervasively used in classical algorithms that one compares to. That is nothing new and has always been part of comparisons.

"Error prone" hardware is not "a stochastic resource". Error prone hardware does not provide any value to computation.

ashleyn•2h ago
We're all asking it: any impact on AES?
jonathanstrange•2h ago
The rule of thumb is that a working quantum computer that can run Grover's algorithm reduces the security of a symmetric cipher to half of its key size. That is, AES-128 should be considered to have a 64-bit key size, which is why it's not considered "quantum-safe."

Edit: An effective key space of 2^64 is not secure by modern-day standards. It was secure in the era of DES.

adgjlsfhk1•1h ago
AES-128 is quantum safe (more or less). 64-bit security in the classical domain isn't safe because you can parallelize across 2^20 computers trivially. Grover gives you 2^64 AES operations on a quantum computer (probably ~2^70 gates or so before error correction, or ~2^90 after error correction) that can't be parallelized efficiently. AES-128 is secure for the next century (but you might as well switch to AES-256, because why not)
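The halving rule of thumb discussed in this subthread is just square-root arithmetic over the key space; a minimal sketch:

```python
def brute_force_queries(key_bits: int, quantum: bool = False) -> int:
    # Classical brute force tries up to 2^k keys; Grover's algorithm
    # finds the key in ~sqrt(2^k) = 2^(k/2) quantum queries, which is
    # why it halves the effective security level of a symmetric cipher.
    return 2 ** (key_bits // 2) if quantum else 2 ** key_bits

assert brute_force_queries(128, quantum=True) == 2 ** 64    # AES-128 -> ~64-bit
assert brute_force_queries(256, quantum=True) == 2 ** 128   # AES-256 -> ~128-bit
```

The caveat raised above is that these 2^64 Grover queries must run essentially serially on one machine, unlike classical brute force, which parallelizes trivially.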
seanhunter•2h ago
This is a chemistry experiment, so no.
dlandis•2h ago
> Quantum computing-enhanced NMR could become a powerful tool in drug discovery, helping determine how potential medicines bind to their targets, or in materials science for characterizing the molecular structure of new materials like polymers, battery components or even the materials that comprise our quantum bits (qubits)

There is a section in the article about future real world application, but I feel like these articles about quantum "breakthroughs" are almost always deliberately packed with abstruse language. As a result I have no sense about whether these suggested real world applications are a few years away or 50+ years away. Does anyone?

kantbtrue•2h ago
“13,000× faster” sounds huge, but I wonder what it’s being compared to. Quantum speedups are always tricky to measure.
jonathanstrange•2h ago
The article states: “...13,000 times faster on Willow than the best classical algorithm on one of the world’s fastest supercomputers...”

I agree it's not very precise without knowing which of the world's fastest supercomputers they're talking about, but there was no need to leave out this tidbit.

jasonthorsness•2h ago
The paper talks only about the Frontier supercomputer which is #2 on Top500. But I think it was an analysis rather than them actually running it.
jonathanstrange•2h ago
I was being sarcastic: 13,000 times faster is 4 orders of magnitude, so it doesn't matter which supercomputer it is compared to.
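For what it's worth, the arithmetic behind the remark — the machines near the top of the Top500 list differ from one another by far less than this:

```python
import math

# 13,000x is ~4 orders of magnitude, so the exact choice of
# supercomputer barely moves the headline claim.
print(round(math.log10(13_000), 2))  # 4.11
```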
jasonthorsness•2h ago
"surpassing even the fastest classical supercomputers (13,000x faster)"

"Quantum verifiability means the result can be repeated on our quantum computer — or any other of the same caliber — to get the same answer, confirming the result."

"The results on our quantum computer matched those of traditional NMR, and revealed information not usually available from NMR, which is a crucial validation of our approach."

It certainly seems like this time, there finally is a real advantage?

seanhunter•1h ago
I’ve only skimmed the paper but it seems like the “information not usually available” from NMR is the Jacobian and Hessian of the Hamiltonian of the system.

So basically you’re able to go directly from running the quantum experiment to being able to simulate the dynamics of the underlying system, because the Jacobian and Hessian are the first and second partial derivatives of the system with respect to all of its parameters in matrix form.
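A toy version of that point: for a scalar energy function E(params), the gradient (first derivatives) and Hessian (second derivatives) are exactly the objects you need to simulate local dynamics. The energy function below is invented for illustration; the paper extracts analogous quantities from the experiment rather than from a formula:

```python
import math

def energy(x, y):
    # hypothetical 2-parameter "Hamiltonian expectation value", made up
    return x**2 * y + math.sin(y)

def gradient(f, x, y, h=1e-5):
    # central finite differences: first partial derivatives w.r.t. x and y
    return [(f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h)]

gx, gy = gradient(energy, 1.0, 2.0)
# analytic gradient: [2xy, x^2 + cos(y)] = [4.0, ~0.584] at (1, 2)
print(gx, gy)
```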

i-con•2h ago
Notice how they say "quantum advantage", not "supremacy", and "a (big) step toward real-world applications". So actually just another step, as always. And I'm left to doubt whether the classical algorithm used for comparison was properly optimised.
ccppurcell•1h ago
I skimmed the paper so might have missed something, but iiuc there is no algorithm used for comparison. They did not run some well-defined algorithm on benchmark instances; they estimated the cost of simulating the circuits "through tensor network contraction" - I quote here not to scare but because this is where my expertise runs out.
streptomycin•1h ago
FWIW "quantum advantage" and "quantum supremacy" are synonyms, some people just prefer the former because the latter reminds them of "white supremacy" https://en.wikipedia.org/wiki/Quantum_supremacy#Criticism_of...
i-con•1h ago
Good point, I had no idea. I actually thought they just restrained themselves this time :D
TeeMassive•1h ago
The last time I heard similar news from Google, it turned out they were solving a quantum phenomenon using a quantum phenomenon. It seems to be the same pattern here. Not to say it's not progress, but it kind of feels overhyped.
refulgentis•1h ago
Idk. I get this is the median take across many comments, and I don't mean to be disagreeable with the crowd. But I don't know why using quantum phenomena is a sign something's off. It's a quantum computer! But I know something is off with this take if it didn't strike you that way.
fwip•20m ago
To me, it matters because it's a sign that it might not be particularly transferable as a method of computation.

A wind tunnel is a great tool for solving aerodynamics and fluid flow problems, more efficiently than a typical computer. But we don't call it a wind-computer, because it's not a useful tool outside of that narrow domain.

The promise of quantum computing is that it can solve useful problems outside the quantum realm - like breaking traditional encryption.

tiahura•1h ago
Can it factor 21?
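For context on what "factoring 21" means in the Shor sense: the only quantum step is finding the multiplicative order r of a base a mod N; the rest is classical gcd arithmetic. A sketch with the order-finding done by classical brute force:

```python
from math import gcd

def order(a, n):
    # multiplicative order of a mod n -- the step Shor's quantum
    # period-finding would accelerate; brute-forced here
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    if gcd(a, n) != 1:
        return gcd(a, n)       # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2:
        return None            # odd order: retry with another base a
    y = pow(a, r // 2, n)      # a^(r/2) mod n
    f = gcd(y - 1, n)
    return f if 1 < f < n else None

print(shor_classical(21, 2))  # order of 2 mod 21 is 6; gcd(2^3 - 1, 21) = 7
```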
FabHK•1h ago
So, "verifiable" here means "we ran it twice and got the same result"?

> Quantum verifiability means the result can be repeated on our quantum computer — or any other of the same caliber — to get the same answer, confirming the result.

refulgentis•51m ago
Sounds like N, not 2
EvgeniyZh•35m ago
It is not very clear from the text, and from what I can tell there is no "verifiability" concept in the papers they link.

I think what they are trying to do is contrast this with previous quantum advantage experiments in the following sense.

The previous experiments involve sampling from some distribution which is believed to be classically hard. However, it is a non-trivial question whether you succeed or fail in this task: even having a perfect sampler from the same distribution won't allow you to easily verify the samples.

On the other hand, these experiments involve measuring some observable, i.e., the output is just a number, and you can compare it to the value obtained in a different way (on a different or the same computer, or even some analog experimental system).

Note that these observables are expectation values of the samples, but in the previous experiments, since the circuits are random, all the expectation values are very close to zero and it is impossible to actually resolve them from the experiment.

Disclaimer: this is my speculation about what they mean because they didn't explain it anywhere from what I can see.

nine_k•12m ago
It means that they transcended the "works on my machine" stage, and can reliably run a quantum algorithm on more than one different quantum computer.
bossyTeacher•52m ago
Afaik we are a decade or two away from quantum supremacy. All the AI monks forget that if AI is the future, quantum supremacy is the present. And whoever controls the present decides the future.

Remember, it is not about general-purpose quantum computing; it's about implementing the quantum computation of Shor's algorithm.

cwmma•36m ago
But much like AI hype, quantum hype is also way overplayed. Yeah, modern asymmetric encryption will be less secure, but even after you have quantum computers that can run Shor's algorithm, it might be a while before quantum computers are affordable enough to be an actual threat (i.e. it's not cheaper to just buy a zero-day for the target's phone or something).

But since we already have post-quantum algorithms, the end state of cheap quantum computers is just a new equilibrium where people use the new algorithms and they can't be directly cracked, and it's basically the same as now, except maybe you can decrypt historical traffic — but who knows if that's worth it.

Lapsa•34m ago
but can it run Doom?
DrNosferatu•9m ago
I would like to hear from the classical-computation people whether they validate these results and claims.

Many times in the past, quantum supremacy was claimed, and then other groups showed they could do better with optimized classical methods.