
A cryptography engineer's perspective on quantum computing timelines

https://words.filippo.io/crqc-timeline/
104•thadt•2h ago

Comments

pdhborges•1h ago
What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?
FiloSottile•1h ago
If you want something book-shaped, the 2nd edition of Serious Cryptography is updated to when the NIST standards were near-final drafts, and has a nice chapter on post-quantum cryptography.

If you want something that includes details on how they were deployed, I'm afraid that's all very recent and I don't have good references.

vonneumannstan•1h ago
This seems like something uniquely suited to the startup ecosystem, i.e. offering PQ Encryption Migration as a Service. PQ algorithms exist, and now there's a large lift required to get them into the tech, with substantial possible value.
hlieberman•1h ago
… really? This is simultaneously so far down in the plumbing and so resistant to measuring the impact of that I can’t imagine anyone building a company off of this who’s not already deep in the weeds (lookin’ at you, WolfSSL).

The idea that a startup would be competitive in the VC “the only thing that matters are the feels” environment seems crazy to me.

OhMeadhbh•56m ago
Yeah... I spent the 90s working for RSADSI and Certicom implementing algorithms. Crypto is a vitamin, not an aspirin. Hardly anyone is capable of properly assessing risk in general, much less the technical world of information risk management. Telling someone they should pay you money to reduce the impact of something that may or may not happen in the future is not a sales win.
tux3•1h ago
This is a good take, there's really not much to argue about.

>[...] the availability of HPKE hybrid recipients, which blocked on the CFRG, which took almost two years to select a stable label string for X-Wing (January 2024) with ML-KEM (August 2024), despite making precisely no changes to the designs. The IETF should have an internal post-mortem on this, but I doubt we’ll see one

My kingdom for a standards body that discusses and resolves process issues.

OhMeadhbh•45m ago
I missed you at the most recent CRFG meeting.
OhMeadhbh•1h ago
In rebuttal, Peter Gutmann seems to think that progress toward quantum computing devices capable of breaking commonly used public-key cryptosystems is not especially fast: https://eprint.iacr.org/2025/1237
schmichael•56m ago
That's not a rebuttal. The post references the paper and a rebuttal to it from an expert in the field.
OhMeadhbh•50m ago
Damn. It's like I insulted Vault.

Also, I went over Filippo's post again and still can't see where it references the Gutmann / Neuhaus paper. Are we talking about the same post?

tkhattra•7m ago
From Filippo's post: "Sure, papers about an abacus and a dog are funny and can make you look smart and contrarian on forums."
xvector•6m ago
From the abstract:

> This paper presents implementations that match and, where possible, exceed current quantum factorisation records using a VIC-20 8-bit home computer from 1981, an abacus, and a dog.

From the link:

> Sure, papers about an abacus and a dog are funny and can make you look smart and contrarian on forums. But that’s not the job, and those arguments betray a lack of expertise[1]. As Scott Aaronson said[2]:

> > Once you understand quantum fault-tolerance, asking “so when are you going to factor 35 with Shor’s algorithm?” becomes sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

[1]: https://bas.westerbaan.name/notes/2026/04/02/factoring.html

[2]: https://scottaaronson.blog/?p=9665#comment-2029013

OsrsNeedsf2P•1h ago
Why do we "need to ship"? 1,000 qubit quantum computers are still decades away at this point
OhMeadhbh•34m ago
So... In 2013 I was working for Mozilla, adding TLS 1.1 and 1.2 support to Firefox. It turns out that some of the extensions common in 1.1 in some instances caused PDUs to grow beyond 16k (or maybe it was 32k, can't remember). This caused middleboxes to barf. Sure, they shouldn't barf, but they did. We discovered the problem (or rather, one of our users discovered the problem) by increasing the key size on server and client certs to push PDU sizes over the limit.

At the very least, you want to start using hybrid legacy / pqc algorithms so engineers at Cisco will know not to limit key sizes in PDUs to 128 bytes.

ekr____•19m ago
A few points here: There is already very wide use of PQ algorithms in the Web context [0], which is the most problematic one because clients need to be able to connect to any site and there's no real coordination between sites and clients. So we're exercising the middleboxes already.

The incident you're thinking of doesn't sound familiar. None of the extensions in 1.1 really were that big, though of course certs can get that big if you work hard enough. Are you perhaps thinking instead of the 256-511 byte ClientHello issue addressed in [1]?

[0] https://blog.cloudflare.com/pq-2025/ [1] https://datatracker.ietf.org/doc/html/rfc7685

Sparkyte•52m ago
There is always a price to encryption. The cost goes up the more you have to cater to different and older encryptions while supporting the latest.
munrocket•51m ago
Yes, this is why I invested in QRL crypto. With the latest updates and no T1 exchange, it looks like a good opportunity to grow.
adrian_b•43m ago
It should be noted that if there really is not much time left before a usable quantum computer becomes available, the priority is the deployment of FIPS 203 (ML-KEM) for establishing the secret session keys used in protocols like TLS or SSH.

ML-KEM is intended to replace the traditional and elliptic-curve variants of the Diffie-Hellman algorithm for creating a shared secret value.

When FIPS 203, i.e. ML-KEM, is not used, adversaries may record data transferred over the Internet and might become able to decrypt it after some years.

On the other hand, there is much less urgency to replace the certificates and digital signature methods used today, because in most cases it would not matter if someone became able to forge them in the future: they cannot go back in time to use that for authentication.

The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.

OpenSSH, OpenSSL and many other cryptographic libraries and applications already support FIPS 203 (ML-KEM), so it could be easily deployed, at least for private servers and clients, without also replacing the existing methods used for authentication, e.g. certificates, where using post-quantum signing methods would add a lot of overhead, due to much bigger certificates.
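As a concrete illustration of how small the deployment lift is for the key-exchange half: recent OpenSSH releases (9.9 and later) ship a hybrid X25519+ML-KEM-768 key exchange, and a client can prefer it explicitly. A minimal sketch of the configuration (algorithm names as shipped by OpenSSH; run `ssh -Q kex` to see what your build supports):

```
# ~/.ssh/config — prefer hybrid post-quantum key exchanges when available
Host *
    KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha256,curve25519-sha256
```

Because these are hybrids, the session is no weaker than plain X25519 even if the PQ component later turns out to be flawed.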

FiloSottile•35m ago
That was my position until last year, and pretty much a consensus in the industry.

What changed is that the new timeline might be so tight that (accounting for specification, rollout, and rotation time) the time to switch authentication has also come.

ML-KEM deployment is tangentially touched on in the article because it's both uncontroversial and underway, but:

> This is not the article I wanted to write. I’ve had a pending draft for months now explaining we should ship PQ key exchange now, but take the time we still have to adapt protocols to larger signatures, because they were all designed with the assumption that signatures are cheap. That other article is now wrong, alas: we don’t have the time if we need to be finished by 2029 instead of 2035.

> For key exchange, the migration to ML-KEM is going well enough but: 1. Any non-PQ key exchange should now be considered a potential active compromise, worthy of warning the user like OpenSSH does, because it’s very hard to make sure all secrets transmitted over the connection or encrypted in the file have a shorter shelf life than three years. [...]

Your comment is essentially the premise of the other article.

adrian_b•26m ago
I agree with you that one must prepare for the transition to post-quantum signatures, so that when it becomes necessary the transition can be done immediately.

However, that does not mean the switch should be made as soon as possible, because it would add unnecessary overhead.

This could be done by distributing a set of post-quantum certificates, while continuing to allow the use of the existing certificates. When necessary, the classic certificates could be revoked immediately.

FiloSottile•17m ago
How do you do revocation or software updates securely if your current signature algorithm is compromised?
ekr____•7m ago
As a practical matter, revocation on the Web is handled mostly by centrally distributed revocation lists (CRLsets, CRLite, etc. [0]), so all you really need is:

(1) A PQ-secure way of getting the CRLs to the browser vendors. (2) a PQ-secure update channel.

Neither of these require broad scale deployment.

However, the more serious problem is that if you have a setting where most servers do not have PQ certificates, then disabling the non-PQ certificates means that lots of servers can't do secure connections at all. This obviously causes a lot of breakage and, depending on the actual vulnerability of the non-PQ algorithms, might not be good for security either, especially if people fall back to insecure HTTP.

See: https://educatedguesswork.org/posts/pq-emergency/ and https://www.chromium.org/Home/chromium-security/post-quantum...

[0] The situation is worse for Apple.

layer8•26m ago
> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.

This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.

janalsncm•38m ago
Building out a supercomputer capable of breaking cryptography is exactly the kind of thing I expect governments to be working on now. It is referenced in the article, but the analogy to the Manhattan Project is clear.

Prior to 1940 it was known that clumping enough fissile material together could produce an explosion. There were engineering questions around how to purify uranium and how to actually construct the weapon etc. But the phenomenon was known.

I say this because there’s a meme that governments are cooking up exotic technologies behind closed doors which I personally tend to doubt.

This is an almost perfect analogy to the Manhattan Project, though. We know exactly what could happen if we clumped enough qubits together. There are hard engineering challenges in actually doing so, and governments are pretty good at clumping dollars together when they want to.

bitexploder•11m ago
The Manhattan Project employed a significant percentage of all of America. A project of that scale will likely never happen again.

It was also about far more than the science. It was about industrializing the entire production process and creating industrial capability that simply did not exist before.

amluto•23m ago
I was in this field a while back, and I always found it baffling that anyone ever believed the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication; it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.
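For intuition about why the overhead is modest: the only quantum part of Shor's algorithm is order finding; everything around it is cheap classical arithmetic. A toy sketch in Python, with brute-force order finding standing in for the quantum step (workable only for tiny n):

```python
# Toy sketch of Shor's algorithm: classical except for order finding.
from math import gcd

def order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). The quantum computer replaces
    this brute-force loop; the rest of the algorithm stays classical."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Try to factor n using base a. Returns a factor pair, or None if
    this choice of a fails (pick another a and retry)."""
    g = gcd(a, n)
    if g != 1:                    # lucky: a shares a factor with n
        return g, n // g
    r = order(a, n)
    if r % 2:                     # need an even order
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                # trivial square root, retry
        return None
    p = gcd(y - 1, n)
    return (p, n // p) if 1 < p < n else None
```

For example, factoring 15 with a = 7: the order of 7 mod 15 is 4, 7^2 mod 15 = 4, and gcd(4 - 1, 15) = 3 recovers the factor pair (3, 5).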

Also...

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years, and they inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?

We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.
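For readers unfamiliar with hash-based signatures: the simplest variant, a Lamport one-time signature, is only a few lines and relies on nothing but a hash function. A minimal sketch for illustration (real deployable schemes like XMSS or SLH-DSA add Merkle trees on top for many-time use):

```python
# Lamport one-time signature over SHA-256 — illustrative sketch only.
import hashlib
import os

def H(b):
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg):
    # The 256 bits of the message hash select which secrets to reveal.
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pk[i][b]
               for (i, b), s in zip(enumerate(msg_bits(msg)), sig))
```

Each key pair must sign exactly one message, since signing reveals half the secret key, and keys and signatures run to kilobytes — which is the "bigger keys, so what?" trade-off mentioned above.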

palata•21m ago
What is the consequence for e.g. Yubikeys (or, say, the Android Keystore)? Do I understand correctly that those count as "signature algorithms" and are a little less at risk than "full TEEs" because there is no "store now, decrypt later" for authentication?

E.g. can I use my Yubikey with FIDO2 for SSH together with a PQ encryption, such that I am safe from "store now, decrypt later", but can still use my Yubikey (or Android Keystore, for that matter)?

amluto•15m ago
Your Yubikey itself is doomed.

If you are doing a post-quantum key exchange and only authenticating with the Yubikey, then you are safe from after-the-fact attacks. Well, as long as the PQ key exchange holds up, and I am personally not as optimistic about that as I’d like to be.
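Concretely, the combination the parent asks about works today: OpenSSH supports FIDO2-backed keys (`ed25519-sk`) for authentication, and the key exchange is negotiated independently, so a hybrid PQ kex can still protect the session contents. A sketch, assuming OpenSSH 9.9+ built with security-key support and a FIDO2 token plugged in:

```
# Generate a FIDO2-backed SSH key (touch the token when prompted)
ssh-keygen -t ed25519-sk -f ~/.ssh/id_ed25519_sk

# Connect, preferring the hybrid ML-KEM key exchange
ssh -o KexAlgorithms=mlkem768x25519-sha256 -i ~/.ssh/id_ed25519_sk user@host
```

The token's Ed25519 signature remains classical and only authenticates; confidentiality against store-now-decrypt-later rests entirely on the ML-KEM half of the key exchange.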

bjourne•4m ago
> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.

Given the author's "safety first" stance on PQC, it seems a bit incongruous to continue flying to conferences...