
Start all of your commands with a comma

https://rhodesmill.org/brandon/2009/commands-with-comma/
163•theblazehen•2d ago•47 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
674•klaussilveira•14h ago•202 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
950•xnx•20h ago•552 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
123•matheusalmeida•2d ago•33 comments

Jeffrey Snover: "Welcome to the Room"

https://www.jsnover.com/blog/2026/02/01/welcome-to-the-room/
22•kaonwarb•3d ago•19 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
58•videotopia•4d ago•2 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
232•isitcontent•14h ago•25 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
225•dmpetrov•15h ago•118 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
332•vecti•16h ago•145 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
495•todsacerdoti•22h ago•243 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
383•ostacke•20h ago•95 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
360•aktau•21h ago•182 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
289•eljojo•17h ago•175 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
413•lstoll•21h ago•279 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
32•jesperordrup•4h ago•16 comments

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
20•bikenaga•3d ago•8 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
17•speckx•3d ago•7 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
63•kmm•5d ago•7 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
91•quibono•4d ago•21 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
258•i5heu•17h ago•196 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
32•romes•4d ago•3 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
44•helloplanets•4d ago•42 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
60•gfortaine•12h ago•26 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1070•cdrnsf•1d ago•446 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
36•gmays•9h ago•12 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
150•vmatsiiako•19h ago•70 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
288•surprisetalk•3d ago•43 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
150•SerCe•10h ago•142 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
186•limoce•3d ago•100 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
73•phreda4•14h ago•14 comments

Quantum Computation Lecture Notes (2022)

https://math.mit.edu/~shor/435-LN/
166•ibobev•8mo ago

Comments

revskill•7mo ago
How can a person become so good at researching?
lightbendover•7mo ago
(1) be naturally gifted, (2) avoid falling down wells of distraction, (3) be lucky. Don’t sleep on (3), it’s easy to call it capitalizing on opportunity in hindsight when it was honestly just luck.
TechDebtDevin•7mo ago
Luck, in terms of natural intelligence and focus. But generally people can improve their ability to avoid falling down the "wells of distraction". This year I set time filters for apps on my phone and it's made a stark difference. Even though I can turn off "work mode" to get around it, that little reminder has saved me hundreds of hours, as I usually just put down the phone when I see it.
jasonhong•7mo ago
I'd also add (4) be incredibly curious about lots of things; (5) surround yourself with other smart, curious, and committed people who have a culture of critiquing ideas; and (6) devote a lot of time to deep thinking.
rwoerz•7mo ago
(8) be good in counting things, (h) be consistent in your thinking, (10) have a good memory, (11) be good in counting things, (12) refrain from making silly comments
xeonmc•7mo ago
just get nerd-sniped.
gaze•7mo ago
I don't know how else to say it but you just have to do it. It's like asking how to get good at running. You run.
graycat•7mo ago
> How can a person become so good at researching?

My time in my Ph.D. program and some of the work in my career (getting paid) suggest that I was "good at researching". But I left such research because I wanted to get paid more, and settled on starting a business, owning it, and making it valuable. If some research can help the business, fine, but the real goal is just the money from a successful business.

On (academic) research, one lesson no one ever mentioned to me but that I eventually formulated: Pick a field of research. In that field, a lot about what is expected, respected, intended, valued or not, ..., is not much spoken about and never made clear -- fertile ground for politics. For such questions, the answers you guess or get in one field will likely be quite different in another. Some fields bring to mind the old quip: "Did Haydn write 101 symphonies, or one symphony 101 times?" At times you can also believe that, with high probability, a paper gets read by just two people: the peer reviewer and the author. In that case, the only accomplishment of papers, good or bad, is that they get counted, as in: someone with 50 papers is regarded as better than someone with only 4. Ah, but tough to prove that a paper will never get 1000 readers!

For research, one approach is to study an (assume academic) field, crawl down some narrow alley or rabbit hole, see a question with no answer, consider the broad status of the field, and then, if making progress on the question seems not obviously impossible, give it a try. Within a few days or weeks you should have an answer, a partial answer, or hints that by continuing you might get something.

Another approach is to pick a problem mostly on your own, not by trying to extend published research. You might follow some instance of personal curiosity or something from some other field, e.g., do some math, optimization, statistics, ... research on problems in the environment (why the ups and downs of lobsters in east Canada?), medical testing, the supply chain, some engineering problem, some business problem, etc.

Do note that in the US, after radar, the proximity fuze, submarine acoustics, code breaking, jet engines, and the "bomb", the US military had plenty of both money and problems, and that funded a lot of US research. Now there seems to be a general view: we don't know which research directions will yield powerful results, but since we can't afford to miss out on some big result or fall behind, we will continue to fund research. Non-military research seems less eager for results and to have less money.

Ah, and be good at the politics, e.g., even follow "Always look for the hidden agenda." If working in an organization, beware of "goal subordination", i.e., others working to have you fail.

11101010001100•7mo ago
What, if anything, did you port from the research career to the business?
graycat•7mo ago
I derived some math, new at least to me, based on my pure math background: e.g., measure theory from Rudin, some functional analysis, probability based on measure theory from Neveu, tightness in probability (once used in some statistics for computer science and presented at NASDAQ), some optimization via the Kuhn-Tucker conditions, some stochastic optimal control, etc.
polamolo•7mo ago
I feel like I've seen/done this before. Could I be stuck in a groundhog day?
zara_is_reading•7mo ago
I’ve 70.71% seen it before and 70.71% not seen it before.
rvz•7mo ago
Well, right now I am very skeptical. I think we have given quantum computing plenty of time (we have given it decades), and I'd need someone to convince me that it is not a scam.

Right now it hasn't amounted to anything useful, other than Shor's, 'experiments', promises, and applications that a GPU rack handles just as well right now.
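For scale, what Shor's promises on paper (the standard complexity figures; nothing demonstrated at useful sizes):

    Shor's algorithm:  O((log N)^3) quantum gates to factor N    -- polynomial
    best classical:    exp(O((log N)^{1/3} (log log N)^{2/3}))   -- GNFS, sub-exponential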

adastra22•7mo ago
That is the equivalent of “there will be worldwide demand for 4-6 computers.”
dandanua•7mo ago
Who are "we"? And why do those "we" think that a branch of fundamental science with people involved all around the globe could be a scam? It's nowhere close to the crypto mania, where there is only one end goal.
n4r9•7mo ago
For quantum computing, as a rule of thumb I generally look to what Scott Aaronson says. And as you suggest, while some cool stuff is being done both in industry and academia, we are nowhere near general quantum computers. I haven't checked what his outlook is for the next 5-10 years.
honzaik•7mo ago
This may give you an idea of his current outlook: https://www.youtube.com/watch?v=DQFyQgA_GE4
bradly•7mo ago
Not quantum computation, but quantum mechanics is being used in QKD satellites today to hedge against RSA being broken. Pretty neat.
bawolff•7mo ago
Quantum computers are not a scam, but QKD basically is. There is no scenario where QKD actually makes practical sense.
Viliam1234•7mo ago
I am not an expert, but it seems to me it is caused by two things:

1) While quantum computers are potentially exponentially faster, they also seem to get exponentially more expensive as the number of qubits grows, so you actually can't save money by building a huge quantum computer. This may or may not change in the future. Also, there is the problem of error correction, which is made much harder by the nature of quantum computing. Smart people are working on that; I don't know the current state of progress.

2) Despite the hype, only some problems can be calculated exponentially faster using a quantum computer, not all of them. This is analogous to parallel computing: having two CPUs instead of one will allow you to calculate some things twice as fast, but some other things will take exactly the same amount of time because their steps need to be done sequentially. Similarly, a quantum computer is like a network of billions of computers spread across the multiverse, but they all need to run the same code, and the results of the gigantic computation must be compressed into about a dozen bytes. So it's great for highly parallelizable tasks where the entire required output is a "yes or no" or a single number... and less useful for everything else. That still includes some important scientific problems, such as simulating atoms and molecules. But those are not the things we typically use computers for.
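To make the "compressed into about a dozen bytes" point concrete, here is a minimal numpy sketch (plain numpy, no quantum library; the register size is arbitrary). An n-qubit state carries 2^n amplitudes, but one measurement returns only n classical bits:

    import numpy as np

    n = 10
    dim = 2 ** n  # an n-qubit register has 2^n complex amplitudes

    # Uniform superposition over all 2^n basis states
    # (what applying a Hadamard gate to each of the n qubits produces)
    state = np.ones(dim, dtype=complex) / np.sqrt(dim)

    # Measurement: sample one basis state with probability |amplitude|^2
    probs = np.abs(state) ** 2
    outcome = np.random.choice(dim, p=probs)
    print(f"{dim} amplitudes tracked, one measurement yields {n} bits: {int(outcome):0{n}b}")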

japanuspus•7mo ago
> Well right now I am very skeptical, but I think we have somewhat given quantum computing plenty of time (we have given it decades) unless someone can convince me that it is not a scam.

Shor's paper on polynomial-time factoring is from 1997, and the first real demonstration of quantum hardware (Monroe et al.) is from 1995. Yes, quantum has had decades -- but only barely, and it has certainly only now started to have generations.

To see the kind of progress this means, take a look at some of the recent PhD spinouts of leading research groups (Oxford Ionics etc.): there are a lot of organisations with nothing but engineering between them and fault tolerance.

When I came back to quantum three years ago, fault tolerance was still to be based on the surface-code ideas that were floating around when I did my PhD ('04). Today, after everyone has started looking harder, it turns out that a bit of long-range connectivity can cut the error-correction overhead by orders of magnitude (see recent public posts by IBM Quantum): the goalposts for fault tolerance are moving in the right direction.

And this is the key thing about quantum computing: you need error correction, and you need to do it with the same error-prone hardware that you are correcting for. There is a threshold hardware quality that lets you do this at a reasonable overhead, and before you reach this threshold all you have is a fancy random number generator.
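To put the threshold in symbols (the standard surface-code scaling estimate; the form is schematic, not any vendor's numbers): with physical error rate p, threshold p_th, and code distance d, the logical error rate goes roughly as

    p_L ≈ A * (p / p_th)^((d+1)/2)

Below threshold (p < p_th), each increase in the code distance d suppresses logical errors multiplicatively; above threshold, adding qubits only makes things worse.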

But yes, feel free to be a pessimist -- just remember to own it when quantum happens in a few years.

William_BB•7mo ago
"When quantum happens in a few years" -- do you mean NISQ (i.e. VQA, QAOA) or actual fault tolerant quantum computing that can run Shor?
ttshaw1•7mo ago
We're already at NISQ
William_BB•7mo ago
I meant practically usable NISQ
andrewla•7mo ago
I'm a skeptic as well, but calling it a "scam" is a bit extreme. I think QC proponents are acting in good faith, and I believe it is worth chasing a little longer, since we don't yet have a convincing model for why QC will or won't work (although I find Gil Kalai's work in this area intuitively correct, I don't think we have a physical explanation for why quantum error correction would not work).

The current emphasis on NISQ systems is a bit of a desperate measure because the most we can get out of such systems is evidence that quantum computing can work in theory; they do not advance us towards having a workable quantum computer.

wasabi991011•7mo ago
Fwiw:

The last paper I saw posted on hackernews from Gil Kalai included a few explicit predictions about what would be impossible in quantum error correction.

This was a paper from a few years back.

The problem is that Google has since published results which imply that some of Kalai's predictions turned out false.

The paper in question is Google's recent "below threshold"/"beyond break-even" QEC paper. IIRC, Kalai was predicting below-threshold QEC to be impossible, among other things.

Not sure if Kalai has responded or updated his predictions, I haven't been following him closely.

n4r9•7mo ago
Although I agree that "scam" is extreme, the commercial side was sullied in the early 2010s by D-Wave selling what they described as "quantum computers" for $10m and spinning up a bunch of misleading PR. Of course you expect a certain degree of "fake it till you make it" in such companies, but they'd been going for over a decade at that point. This was all kicking off as I was doing my PhD in the field. It was eye-opening to see how little attention was paid to serious academics vs hucksters, and how companies like Google could be duped into spending millions on basically nothing.
dheera•7mo ago
> we have given it decades

Decades is a short amount of time in human history. Many things took centuries to invent.

The silicon valley approach of a year or two of runway is how apps are built, but that's not how science is built.

vonneumannstan•7mo ago
It's not a scam. There are many applications in materials science for example. Your ignorance of them doesn't make it a scam.
bawolff•7mo ago
You should stop viewing it as a tech start up and view it more as a physics experiment.

In some ways it is similar to fusion. People have been working on it for a long time. The benefits are potentially significant (Shor's is cute and all, but the really big deal would be a cheap way to simulate other quantum systems), but the challenges are also significant. Real progress is being made. Things that were super challenging 10 years ago are solved now. The field is advancing. But we still have a long way to go.

It is not a scam itself, but a lot of scammers use the language of quantum to sell their scams. You should treat anyone trying to convince you that they will have a useful quantum computer in the next 5 years the same way as someone offering you a fusion reactor (i.e. full of shit).

It's still a worthwhile pursuit, even just as a physics experiment. It pushes the "weirdness" of quantum physics to the limit -- by literally disproving the extended Church-Turing thesis. If we make a real quantum computer, that is proof that quantum physics is really how our world works. It's not just something else that is being misinterpreted.

DebtDeflation•7mo ago
This is exactly it. QC right now is a series of very cool science experiments that are being marketed to [Government Officials, CEOs, Investors, the General Public] as product development, which it is not. We're at the stage of scientists in the 1910s creating the first vacuum tubes and noting the ability to amplify and control large currents with small ones, but these companies are pretending it's instead the early 1980s, with the PC and the 8088 and Moore's Law getting ready to take off like a rocket.
JBits•7mo ago
There are multiple competing quantum computing hardware platforms, so you haven't given all of them the same length of time.
JanisErdmanis•7mo ago
"Scam" is a strong word, giving the impression of malicious interests selling it without working towards returns for the investors. But it now certainly looks like a dead horse.

The next big challenge will be mounting the controlling hardware, currently connected via coaxial cables, onto the chip while preventing the introduction of new sources of interference so that error correction can run. That will take a miracle.

Of course, an alternative is a million coaxial cables connected to a chip cooled close to mK temperatures.

addaon•7mo ago
I agree completely. We gave programmable machines a full century after Babbage, and it was clearly a waste and a scam. Can you imagine if we had continued to invest in that garbage and built some "Electronic Numerical Integrator and Computer" or other nonsense? Clearly if an idea isn't mature in a couple decades, it has no possible future development or impact.
EvgeniyZh•7mo ago
There has been steady progress in metrics that, if they keep improving, will eventually lead to fault-tolerant quantum computing. We also do not know any fundamental reason why we won't be able to keep improving them. It's true that the progress is not fast, because the problem is hard, but I don't see why you would call it a scam (there are definitely scammers in the field, but there are in any field).
m3kw9•7mo ago
As soon as a math equation comes up I get lost.
ofjcihen•7mo ago
Thanks for this.

I've recently become very interested in QC and purchased and read Quantum Computation and Quantum Information, which I think is the standard book on the subject right now.

I'm even more interested in applying what I've learned, but I'm at a loss as to how to begin working in the industry. Aside from getting a new master's degree, I wouldn't know where to begin, and resources on the matter are understandably sparse.

abdullahkhalids•7mo ago
You can definitely work at QC companies even without having a degree in the field. Many QC companies hire people from other fields because they require that expertise, say people with experience in numerical optimization. Of course, many QC companies also hire software engineers because they have complex internal software. If you are a software engineer, you can start there and then start to move laterally within the company.

Source: work at a QC company as a scientist.

ofjcihen•7mo ago
That’s something I didn’t think about. Thank you
dheera•7mo ago
It's actually not that difficult to pick up quantum mechanics and quantum computing if you have a solid foundation in linear algebra. QC really just reduces down to "applied linear algebra on crack".

If you're in AI, you might be pleased to know that the probability distribution of a particle over its various energy states is the softmax of the negatives of those energies divided by temperature, which is where the concept of LLM "temperature" comes from. If you have a linear algebra background, those energies are the eigenvalues of a Hamiltonian. Physics is actually quite beautiful.
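To make that concrete, a toy sketch (an arbitrary 3-level Hamiltonian, units where k_B = 1; the numbers are made up):

    import numpy as np

    # Thermal occupation probabilities are softmax(-E / T),
    # where E are the eigenvalues of the Hamiltonian H.
    H = np.array([[0.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 2.0]])
    energies = np.linalg.eigvalsh(H)  # H is Hermitian, so eigvalsh applies

    T = 0.5  # temperature; raising T flattens the distribution, as with LLM sampling
    logits = -energies / T
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    print(energies, probs)  # the lowest-energy state gets the most weight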

Getting into the industry is another issue though; it seems every company favors credentials over learning ability these days. If you haven't published 1500 papers on the subject you're automatically rejected.

qnleigh•7mo ago
Yes that's still a great book, though it's starting to get a bit outdated. Some recent developments that would belong in an updated edition:

- The section on error correction is still gold, but it doesn't cover "scalable codes" like the surface code (and other LDPC codes; lots of exciting progress there)
- Superconducting qubits: https://arxiv.org/abs/1904.06560
- Rydberg atoms: see Nature papers from Misha Lukin's group on the subject
- Photonic quantum computing

These might be hard to follow now, but if you make it through a good chunk of Nielsen and Chuang, then they might become quite readable. Make sure you solve lots of problems or it won't stick.

Like other commenters have pointed out, quantum computing companies need lots of software engineers, so that's a very viable entry into the field for many people. Here's an arbitrary list of some relevant skills:

- QuTiP! You can learn sooo much quantum mechanics by playing around in QuTiP, and it's quite easy to use (see the sketch below)
- Rust or C++ (depending on the company?)
- FPGA programming
- Python (ofc)
- Linear algebra
- ...
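As a taste of the QuTiP item above, a minimal sketch (a textbook Rabi-oscillation setup; the drive strength here is arbitrary):

    import numpy as np
    import qutip as qt

    psi0 = qt.basis(2, 0)              # qubit starts in |0>
    H = 2 * np.pi * 0.1 * qt.sigmax()  # resonant drive, Rabi frequency 0.1
    times = np.linspace(0.0, 10.0, 100)

    # Solve the Schrodinger equation, tracking <sigma_z> over time
    result = qt.sesolve(H, psi0, times, e_ops=[qt.sigmaz()])
    print(result.expect[0][:5])        # <sigma_z> oscillates between +1 and -1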

fogof•7mo ago
Very funny to me that lecture 21 is one of the only lecture titles that doesn't reference the name of the originator.
hedgehog0•7mo ago
De Wolf's notes are also one of the standard references right now, and more up-to-date than the QC&QI book: https://arxiv.org/abs/1907.09415
mikestorrent•7mo ago
Of course, no discussion of quantum annealing, the only practical form of quantum computation that is likely to exist at scale in our lifetimes.
adastra22•7mo ago
That's a strong statement. Regardless, the content of the course is explicitly targeted at gate-based quantum computing.
msgodel•7mo ago
Is it practical? The little I've messed with it, it seemed borderline useless. All it can do is QUBO, and encoding the problem into the machine topology itself is another QUBO problem that has to be done on a classical computer.

People also keep talking about using it for AI, but all you can train with it are Boltzmann machines, because those are all that map onto QUBO problems.
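For anyone wondering, "QUBO" just means minimizing x^T Q x over binary vectors x. A minimal classical sketch of a standard toy reduction (max-cut on a 4-node cycle; an annealer would be handed this same Q, brute force stands in for it here):

    import itertools
    import numpy as np

    # Max-cut on a 4-node cycle as a QUBO: edge (i,j) is cut iff
    # x_i + x_j - 2*x_i*x_j = 1. Maximize cut edges => negate to minimize.
    n = 4
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    Q = np.zeros((n, n))
    for i, j in edges:
        Q[i, i] -= 1
        Q[j, j] -= 1
        Q[i, j] += 2

    # Brute force over all 2^n binary assignments (fine for toy n)
    best = min(itertools.product([0, 1], repeat=n),
               key=lambda x: np.array(x) @ Q @ np.array(x))
    print(best)  # (0, 1, 0, 1): the alternating partition cuts all 4 edges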

husky8•7mo ago
I made a podcast in NotebookLM once I saw equations. Enjoy https://notebooklm.google.com/notebook/bc7616c4-1c71-4a04-b2...
vismit2000•7mo ago
Getting this message - "Oops! This audio could not be loaded."