
The AI CEO Experiment

https://yukicapital.com/blog/the-ai-ceo-experiment/
2•romainsimon•1m ago•0 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
2•surprisetalk•4m ago•0 comments

MS-DOS game copy protection and cracks

https://www.dosdays.co.uk/topics/game_cracks.php
2•TheCraiggers•5m ago•0 comments

Updates on GNU/Hurd progress [video]

https://fosdem.org/2026/schedule/event/7FZXHF-updates_on_gnuhurd_progress_rump_drivers_64bit_smp_...
2•birdculture•6m ago•0 comments

Epstein took a photo of his 2015 dinner with Zuckerberg and Musk

https://xcancel.com/search?f=tweets&q=davenewworld_2%2Fstatus%2F2020128223850316274
5•doener•6m ago•1 comment

MyFlames: Visualize MySQL query execution plans as interactive FlameGraphs

https://github.com/vgrippa/myflames
1•tanelpoder•8m ago•0 comments

Show HN: LLM of Babel

https://clairefro.github.io/llm-of-babel/
1•marjipan200•8m ago•0 comments

A modern iperf3 alternative with a live TUI, multi-client server, QUIC support

https://github.com/lance0/xfr
3•tanelpoder•9m ago•0 comments

Famfamfam Silk icons – also with CSS spritesheet

https://github.com/legacy-icons/famfamfam-silk
1•thunderbong•9m ago•0 comments

Apple is the only Big Tech company whose capex declined last quarter

https://sherwood.news/tech/apple-is-the-only-big-tech-company-whose-capex-declined-last-quarter/
2•elsewhen•13m ago•0 comments

Reverse-Engineering Raiders of the Lost Ark for the Atari 2600

https://github.com/joshuanwalker/Raiders2600
2•todsacerdoti•14m ago•0 comments

Show HN: Deterministic NDJSON audit logs – v1.2 update (structural gaps)

https://github.com/yupme-bot/kernel-ndjson-proofs
1•Slaine•18m ago•0 comments

The Greater Copenhagen Region could be your friend's next career move

https://www.greatercphregion.com/friend-recruiter-program
2•mooreds•18m ago•0 comments

Do Not Confirm – Fiction by OpenClaw

https://thedailymolt.substack.com/p/do-not-confirm
1•jamesjyu•19m ago•0 comments

The Analytical Profile of Peas

https://www.fossanalytics.com/en/news-articles/more-industries/the-analytical-profile-of-peas
1•mooreds•19m ago•0 comments

Hallucinations in GPT5 – Can models say "I don't know" (June 2025)

https://jobswithgpt.com/blog/llm-eval-hallucinations-t20-cricket/
1•sp1982•19m ago•0 comments

What AI is good for, according to developers

https://github.blog/ai-and-ml/generative-ai/what-ai-is-actually-good-for-according-to-developers/
1•mooreds•19m ago•0 comments

OpenAI might pivot to the "most addictive digital friend" or face extinction

https://twitter.com/lebed2045/status/2020184853271167186
1•lebed2045•20m ago•2 comments

Show HN: Know how your SaaS is doing in 30 seconds

https://anypanel.io
1•dasfelix•21m ago•0 comments

ClawdBot Ordered Me Lunch

https://nickalexander.org/drafts/auto-sandwich.html
3•nick007•22m ago•0 comments

What the News media thinks about your Indian stock investments

https://stocktrends.numerical.works/
1•mindaslab•23m ago•0 comments

Running Lua on a tiny console from 2001

https://ivie.codes/page/pokemon-mini-lua
1•Charmunk•23m ago•0 comments

Google and Microsoft Paying Creators $500K+ to Promote AI Tools

https://www.cnbc.com/2026/02/06/google-microsoft-pay-creators-500000-and-more-to-promote-ai.html
3•belter•25m ago•0 comments

New filtration technology could be game-changer in removal of PFAS

https://www.theguardian.com/environment/2026/jan/23/pfas-forever-chemicals-filtration
1•PaulHoule•26m ago•0 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
2•momciloo•27m ago•0 comments

Kinda Surprised by Seadance2's Moderation

https://seedanceai.me/
1•ri-vai•27m ago•2 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
2•valyala•27m ago•1 comment

Django scales. Stop blaming the framework (part 1 of 3)

https://medium.com/@tk512/django-scales-stop-blaming-the-framework-part-1-of-3-a2b5b0ff811f
2•sgt•28m ago•0 comments

Malwarebytes Is Now in ChatGPT

https://www.malwarebytes.com/blog/product/2026/02/scam-checking-just-got-easier-malwarebytes-is-n...
1•m-hodges•28m ago•0 comments

Thoughts on the job market in the age of LLMs

https://www.interconnects.ai/p/thoughts-on-the-hiring-market-in
1•gmays•28m ago•0 comments

A fast, strong, topologically meaningful and fun knot invariant

https://arxiv.org/abs/2509.18456
52•bikenaga•4mo ago

Comments

m_dupont•4mo ago
To their credit, the paper is unexpectedly fun.

I don't understand the "separation power" thing, though. What does it imply?

gjm11•4mo ago
They mean that their invariant does a good job of distinguishing different knots from one another.

The way they quantify this is: they pick a biggish set of knots that are known all to be distinct from one another. They then compute their invariant for each of those knots. A knot invariant successfully distinguishes them all from one another precisely when it takes different values for all of the knots. So they count the number of different values their invariant takes, and subtract it from the number of knots. They call this the "separation deficit": the smaller the better.
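
The counting step described above is simple enough to sketch in a few lines of Python (the knot names and values here are made up for illustration):

```python
# Sketch of the "separation deficit" computation: number of knots
# minus number of distinct invariant values. Zero means the invariant
# tells every knot in the set apart; the bigger the deficit, the more
# knots share a value with some other knot.
def separation_deficit(invariant_values):
    return len(invariant_values) - len(set(invariant_values))

# Five hypothetical knots, but the invariant takes only three
# distinct values, so the deficit is 5 - 3 = 2:
values = ["poly_a", "poly_b", "poly_a", "poly_c", "poly_b"]
print(separation_deficit(values))  # 2
```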

They compare their invariant with some already-known ones, taking "all knots that can be drawn in the plane with <= 15 crossing points" as their set of knots. There are about 300,000 of these.

One of the best-known knot invariants is the so-called Alexander polynomial. That's in row 3 of Table 5.1, and its "separation deficit" for those knots is on the order of 200k. That is, these 300k knots have between them only about 100k different Alexander polynomials; if you pick a random smallish knot and compute its Alexander polynomial then handwavily you should expect that there are two other different smallish knots with the same Alexander polynomial.

Another knot polynomial, which does a better job of distinguishing different knots, is the so-called HOMFLY polynomial. (Why the weird name? It comes from the initials of the six authors of the paper announcing its discovery.) That's row 7, showing a deficit of about 75k. That suggests, even more handwavily, that if you pick a random smallish knot and compute its HOMFLY polynomial, there's about a 1/3 chance that there's another smallish knot with the same HOMFLY polynomial. Still not great.

A rather different sort of invariant is the hyperbolic volume of the complement of the knot. That is: if you take all of space minus the knot then there's a certain nice way to define distances and volumes and things in the left-over space; the whole of space-minus-the-knot turns out then to have a finite volume, and perhaps surprisingly deforming the knot doesn't change that volume. So that's another knot invariant, and it turns out to be better at distinguishing knots from one another than the polynomials mentioned above, on the order of 2x better than the HOMFLY polynomial.

This paper's invariant (which is a pair of polynomials) does about 6x better than what you get by looking at the Alexander polynomial, the HOMFLY polynomial, the hyperbolic volume, and a few other invariants I didn't mention above, all together. Its "separation deficit" on this set of ~300k knots is about 7000. If you pick a random smallish knot, there's only about a 2% chance that some other knot has the same value of this paper's invariant.

(Reminder that all this business about probabilities is super-handwavy. Actually, that probability might be anywhere from about 2% to about 4% depending on exactly how the values of the invariant are distributed.)
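
One way to see where a range like that comes from (my reasoning, not anything from the paper): with n knots and deficit d, the fraction of knots that collide with some other knot is smallest when all the collisions pile into one big class of d + 1 knots, and largest when they come in d pairs of 2:

```python
# Rough bounds on the fraction of knots sharing an invariant value
# with another knot, given n knots and separation deficit d = n - v
# (v = number of distinct values). Assumed back-of-envelope reasoning,
# not a computation from the paper.
def collision_fraction_bounds(n_knots, deficit):
    low = (deficit + 1) / n_knots   # one class of d + 1 equal values
    high = 2 * deficit / n_knots    # d classes of exactly 2 equal values
    return low, high

low, high = collision_fraction_bounds(300_000, 7_000)
print(f"{low:.1%} to {high:.1%}")  # 2.3% to 4.7%
```

which matches the "about 2% to about 4%" ballpark above, depending on how the collisions are distributed.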

Now, all of this is purely empirical and looks only at smallish knots. So far as I know they haven't proved any theorems like "our invariants do a better job than the hyperbolic volume for knots with <= N crossings, for all N". I think such theorems are very hard to come by.

They don't, to be clear, claim that their invariant is the best at distinguishing different knots from one another. For instance, they mention another set of knot polynomials that does a better job but is (so they say) much more troublesome to compute for a given knot.

m_dupont•4mo ago
Very clear and detailed answer, thank you very much.
QuadmasterXLII•4mo ago
What would people recommend to get up to speed on the Alexander polynomial? The Wikipedia page was terse, as expected.
bikenaga•4mo ago
It depends on your math background, but:

Richard Crowell and Ralph Fox, "An Introduction to Knot Theory" (1963) - an old book which doesn't cover recent developments. The Dover edition is only $15.95: https://store.doverpublications.com/products/9780486468945?_...

Ralph Fox, "A Quick Trip Through Knot Theory" (1962). The original paper was a chapter in M. K. Fort's "Topology of 3-Manifolds" collection, but I think you can find copies online.

Dale Rolfsen, "Knots and Links". A newer book and one of the best if you know some (algebraic) topology. He argues for considering the Alexander invariant (the homology of the infinite cyclic cover) rather than the Alexander polynomial. I don't have the newest edition, so I don't know how far it goes in terms of recent developments.

The book by Burde and Zieschang has this stuff and more, but it's more advanced.

I'm not a knot theorist, so I don't know about newer books - maybe someone else has better recommendations.

MarkusQ•4mo ago
For a gentle introduction, Colin Adams's "The Knot Book" is very accessible (and builds up the notion of polynomial invariants step by step).
bikenaga•4mo ago
Yep, just noticed it's on my shelf - nice book! He describes Conway's approach to computing the Alexander polynomial using just two rules - a lot simpler than the technical stuff in Crowell and Fox.
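
For anyone curious, the two rules in question are (as I recall them; double-check against a textbook) the normalization and skein relation for Conway's potential function:

```latex
\nabla(\text{unknot}) = 1, \qquad
\nabla(L_+) - \nabla(L_-) = z\,\nabla(L_0)
```

where $L_+$, $L_-$, $L_0$ are diagrams identical except at one crossing (positive, negative, and smoothed, respectively), and the Alexander polynomial is recovered, up to a unit, by the substitution $z = t^{1/2} - t^{-1/2}$.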