
The Super Sharp Blade

https://netzhansa.com/the-super-sharp-blade/
1•robin_reala•12s ago•0 comments

Smart Homes Are Terrible

https://www.theatlantic.com/ideas/2026/02/smart-homes-technology/685867/
1•tusslewake•1m ago•0 comments

What I haven't figured out

https://macwright.com/2026/01/29/what-i-havent-figured-out
1•stevekrouse•2m ago•0 comments

KPMG pressed its auditor to pass on AI cost savings

https://www.irishtimes.com/business/2026/02/06/kpmg-pressed-its-auditor-to-pass-on-ai-cost-savings/
1•cainxinth•2m ago•0 comments

Open-source Claude skill that optimizes Hinge profiles. Pretty well.

https://twitter.com/b1rdmania/status/2020155122181869666
1•birdmania•2m ago•1 comment

First Proof

https://arxiv.org/abs/2602.05192
2•samasblack•4m ago•1 comment

I squeezed a BERT sentiment analyzer into 1GB RAM on a $5 VPS

https://mohammedeabdelaziz.github.io/articles/trendscope-market-scanner
1•mohammede•6m ago•0 comments

Kagi Translate

https://translate.kagi.com
1•microflash•6m ago•0 comments

Building Interactive C/C++ workflows in Jupyter through Clang-REPL [video]

https://fosdem.org/2026/schedule/event/QX3RPH-building_interactive_cc_workflows_in_jupyter_throug...
1•stabbles•7m ago•0 comments

Tactical tornado is the new default

https://olano.dev/blog/tactical-tornado/
1•facundo_olano•9m ago•0 comments

Full-Circle Test-Driven Firmware Development with OpenClaw

https://blog.adafruit.com/2026/02/07/full-circle-test-driven-firmware-development-with-openclaw/
1•ptorrone•10m ago•0 comments

Automating Myself Out of My Job – Part 2

https://blog.dsa.club/automation-series/automating-myself-out-of-my-job-part-2/
1•funnyfoobar•10m ago•0 comments

Google staff call for firm to cut ties with ICE

https://www.bbc.com/news/articles/cvgjg98vmzjo
25•tartoran•10m ago•1 comment

Dependency Resolution Methods

https://nesbitt.io/2026/02/06/dependency-resolution-methods.html
1•zdw•10m ago•0 comments

Crypto firm apologises for sending Bitcoin users $40B by mistake

https://www.msn.com/en-ie/money/other/crypto-firm-apologises-for-sending-bitcoin-users-40-billion...
1•Someone•11m ago•0 comments

Show HN: iPlotCSV: CSV Data, Visualized Beautifully for Free

https://www.iplotcsv.com/demo
1•maxmoq•12m ago•0 comments

There's no such thing as "tech" (Ten years later)

https://www.anildash.com/2026/02/06/no-such-thing-as-tech/
1•headalgorithm•12m ago•0 comments

List of unproven and disproven cancer treatments

https://en.wikipedia.org/wiki/List_of_unproven_and_disproven_cancer_treatments
1•brightbeige•13m ago•0 comments

Me/CFS: The blind spot in proactive medicine (Open Letter)

https://github.com/debugmeplease/debug-ME
1•debugmeplease•13m ago•1 comment

Ask HN: What word games do you play every day?

1•gogo61•16m ago•1 comment

Show HN: Paper Arena – A social trading feed where only AI agents can post

https://paperinvest.io/arena
1•andrenorman•17m ago•0 comments

TOSTracker – The AI Training Asymmetry

https://tostracker.app/analysis/ai-training
1•tldrthelaw•21m ago•0 comments

The Devil Inside GitHub

https://blog.melashri.net/micro/github-devil/
2•elashri•22m ago•0 comments

Show HN: Distill – Migrate LLM agents from expensive to cheap models

https://github.com/ricardomoratomateos/distill
1•ricardomorato•22m ago•0 comments

Show HN: Sigma Runtime – Maintaining 100% Fact Integrity over 120 LLM Cycles

https://github.com/sigmastratum/documentation/tree/main/sigma-runtime/SR-053
1•teugent•22m ago•0 comments

Make a local open-source AI chatbot with access to Fedora documentation

https://fedoramagazine.org/how-to-make-a-local-open-source-ai-chatbot-who-has-access-to-fedora-do...
1•jadedtuna•24m ago•0 comments

Introduce the Vouch/Denouncement Contribution Model by Mitchellh

https://github.com/ghostty-org/ghostty/pull/10559
1•samtrack2019•24m ago•0 comments

Software Factories and the Agentic Moment

https://factory.strongdm.ai/
1•mellosouls•24m ago•1 comment

The Neuroscience Behind Nutrition for Developers and Founders

https://comuniq.xyz/post?t=797
1•01-_-•24m ago•0 comments

Bang bang he murdered math {the musical } (2024)

https://taylor.town/bang-bang
1•surprisetalk•24m ago•0 comments

A short introduction to optimal transport and Wasserstein distance (2020)

https://alexhwilliams.info/itsneuronalblog/2020/10/09/optimal-transport/
40•sebg•5mo ago

Comments

smokel•5mo ago
This is very helpful for understanding generative AI. See for example the amazing lectures of Stefano Ermon for Stanford's CS236 Deep Generative Models [1]. All lectures are available on YouTube [2].

[1] https://deepgenerativemodels.github.io/

[2] https://youtube.com/playlist?list=PLoROMvodv4rPOWA-omMM6STXa...

jethkl•5mo ago
Wasserstein distance (Earth Mover’s Distance) measures how far apart two distributions are — the ‘work’ needed to reshape one pile of dirt into another. The concept extends to multiple distributions via a linear program, which under mild conditions can be solved with a linear-time greedy algorithm [1]. It’s an active research area with applications in clustering, computing Wasserstein barycenters (averaging distributions), and large-scale machine learning.

[1] https://en.wikipedia.org/wiki/Earth_mover's_distance#More_th...
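
The 1-D case of that greedy idea is easy to see concretely: sorting both samples and pairing them in order is the optimal transport plan, so the earth mover's distance reduces to a mean absolute shift. A minimal sketch (the function name and inputs are my own illustration):

```python
# Illustrative sketch: Wasserstein-1 (earth mover's) distance between the
# empirical distributions of two equal-size 1-D samples. In one dimension
# the optimal plan is the monotone (sorted) matching, so a single greedy
# pass over the sorted samples suffices.

def wasserstein_1d(xs, ys):
    """W1 between empirical distributions of two equal-size 1-D samples."""
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    # Each sorted pair (x_i, y_i) is matched; cost is the mean absolute shift.
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

print(wasserstein_1d([0, 1, 2], [1, 2, 3]))  # every pile shifts by 1 -> 1.0
```

Shifting an entire sample by a constant c changes W1 by exactly |c|, which matches the "move the dirt" intuition.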

ForceBru•5mo ago
Is the Wasserstein distance useful for parameter estimation, as an alternative to maximum likelihood? (BTW, maximum likelihood estimation is essentially minimum-KL-divergence estimation.) All I see online and in papers is how to _compute_ the Wasserstein distance, which seems pretty hard in itself. In 1D it requires computing a nasty integral of inverse CDFs when p != 1. Does that mean "minimum Wasserstein estimation" is prohibitively expensive?

317070•5mo ago
It is.

But!

Wasserstein distances are used instead of a KL divergence inside all kinds of VAEs and diffusion models because, while the Wasserstein distance itself is hard to compute, it is easy to construct stochastic estimators whose expectation is its gradient. So you can cheaply get unbiased gradients, and that is all you need to train big neural networks. [0] Pretty much any time you sample a point from your current distribution and one from the target distribution and take the gradient of the distance between them, you are minimizing a Wasserstein distance.

[0] https://arxiv.org/abs/1711.01558
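
A toy version of that recipe (my own construction, not the paper's method): fit the location mu of a model N(mu, 1) to a target N(3, 1) by sampling one point from each and stepping along the gradient of |x_model - x_target| with respect to mu. For Gaussians with equal variance, W1 equals |mu_model - mu_target|, and the per-sample sign gradient is a descent direction in expectation.

```python
import random

# Toy sketch: minimize a 1-D Wasserstein-1 distance by stochastic gradient
# descent on sampled pairs. The target mean 3.0, step size, and iteration
# count are arbitrary choices for illustration.
random.seed(0)
mu, lr = 0.0, 0.05
for _ in range(2000):
    x_model = mu + random.gauss(0, 1)     # sample from current distribution
    x_target = 3.0 + random.gauss(0, 1)   # sample from target distribution
    # d|x_model - x_target| / d mu = sign(x_model - x_target); its expectation
    # has the same sign as (mu - 3), so the updates drift mu toward the target.
    grad = 1.0 if x_model > x_target else -1.0
    mu -= lr * grad
print(mu)  # ends near 3.0, fluctuating at the scale of the step size
```

No density, likelihood, or CDF inversion is ever evaluated — only samples and a pointwise distance — which is exactly why this style of objective scales to large models.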

JustFinishedBSG•5mo ago
Wasserstein itself is expensive to compute, but you can instead optimize entropic regularizations of it (via the Sinkhorn algorithm) that come arbitrarily close and are both cheap to optimize and differentiable.
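
A bare-bones Sinkhorn sketch (illustrative only; the distributions, cost matrix, and regularization strength are my own toy choices): alternately rescale the rows and columns of a Gibbs kernel until the transport plan's marginals match the two distributions.

```python
import math

# Entropic-regularized optimal transport between discrete distributions a, b
# with cost matrix C, via Sinkhorn's alternating-scaling iterations.
def sinkhorn(a, b, C, eps=0.1, iters=500):
    K = [[math.exp(-c / eps) for c in row] for row in C]  # Gibbs kernel
    u = [1.0] * len(a)
    v = [1.0] * len(b)
    for _ in range(iters):
        # Rescale so the plan's row marginals approach a, column marginals b.
        u = [a[i] / sum(K[i][j] * v[j] for j in range(len(b)))
             for i in range(len(a))]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(len(a)))
             for j in range(len(b))]
    P = [[u[i] * K[i][j] * v[j] for j in range(len(b))] for i in range(len(a))]
    cost = sum(P[i][j] * C[i][j]
               for i in range(len(a)) for j in range(len(b)))
    return P, cost

# Two 3-point distributions on the grid {0, 1, 2} with |i - j| ground cost.
a = [0.5, 0.5, 0.0]
b = [0.0, 0.5, 0.5]
C = [[abs(i - j) for j in range(3)] for i in range(3)]
P, cost = sinkhorn(a, b, C)
print(round(cost, 2))  # close to the unregularized W1 of 1.0
```

Every operation here is a smooth function of the inputs, which is the point of the comment above: the regularized cost is differentiable, so it can sit inside a gradient-based training loop.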