frontpage.

Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

https://github.com/localgpt-app/localgpt
165•yi_wang•5h ago•50 comments

The world heard JD Vance being booed at the Olympics. Except for viewers in USA

https://www.theguardian.com/sport/2026/feb/07/jd-vance-boos-winter-olympics
57•treetalker•18m ago•12 comments

Haskell for all: Beyond agentic coding

https://haskellforall.com/2026/02/beyond-agentic-coding
80•RebelPotato•5h ago•19 comments

OpenClaw Is Changing My Life

https://reorx.com/blog/openclaw-is-changing-my-life/
5•novoreorx•57m ago•1 comment

SectorC: A C Compiler in 512 bytes (2023)

https://xorvoid.com/sectorc.html
270•valyala•13h ago•51 comments

Total surface area required to fuel the world with solar (2009)

https://landartgenerator.org/blagi/archives/127
33•robtherobber•4d ago•36 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
209•mellosouls•16h ago•359 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
172•surprisetalk•13h ago•164 comments

LLMs as the new high level language

https://federicopereiro.com/llm-high/
76•swah•4d ago•138 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
184•AlexeyBrin•19h ago•35 comments

Brookhaven Lab's RHIC concludes 25-year run with final collisions

https://www.hpcwire.com/off-the-wire/brookhaven-labs-rhic-concludes-25-year-run-with-final-collis...
76•gnufx•12h ago•60 comments

The Architecture of Open Source Applications (Volume 1) Berkeley DB

https://aosabook.org/en/v1/bdb.html
10•grep_it•5d ago•0 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
177•vinhnx•16h ago•18 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
331•jesperordrup•23h ago•99 comments

Substack confirms data breach affects users’ email addresses and phone numbers

https://techcrunch.com/2026/02/05/substack-confirms-data-breach-affecting-email-addresses-and-pho...
31•witnessme•2h ago•9 comments

First Proof

https://arxiv.org/abs/2602.05192
139•samasblack•15h ago•81 comments

Wood Gas Vehicles: Firewood in the Fuel Tank (2010)

https://solar.lowtechmagazine.com/2010/01/wood-gas-vehicles-firewood-in-the-fuel-tank/
35•Rygian•2d ago•11 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
86•momciloo•13h ago•18 comments

Vouch

https://twitter.com/mitchellh/status/2020252149117313349
81•chwtutha•4h ago•22 comments

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
109•thelok•15h ago•24 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
593•theblazehen•3d ago•214 comments

Show HN: A luma dependent chroma compression algorithm (image compression)

https://www.bitsnbites.eu/a-spatial-domain-variable-block-size-luma-dependent-chroma-compression-...
42•mbitsnbites•3d ago•5 comments

LineageOS 23.2

https://lineageos.org/Changelog-31/
6•pentagrama•1h ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
316•1vuio0pswjnm7•19h ago•513 comments

FDA intends to take action against non-FDA-approved GLP-1 drugs

https://www.fda.gov/news-events/press-announcements/fda-intends-take-action-against-non-fda-appro...
116•randycupertino•8h ago•243 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
908•klaussilveira•1d ago•277 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
161•speckx•4d ago•245 comments

Selection rather than prediction

https://voratiq.com/blog/selection-rather-than-prediction/
36•languid-photic•4d ago•18 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
304•isitcontent•1d ago•39 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
314•dmpetrov•1d ago•159 comments

A short introduction to optimal transport and Wasserstein distance (2020)

https://alexhwilliams.info/itsneuronalblog/2020/10/09/optimal-transport/
40•sebg•5mo ago

Comments

smokel•5mo ago
This is very helpful for understanding generative AI. See, for example, Stefano Ermon's excellent lectures for Stanford's CS236 Deep Generative Models [1]; all of the lectures are available on YouTube [2].

[1] https://deepgenerativemodels.github.io/

[2] https://youtube.com/playlist?list=PLoROMvodv4rPOWA-omMM6STXa...

jethkl•5mo ago
Wasserstein distance (Earth Mover’s Distance) measures how far apart two distributions are — the ‘work’ needed to reshape one pile of dirt into another. The concept extends to multiple distributions via a linear program, which under mild conditions can be solved with a linear-time greedy algorithm [1]. It’s an active research area with applications in clustering, computing Wasserstein barycenters (averaging distributions), and large-scale machine learning.

[1] https://en.wikipedia.org/wiki/Earth_mover's_distance#More_th...
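
To make the "pile of dirt" picture concrete, here is a minimal sketch (mine, not from the comment or the article) of the two-histogram case posed as the linear program over transport plans; the bin locations and weights below are made up for illustration and SciPy is assumed:

    import numpy as np
    from scipy.optimize import linprog

    x = np.array([0.0, 1.0, 2.0])        # bin locations of pile a
    y = np.array([0.5, 1.5])             # bin locations of pile b
    a = np.array([0.4, 0.4, 0.2])        # masses of a (sum to 1)
    b = np.array([0.6, 0.4])             # masses of b (sum to 1)
    C = np.abs(x[:, None] - y[None, :])  # cost of moving one unit of mass

    # Decision variable: transport plan P (flattened row-major), P[i, j] >= 0,
    # with row sums equal to a and column sums equal to b.
    n, m = C.shape
    A_eq, b_eq = [], []
    for i in range(n):                   # row-sum constraints
        row = np.zeros(n * m); row[i * m:(i + 1) * m] = 1.0
        A_eq.append(row); b_eq.append(a[i])
    for j in range(m):                   # column-sum constraints
        col = np.zeros(n * m); col[j::m] = 1.0
        A_eq.append(col); b_eq.append(b[j])

    res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, None))
    print("earth mover's distance:", res.fun)

For two empirical samples on the line there is no need for the LP; scipy.stats.wasserstein_distance computes the same quantity directly.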

ForceBru•5mo ago
Is the Wasserstein distance useful for parameter estimation instead of maximum likelihood? (Maximum likelihood is essentially minimum-KL-divergence estimation.) All I see online and in papers is how to _compute_ the Wasserstein distance, which already seems pretty hard in itself. Even in 1D it requires a nasty integral of inverse CDFs when p != 1. Does that mean "minimum Wasserstein estimation" is prohibitively expensive?
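
(As a rough numerical sketch of that 1D quantity, not from the thread: W_p(mu, nu) = ( integral_0^1 |F_mu^-1(q) - F_nu^-1(q)|^p dq )^(1/p), and for empirical samples the inverse CDFs reduce to sample quantiles, so the integral can be approximated on a quantile grid. The sample distributions, p, and grid size below are arbitrary.)

    import numpy as np

    def wasserstein_1d(xs, ys, p=2, n_grid=10_000):
        q = (np.arange(n_grid) + 0.5) / n_grid    # quantile levels in (0, 1)
        xq = np.quantile(xs, q)                   # empirical inverse CDF of mu
        yq = np.quantile(ys, q)                   # empirical inverse CDF of nu
        return np.mean(np.abs(xq - yq) ** p) ** (1 / p)

    rng = np.random.default_rng(0)
    print(wasserstein_1d(rng.normal(0, 1, 5000), rng.normal(1, 2, 5000), p=2))
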
317070•5mo ago
It is.

But!

Wasserstein distances are used instead of a KL divergence inside all kinds of VAEs and diffusion models, because while the Wasserstein distance itself is hard to compute, it is easy to construct stochastic estimates whose expectation is its gradient. So you can easily get unbiased gradients, and that is all you need to train big neural networks. [0] Pretty much any time you sample a point from your current distribution and one from the target distribution and take the gradient of the distance between them, you are minimizing a Wasserstein distance.

[0] https://arxiv.org/abs/1711.01558
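
(A toy sketch of that last point, my reading of the comment rather than the method of [0]: fit the location of a 1D model distribution to a target by repeatedly sampling one point from each and stepping along the gradient of the distance between them. All numbers are arbitrary.)

    import numpy as np

    rng = np.random.default_rng(0)
    theta = -3.0                                # model samples are theta + N(0, 1); target is N(2, 1)
    for _ in range(2000):
        x_model = theta + rng.normal()          # sample from the current distribution
        x_target = 2.0 + rng.normal()           # sample from the target distribution
        grad = np.sign(x_model - x_target)      # d|x_model - x_target| / d theta for this pair
        theta -= 0.05 * grad                    # stochastic gradient step
    print(theta)                                # drifts toward the target mean (~2.0)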

JustFinishedBSG•5mo ago
The Wasserstein distance itself is expensive, but you can instead optimize entropic regularizations of it that approximate it arbitrarily closely (via the Sinkhorn algorithm). These are both cheap to compute and differentiable.
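
(A bare-bones sketch of that approach, with arbitrarily chosen histograms, cost matrix, and regularization strength; illustrative only, not production code.)

    import numpy as np

    def sinkhorn(a, b, C, eps=0.05, n_iters=500):
        K = np.exp(-C / eps)                  # Gibbs kernel from the cost matrix
        u, v = np.ones_like(a), np.ones_like(b)
        for _ in range(n_iters):              # alternating marginal projections
            u = a / (K @ v)
            v = b / (K.T @ u)
        P = u[:, None] * K * v[None, :]       # regularized transport plan
        return np.sum(P * C)                  # entropic approximation of the OT cost

    x = np.linspace(0, 1, 50)
    a = np.exp(-((x - 0.2) ** 2) / 0.01); a /= a.sum()
    b = np.exp(-((x - 0.7) ** 2) / 0.02); b /= b.sum()
    C = np.abs(x[:, None] - x[None, :])
    print(sinkhorn(a, b, C))

In practice a log-domain stabilized implementation (for example ot.sinkhorn from the POT library) is preferable when the regularization is small.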