frontpage.

We Mourn Our Craft

https://nolanlawson.com/2026/02/07/we-mourn-our-craft/
56•ColinWright•55m ago•23 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
16•surprisetalk•1h ago•9 comments

U.S. Jobs Disappear at Fastest January Pace Since Great Recession

https://www.forbes.com/sites/mikestunson/2026/02/05/us-jobs-disappear-at-fastest-january-pace-sin...
94•alephnerd•1h ago•36 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
120•AlexeyBrin•7h ago•22 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
55•vinhnx•4h ago•7 comments

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
53•thelok•3h ago•6 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
822•klaussilveira•21h ago•248 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
100•1vuio0pswjnm7•8h ago•117 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
1057•xnx•1d ago•607 comments

Reinforcement Learning from Human Feedback

https://rlhfbook.com/
75•onurkanbkrc•6h ago•5 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
476•theblazehen•2d ago•175 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
201•jesperordrup•11h ago•69 comments

France's homegrown open source online office suite

https://github.com/suitenumerique
543•nar001•5h ago•252 comments

Selection Rather Than Prediction

https://voratiq.com/blog/selection-rather-than-prediction/
8•languid-photic•3d ago•1 comment

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
213•alainrk•6h ago•328 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
34•rbanffy•4d ago•7 comments

72M Points of Interest

https://tech.marksblogg.com/overture-places-pois.html
27•marklit•5d ago•2 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
113•videotopia•4d ago•30 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
68•mellosouls•4h ago•72 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
73•speckx•4d ago•74 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
273•isitcontent•21h ago•37 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
199•limoce•4d ago•111 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
285•dmpetrov•22h ago•153 comments

Show HN: Kappal – CLI to Run Docker Compose YML on Kubernetes for Local Dev

https://github.com/sandys/kappal
21•sandGorgon•2d ago•11 comments

Making geo joins faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
155•matheusalmeida•2d ago•48 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
555•todsacerdoti•1d ago•268 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
424•ostacke•1d ago•110 comments

Ga68, a GNU Algol 68 Compiler

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
42•matt_d•4d ago•18 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
472•lstoll•1d ago•311 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
348•eljojo•1d ago•215 comments

TREAD: Token Routing for Efficient Architecture-Agnostic Diffusion Training

https://arxiv.org/abs/2501.04765
37•fzliu•5mo ago

Comments

platers•5mo ago
I'm struggling to understand where the gains are coming from. What is the intuition for why DiT training was so inefficient?
joshred•5mo ago
Here's the high-level explanation of the simplest diffusion setup. Training takes an image and iteratively adds noise to it until only noise remains. The model then learns to reverse that sequence of increasingly noisy images: starting from pure noise, it predicts the noise to remove at each step until it reaches the final step, which should recover the original image (the training input).

That process means they may require a hundred or more training iterations on a single image. I haven't digested the paper, but it sounds like they are proposing something conceptually similar to skip layers (but significantly more involved).
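
As a rough illustration of the training loop described above, here is a minimal DDPM-style training step in PyTorch (hypothetical code, not from the paper; the function and argument names are made up for the sketch). A random noise level is sampled for each image on each pass, so covering the whole noise schedule for a single image takes many such steps:

    import torch

    def ddpm_training_step(model, optimizer, x0, alphas_cumprod):
        # x0: clean images (B, C, H, W); alphas_cumprod: (T,) cumulative noise schedule
        b = x0.shape[0]
        t = torch.randint(0, alphas_cumprod.shape[0], (b,), device=x0.device)
        noise = torch.randn_like(x0)
        a_bar = alphas_cumprod[t].view(b, 1, 1, 1)
        x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise  # forward: add noise at level t
        pred_noise = model(x_t, t)                             # reverse: predict the added noise
        loss = torch.nn.functional.mse_loss(pred_noise, noise)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()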

earthnail•5mo ago
Wow, Ommer’s students never fail to impress. 37x faster for a generic architecture, i.e. no domain-specific tricks. Insane.
arjvik•5mo ago
Isn't this just Mixture-of-Depths but for DiTs?

If so, what are the DiT-specific changes that needed to be made?

yorwba•5mo ago
Mixture-of-Depths trains the model to choose different numbers of layers for different tokens to reduce inference compute. This method is more like stochastic depth / layer dropout: whether the intermediate layers are skipped for a token is random, independent of the token's value, and it's only used as a training optimization. As far as I can tell, during inference all tokens are always processed by all layers.
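
For intuition, here is a rough sketch of what that kind of training-time token routing could look like, assuming generic transformer blocks that map (batch, tokens, dim) to the same shape. The class and parameter names (RoutedBlocks, skip_ratio) are illustrative, and the paper's actual routing scheme and the way skipped tokens are reintroduced differ in detail; this only shows the idea that a random subset of tokens bypasses the intermediate layers during training, while every token is processed at inference:

    import torch
    import torch.nn as nn

    class RoutedBlocks(nn.Module):
        # Wraps a stack of transformer blocks. During training, a random subset of
        # tokens bypasses all wrapped blocks (random, independent of token content);
        # at inference every token goes through every block.
        def __init__(self, blocks, skip_ratio=0.5):
            super().__init__()
            self.blocks = nn.ModuleList(blocks)
            self.skip_ratio = skip_ratio

        def forward(self, x):                      # x: (B, N, D)
            if not self.training:
                for blk in self.blocks:
                    x = blk(x)
                return x
            B, N, D = x.shape
            keep = max(1, int(N * (1 - self.skip_ratio)))
            idx = torch.rand(B, N, device=x.device).argsort(dim=1)[:, :keep]
            gather_idx = idx.unsqueeze(-1).expand(-1, -1, D)
            sub = torch.gather(x, 1, gather_idx)   # only the kept tokens...
            for blk in self.blocks:
                sub = blk(sub)                     # ...pay for the intermediate layers
            out = x.clone()                        # skipped tokens pass through unchanged
            out.scatter_(1, gather_idx, sub)
            return out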
lucidrains•5mo ago
very nice, will have to try it out! this is the same research group from which Robin Rombach (of stable diffusion fame) originated