frontpage.

France's homegrown open source online office suite

https://github.com/suitenumerique
105•nar001•1h ago•48 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
345•theblazehen•2d ago•117 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
49•AlexeyBrin•2h ago•10 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
734•klaussilveira•17h ago•230 comments

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
29•onurkanbkrc•2h ago•2 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
990•xnx•22h ago•562 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
75•alainrk•2h ago•71 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
114•jesperordrup•7h ago•52 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
84•videotopia•4d ago•17 comments

Making geo joins faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
144•matheusalmeida•2d ago•39 comments

Ga68, a GNU Algol 68 Compiler

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
24•matt_d•3d ago•5 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
247•isitcontent•17h ago•27 comments

Cross-Region MSK Replication: K2K vs. MirrorMaker2

https://medium.com/lensesio/cross-region-msk-replication-a-comprehensive-performance-comparison-o...
6•andmarios•4d ago•1 comment

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
257•dmpetrov•17h ago•135 comments

Show HN: Kappal – CLI to Run Docker Compose YML on Kubernetes for Local Dev

https://github.com/sandys/kappal
6•sandGorgon•2d ago•2 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
350•vecti•19h ago•157 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
518•todsacerdoti•1d ago•252 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
400•ostacke•23h ago•104 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
52•helloplanets•4d ago•51 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
316•eljojo•20h ago•196 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
365•aktau•23h ago•189 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
445•lstoll•23h ago•293 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
99•quibono•4d ago•26 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
79•kmm•5d ago•12 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
285•i5heu•20h ago•238 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
48•gmays•12h ago•21 comments

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
26•bikenaga•3d ago•15 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
162•vmatsiiako•22h ago•73 comments

I now assume that all ads on Apple News are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1098•cdrnsf•1d ago•479 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
70•gfortaine•15h ago•29 comments

Arcee Trinity Mini: US-Trained Moe Model

https://www.arcee.ai/blog/the-trinity-manifesto?src=hn
70•hurrycane•2mo ago

Comments

bitwize•2mo ago
A moe model you say? How kawaii is it? uwu
noxa•2mo ago
I hate that I laughed at this. Thanks ;)
ghc•2mo ago
Capitalization makes a surprising amount of difference here...
donw•2mo ago
Meccha at present, but it may reach sugoi levels with fine-tuning.
htrp•2mo ago
Trinity Nano Preview: 6B parameter MoE (1B active, ~800M non-embedding), 56 layers, 128 experts with 8 active per token

Trinity Mini: 26B parameter MoE (3B active), fully post-trained reasoning model

They did the pretraining themselves and are still training the large version on 2048 B300 GPUs
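
A rough sketch of what those expert counts mean: in a token-choice MoE layer, each token is routed to only 8 of the 128 experts, so the "active" parameter count is a small fraction of the total. The dimensions below are hypothetical, not Arcee's published config, and only expert FFN weights are counted (no attention or embeddings):

  # Back-of-the-envelope MoE parameter accounting (illustrative dimensions only).
  def expert_ffn_params(d_model, d_ff):
      # one expert = a standard two-matrix FFN (up-projection + down-projection)
      return 2 * d_model * d_ff

  d_model, d_ff = 512, 768            # hypothetical hidden/FFN sizes
  layers, n_experts, n_active = 56, 128, 8

  total  = layers * n_experts * expert_ffn_params(d_model, d_ff)
  active = layers * n_active  * expert_ffn_params(d_model, d_ff)
  print(f"total expert params:  ~{total / 1e9:.1f}B")    # ~5.6B
  print(f"active expert params: ~{active / 1e9:.2f}B")   # ~0.35B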

halJordan•2mo ago
Looks like a less capable version of Qwen 30B-A3B, which makes sense because it is slightly smaller. If they can keep that efficiency going into the large one it'll be sick.

Trinity Large [will be] a 420B parameter model with 13B active parameters. Just perfect for a large RAM pool @ q4.
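
For scale, a minimal sketch of the memory math behind that claim, assuming q4 means roughly 4 bits per weight: all 420B weights must be resident, but only the ~13B routed parameters are read for any given token, which is what makes a large, relatively slow RAM pool workable:

  # Rough weight-memory estimate for the figures quoted above (sketch, not a benchmark).
  params_total  = 420e9     # Trinity Large, total parameters
  params_active = 13e9      # parameters active per token
  bytes_per_param_q4 = 0.5  # 4-bit quantization, ignoring scale/zero-point overhead

  gib = 2**30
  print(f"all weights at q4:         ~{params_total  * bytes_per_param_q4 / gib:.0f} GiB")  # ~196 GiB
  print(f"weights touched per token: ~{params_active * bytes_per_param_q4 / gib:.1f} GiB")  # ~6.1 GiB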

ksynwa•2mo ago
> Trinity Large is currently training on 2048 B300 GPUs and will arrive in January 2026.

How long does the training take?

arthurcolle•2mo ago
A couple of days or weeks, usually. No one is doing 9-month training runs.
davidsainez•2mo ago
Excited to put this through its paces. It seems most directly comparable to GPT-OSS-20B. Comparing their numbers on the Together API: Trinity Mini is slightly less expensive ($0.045/$0.15 vs. $0.05/$0.20) and seems to have better latency and throughput numbers.
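
A minimal sketch of that price comparison, assuming the quoted figures are USD per million input/output tokens; the workload below is made up for illustration:

  # Cost comparison using the prices quoted above (assumed USD per 1M tokens).
  def cost(tokens_in, tokens_out, price_in, price_out):
      return tokens_in / 1e6 * price_in + tokens_out / 1e6 * price_out

  workload = (10_000_000, 2_000_000)  # hypothetical monthly input/output token counts
  print(f"Trinity Mini: ${cost(*workload, 0.045, 0.15):.2f}")  # $0.75
  print(f"GPT-OSS-20B:  ${cost(*workload, 0.05, 0.20):.2f}")   # $0.90
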
trvz•2mo ago
Moe ≠ MoE
cachius•2mo ago
?
azinman2•2mo ago
The HN title uses incorrect capitalization.
rbanffy•2mo ago
I was eagerly waiting for the Larry and Curly models.
m4rtink•2mo ago
^_-
Balinares•2mo ago
Interesting. Always glad to see more open weight models.

I do appreciate that they openly acknowledge the areas where they followed DeepSeek's research. I wouldn't consider that a given for a US company.

Anyone tried these as a coding model yet?