frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


France's homegrown open source online office suite

https://github.com/suitenumerique
469•nar001•4h ago•222 comments

British drivers over 70 to face eye tests every three years

https://www.bbc.com/news/articles/c205nxy0p31o
154•bookofjoe•2h ago•135 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
447•theblazehen•2d ago•160 comments

Leisure Suit Larry's Al Lowe on model trains, funny deaths and Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
32•thelok•2h ago•2 comments

Software Factories and the Agentic Moment

https://factory.strongdm.ai/
33•mellosouls•2h ago•27 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
93•AlexeyBrin•5h ago•17 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
780•klaussilveira•20h ago•241 comments

First Proof

https://arxiv.org/abs/2602.05192
42•samasblack•2h ago•28 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
36•vinhnx•3h ago•4 comments

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
59•onurkanbkrc•5h ago•3 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
1034•xnx•1d ago•583 comments

StrongDM's AI team build serious software without even looking at the code

https://simonwillison.net/2026/Feb/7/software-factory/
24•simonw•2h ago•23 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
180•alainrk•4h ago•255 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
27•rbanffy•4d ago•5 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
171•jesperordrup•10h ago•65 comments

Vinklu Turns Forgotten Plot in Bucharest into Tiny Coffee Shop

https://design-milk.com/vinklu-turns-forgotten-plot-in-bucharest-into-tiny-coffee-shop/
9•surprisetalk•5d ago•0 comments

72M Points of Interest

https://tech.marksblogg.com/overture-places-pois.html
16•marklit•5d ago•0 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
107•videotopia•4d ago•27 comments

What Is Stoicism?

https://stoacentral.com/guides/what-is-stoicism
7•0xmattf•1h ago•1 comment

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
265•isitcontent•20h ago•33 comments

Making geo joins faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
152•matheusalmeida•2d ago•43 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
278•dmpetrov•20h ago•148 comments

Ga68, a GNU Algol 68 Compiler

https://fosdem.org/2026/schedule/event/PEXRTN-ga68-intro/
36•matt_d•4d ago•11 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
546•todsacerdoti•1d ago•264 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
421•ostacke•1d ago•110 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
365•vecti•22h ago•166 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
65•helloplanets•4d ago•69 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
338•eljojo•23h ago•209 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
460•lstoll•1d ago•303 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
373•aktau•1d ago•194 comments

Pangu's Sorrow: The Sorrow and Darkness of Huawei's Noah Pangu LLM R&D Process

https://github.com/moonlightelite/True-Story-of-Pangu/blob/main/README.md
17•guardiangod•7mo ago

Comments

yms_hi•7mo ago
Calling a paper that was already determined to be AI-generated an "incident"? That alone is a major point of suspicion about the entire text.
nirui•7mo ago
Is the article a translation from Chinese? You need some deep knowledge of Chinese net slang and Huawei slang to understand it correctly.

And all the unnecessary emotional expression makes the article hard to read.

Here are the takeaways I extracted:

1. The author claims to be "an employee of the Pangu Large Model Team and Huawei Noah's Ark Laboratory", a lower-ranking "small worker". The first four bullet points are supposed to prove that they have insider knowledge, which should lend credibility to the claims that follow. Why Huawei named its teams in this odd way goes unexplained, but it does invite some psychiatric analysis.

2. "At the beginning, our (Huawei, editor's note) computing power was very limited..." (detail followed), "...At the same time, other domestic companies such as Alibaba (which published Qwen, editor's note) and Zhipu were training on GPUs and had already figured out the right method. The gap between Pangu and its competitors was getting bigger and bigger"

3. "In this situation, Wang Yunhe ('the current director of Noah', editor's note) and his small model laboratory took action. They claimed that they inherited and transformed from the old 135B parameters, and through training a short few hundred B of data, the average improvement of various indicators was about ten points. In fact, this was their first masterpiece of applying the shell to the large model. Huawei's laymen led the experts, which made the leaders completely unaware of this nonsense. They only thought that there must be some algorithm innovation. After internal analysis, they actually used Qwen (which is published by Alibaba, editor's note) 1.5 110B for continued training.", "By adding layers, expanding the ffn dimension, and adding some mechanisms from the Pangu pi paper, they gathered about 135B parameters. In fact, the old 135B has 107 layers, while this model has only 82 layers, and the various configurations are also different. After training, the distribution of many parameters of the new 135B of unknown origin is almost exactly the same as that of Qwen 110B. Even the class name of the model code was Qwen at the time, and they were too lazy to even change the name. The subsequent model is the so-called 135B V2. This model was also provided to many downstreams at the time, even including external customers."

And that's about it.
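The provenance claim in takeaway 3 rests on comparing per-tensor weight statistics between the "new" 135B model and Qwen 110B. The article does not say how that internal analysis was done; below is a toy sketch of the general idea, using synthetic weights and made-up tensor names, where a model derived by lightly perturbing another's weights has near-identical per-tensor statistics while an independently initialized model of the same shape does not.

```python
import numpy as np

def tensor_stats(weights):
    # Summary statistics per named tensor: (mean, standard deviation).
    return {name: (float(w.mean()), float(w.std())) for name, w in weights.items()}

def distribution_similarity(a, b):
    # Fraction of shared tensor names whose mean/std pairs nearly coincide.
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    close = sum(
        1 for n in shared
        if np.isclose(a[n][0], b[n][0], rtol=0, atol=1e-5)
        and np.isclose(a[n][1], b[n][1], rtol=0, atol=1e-5)
    )
    return close / len(shared)

rng = np.random.default_rng(0)
# Hypothetical base model: a few FFN weight matrices with Gaussian init.
base = {f"layers.{i}.ffn.weight": rng.normal(0, 0.02, size=(64, 64)) for i in range(4)}
# A "continued-pretraining" derivative: a tiny perturbation of the base weights.
derived = {n: w + rng.normal(0, 1e-5, size=w.shape) for n, w in base.items()}
# An independently initialized model with the same architecture.
independent = {n: rng.normal(0, 0.02, size=w.shape) for n, w in base.items()}

sim_derived = distribution_similarity(tensor_stats(base), tensor_stats(derived))
sim_independent = distribution_similarity(tensor_stats(base), tensor_stats(independent))
print(sim_derived, sim_independent)  # derived scores much higher than independent
```

This is only an illustration of why matching parameter distributions are treated as a fingerprint: random initializations of the same architecture almost never land on the same per-tensor statistics, whereas continued training from another model's checkpoint barely moves them.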

Also, yeah, the article is indeed a translation from Chinese. The [original post] was written in Chinese and then translated into English by github.com/moonlightelite. That's why it feels odd to read.

[original post]: https://web.archive.org/web/20250706034203/https://github.co...

After reading the article, I feel this is less whistleblowing and more a personal attack on Wang Yunhe. That would explain all the emotional expressions: (maybe) to appeal to Huawei and/or this individual's future employer. But that's just my personal feeling/hunch.