frontpage.

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
510•klaussilveira•8h ago•141 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
848•xnx•14h ago•507 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
61•matheusalmeida•1d ago•12 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
168•isitcontent•9h ago•20 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
171•dmpetrov•9h ago•77 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
282•vecti•11h ago•127 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
64•quibono•4d ago•11 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
340•aktau•15h ago•165 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
228•eljojo•11h ago•142 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
332•ostacke•14h ago•90 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
425•todsacerdoti•16h ago•221 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
364•lstoll•15h ago•253 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
35•kmm•4d ago•2 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
11•romes•4d ago•1 comment

Show HN: ARM64 Android Dev Kit

https://github.com/denuoweb/ARM64-ADK
12•denuoweb•1d ago•1 comment

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
84•SerCe•4h ago•66 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
214•i5heu•11h ago•159 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
59•phreda4•8h ago•11 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
35•gfortaine•6h ago•9 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
16•gmays•4h ago•2 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
123•vmatsiiako•13h ago•51 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
160•limoce•3d ago•80 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
3•videotopia•3d ago•0 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
258•surprisetalk•3d ago•34 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1022•cdrnsf•18h ago•425 comments

FORTH? Really!?

https://rescrv.net/w/2026/02/06/associative
53•rescrv•16h ago•17 comments

Evaluating and mitigating the growing risk of LLM-discovered 0-days

https://red.anthropic.com/2026/zero-days/
44•lebovic•1d ago•13 comments

WebView performance significantly slower than PWA

https://issues.chromium.org/issues/40817676
14•denysonique•5h ago•1 comment

I'm going to cure my girlfriend's brain tumor

https://andrewjrod.substack.com/p/im-going-to-cure-my-girlfriends-brain
98•ray__•5h ago•48 comments

Show HN: Smooth CLI – Token-efficient browser for AI agents

https://docs.smooth.sh/cli/overview
81•antves•1d ago•59 comments

Pangu's Sorrow: The Sorrow and Darkness of Huawei's Noah Pangu LLM R&D Process

https://github.com/moonlightelite/True-Story-of-Pangu/blob/main/README.md
17•guardiangod•7mo ago

Comments

yms_hi•7mo ago
Calling a paper that had already been determined to be AI-generated an "incident"? That is a major point of suspicion in the entire text.
nirui•7mo ago
Is the article a translation from Chinese? You need some deep knowledge of Chinese net slang and Huawei slang to understand it correctly.

And all those unnecessary emotional expressions. All of it made the article hard to read.

Here are the takeaways I extracted:

1. The author claims to be "an employee of the Pangu Large Model Team and Huawei Noah's Ark Laboratory", a lower-ranking "small worker". The first 4 bullet points are supposed to prove that they have insider knowledge, which should authenticate the claims that follow. Why Huawei names its teams in this odd way is unexplained, but it does deserve some psychiatric analysis.

2. "At the beginning, our (Huawei, editor's note) computing power was very limited..." (detail followed), "...At the same time, other domestic companies such as Alibaba (which published Qwen, editor's note) and Zhipu were training on GPUs and had already figured out the right method. The gap between Pangu and its competitors was getting bigger and bigger"

3. "In this situation, Wang Yunhe ('the current director of Noah', editor's note) and his small model laboratory took action. They claimed that they inherited and transformed from the old 135B parameters, and through training a short few hundred B of data, the average improvement of various indicators was about ten points. In fact, this was their first masterpiece of applying the shell to the large model. Huawei's laymen led the experts, which made the leaders completely unaware of this nonsense. They only thought that there must be some algorithm innovation. After internal analysis, they actually used Qwen (which is published by Alibaba, editor's note) 1.5 110B for continued training.", "By adding layers, expanding the ffn dimension, and adding some mechanisms from the Pangu pi paper, they gathered about 135B parameters. In fact, the old 135B has 107 layers, while this model has only 82 layers, and the various configurations are also different. After training, the distribution of many parameters of the new 135B of unknown origin is almost exactly the same as that of Qwen 110B. Even the class name of the model code was Qwen at the time, and they were too lazy to even change the name. The subsequent model is the so-called 135B V2. This model was also provided to many downstreams at the time, even including external customers."

And that's about it.
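
The "parameter distribution" comparison mentioned in point 3 is, at least in principle, something anyone with both checkpoints could reproduce. Here's a minimal sketch of what such a check might look like (the model names are placeholders, and actually loading 100B+ models needs sharding and far more memory than shown; this illustrates the idea, not how the author did it):

    # Hypothetical sketch: compare per-tensor weight statistics of two checkpoints
    # to see whether one looks like a continued-training derivative of the other.
    import torch
    from transformers import AutoModelForCausalLM

    def weight_stats(model):
        # Return {parameter_name: (mean, std)} for every parameter tensor.
        stats = {}
        for name, p in model.named_parameters():
            t = p.detach().float()
            stats[name] = (t.mean().item(), t.std().item())
        return stats

    # Placeholder model IDs -- substitute the two checkpoints being compared.
    model_a = AutoModelForCausalLM.from_pretrained("org/model-a", torch_dtype=torch.bfloat16)
    model_b = AutoModelForCausalLM.from_pretrained("org/model-b", torch_dtype=torch.bfloat16)

    stats_a, stats_b = weight_stats(model_a), weight_stats(model_b)

    # Only tensors that exist under the same name in both models are comparable;
    # near-identical means/stds across most layers would be hard to explain by
    # coincidence for two independently trained models.
    for name in sorted(set(stats_a) & set(stats_b)):
        (ma, sa), (mb, sb) = stats_a[name], stats_b[name]
        print(f"{name}: mean {ma:+.4e} vs {mb:+.4e}, std {sa:.4e} vs {sb:.4e}")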

Also, yeah, the article was indeed a translation from Chinese. The [original post] was written in Chinese and then translated to English by github.com/moonlightelite. That's why it felt odd to read.

[original post]: https://web.archive.org/web/20250706034203/https://github.co...

After reading the article, I feel this is less whistleblowing and more an attack on Wang Yunhe. That's why there are so many emotional expressions: to (maybe) appeal to Huawei and/or this individual's future employer. But that's just my personal feeling/hunch.