
Mistral OCR 3

https://mistral.ai/news/mistral-ocr-3
262•pember•1d ago•32 comments

CSS Grid Lanes

https://webkit.org/blog/17660/introducing-css-grid-lanes/
20•frizlab•18m ago•1 comment

Garage – An S3 object store so reliable you can run it outside datacenters

https://garagehq.deuxfleurs.fr/
373•ibobev•6h ago•82 comments

TP-Link Tapo C200: Hardcoded Keys, Buffer Overflows and Privacy

https://www.evilsocket.net/2025/12/18/TP-Link-Tapo-C200-Hardcoded-Keys-Buffer-Overflows-and-Priva...
169•sibellavia•4h ago•44 comments

8-bit Boléro

https://linusakesson.net/music/bolero/index.php
101•Aissen•10h ago•15 comments

Graphite is joining Cursor

https://cursor.com/blog/graphite
136•fosterfriends•6h ago•154 comments

A Better Zip Bomb

https://www.bamsoftware.com/hacks/zipbomb/
12•kekqqq•57m ago•0 comments

GotaTun -- Mullvad's WireGuard Implementation in Rust

https://mullvad.net/en/blog/announcing-gotatun-the-future-of-wireguard-at-mullvad-vpn
502•km•11h ago•104 comments

Amazon will allow ePub and PDF downloads for DRM-free eBooks

https://www.kdpcommunity.com/s/article/New-eBook-Download-Options-for-Readers-Coming-in-2026?lang...
480•captn3m0•12h ago•260 comments

Qwen-Image-Layered: transparency and layer aware open diffusion model

https://huggingface.co/papers/2512.15603
19•dvrp•19h ago•2 comments

Performance Hints (2023)

https://abseil.io/fast/hints.html
22•danlark1•5h ago•18 comments

Show HN: TinyPDF – 3kb pdf library (70x smaller than jsPDF)

https://github.com/Lulzx/tinypdf
72•lulzx•1d ago•9 comments

Show HN: Stickerbox, a kid-safe, AI-powered voice to sticker printer

https://stickerbox.com/
31•spydertennis•2h ago•33 comments

Rust's Block Pattern

https://notgull.net/block-pattern/
75•zdw•17h ago•23 comments

Monumental snake engravings of the Orinoco River (2024)

https://www.cambridge.org/core/journals/antiquity/article/monumental-snake-engravings-of-the-orin...
6•bryanrasmussen•1w ago•0 comments

NOAA deploys new generation of AI-driven global weather models

https://www.noaa.gov/news-release/noaa-deploys-new-generation-of-ai-driven-global-weather-models
40•hnburnsy•1d ago•23 comments

Believe the Checkbook

https://robertgreiner.com/believe-the-checkbook/
100•rg81•6h ago•39 comments

Buteyko Method

https://en.wikipedia.org/wiki/Buteyko_method
10•rzk•39m ago•3 comments

Reverse Engineering US Airline's PNR System and Accessing All Reservations

https://alexschapiro.com/security/vulnerability/2025/11/20/avelo-airline-reservation-api-vulnerab...
75•bearsyankees•4h ago•32 comments

The FreeBSD Foundation's Laptop Support and Usability Project

https://github.com/FreeBSDFoundation/proj-laptop
117•mikece•7h ago•41 comments

Ask HN: How are you LLM-coding in an established code base?

42•adam_gyroscope•3d ago•30 comments

Response Healing: Reduce JSON defects by 80%+

https://openrouter.ai/announcements/response-healing-reduce-json-defects-by-80percent
21•numlocked•1d ago•14 comments

Lite^3, a JSON-compatible zero-copy serialization format

https://github.com/fastserial/lite3
107•cryptonector•6d ago•29 comments

The pitfalls of partitioning Postgres yourself

https://hatchet.run/blog/postgres-partitioning
22•abelanger•3d ago•3 comments

Wall Street Ruined the Roomba and Then Blamed Lina Khan

https://www.thebignewsletter.com/p/how-wall-street-ruined-the-roomba
145•connor11528•3h ago•94 comments

Show HN: I Made Loom for Mobile

https://demoscope.app
49•admtal•5h ago•32 comments

You can now play Grand Theft Auto Vice City in the browser

https://dos.zone/grand-theft-auto-vice-city/
223•Alifatisk•3h ago•63 comments

Postfix Macros and Let Place

https://nadrieril.github.io/blog/2025/12/09/postfix-macros-and-let-place.html
8•todsacerdoti•5d ago•3 comments

Prompt caching for cheaper LLM tokens

https://ngrok.com/blog/prompt-caching/
256•samwho•3d ago•64 comments

History LLMs: Models trained exclusively on pre-1913 texts

https://github.com/DGoettlich/history-llms
726•iamwil•23h ago•358 comments

NOAA deploys new generation of AI-driven global weather models

https://www.noaa.gov/news-release/noaa-deploys-new-generation-of-ai-driven-global-weather-models
40•hnburnsy•1d ago

Comments

margalabargala•1h ago
I am dearly hoping that they are using the current "AI" craze to talk up the machine learning methods they have presumably been using for a decade at this point, and not that they have actually integrated an LLM into a weather model.
username223•1h ago
Same. I hope this was written by hardened greybeards who have dedicated their lives to weather prediction and atmospheric modeling, and have "weathered" a few funding cycles.
akdev1l•1h ago
inb4 it’s actually an intern maintaining a 3000+ line markdown file
RHSeeger•34m ago
I can see it now

    The following snippet highlights the algorithm used to determine <thing>
    ```fortran
    .....
    ```
idontwantthis•1h ago
Hopefully they weren’t all forced out this year. The NOAA had massive cuts.
trueismywork•22m ago
NCAR is being dismantled as we speak.
sigmar•31m ago
GraphCast (the model this is based on) has been validated in weather forecasting for a while[1]. It uses transformers, much like LLMs. Transformers are really impressive at modeling a variety of things and have become very common throughout a lot of ML models; there's no reason to besmirch these methods as "integrating an LLM into a weather model".

[1] https://github.com/google-deepmind/graphcast

lynndotpy•19m ago
A lot of shiny new "AI" features being shipped are language models placed where they don't belong. It's reasonable to be skeptical here, not just because of the "AI" label, but also because of the troubled history of neural-network-based ML methods for weather prediction.

Even before LLMs got big, a lot of the machine learning research being published consisted of models that underperformed SOTA (which was the case for weather modeling for a long time!) or models far, far larger than they need to be (e.g. this [1] Nature paper using 'deep learning' for aftershock prediction being bested by this [2] Nature paper using one neuron).

[1] https://www.nature.com/articles/s41586-018-0438-y

[2] https://www.nature.com/articles/s41586-019-1582-8
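For a sense of scale, "one neuron" in [2] means essentially a logistic regression: one weight, one bias, one sigmoid, applied to a single physical feature. A minimal sketch with synthetic data (the feature and labels here are made up, not the paper's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data: one physical feature per location
# (e.g. a stress-change metric) and a binary aftershock label.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 1))
y = (x[:, 0] + 0.5 * rng.normal(size=1000) > 0).astype(int)

# "One neuron": a single weight + bias followed by a sigmoid.
clf = LogisticRegression().fit(x, y)
print(clf.coef_, clf.intercept_, clf.score(x, y))
```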

Legend2440•27m ago
It’s not an LLM, but it is genAI. It’s based on the same idea of predict-the-next-thing, but instead of predicting words it predicts the next state of the atmosphere from the current state.
adamweld•6m ago
It is in fact one of the least generalized forms of "AI" out there: a model focused solely on predicting weather.
astrange•56s ago
"gen" stands for "generative". If you read the GenCast paper they call it a generative AI.

(It's an autoregressive GNN plus a diffusion model.)
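A minimal sketch of the "predict the next state" idea from the two comments above, with toy stand-in functions for the real networks (GraphCast-style autoregressive stepping, GenCast-style diffusion sampling for ensembles; nothing here is the actual model code):

```python
import numpy as np

def step_model(state):
    # Toy stand-in for the trained GNN: damp anomalies toward climatology.
    return 0.9 * state

def denoise(x, conditioning, t):
    # Toy stand-in for one reverse-diffusion step, conditioned on the analysis.
    return t * x + (1.0 - t) * conditioning

def autoregressive_forecast(initial_state, n_steps=40):
    """Feed the model its own output, like next-token prediction,
    except each "token" is the full atmospheric state 6 hours ahead."""
    states = [initial_state]
    for _ in range(n_steps):              # 40 x 6 h = a 10-day forecast
        states.append(step_model(states[-1]))
    return states

def diffusion_ensemble(initial_state, n_members=50, n_noise_steps=20):
    """Sample ensemble members by denoising from random noise,
    conditioned on the current analysis (the GenCast-style approach)."""
    members = []
    for _ in range(n_members):
        x = np.random.randn(*initial_state.shape)        # start from noise
        for t in np.linspace(1.0, 0.0, n_noise_steps):   # reverse diffusion
            x = denoise(x, initial_state, t)
        members.append(x)
    return members

state0 = np.zeros((721, 1440))    # e.g. one field on a 0.25-degree global grid
forecast = autoregressive_forecast(state0)
```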

lukeschlather•27m ago
The GraphCast paper says "GraphCast is implemented using GNNs" without explaining that the acronym stands for Graph Neural Networks. It contrasts GNNs with the "convolutional neural network (CNN)" and the "graph attention network" (GAN?). It doesn't really explain the difference between a GAN and a GNN. I think LLMs are GANs. So no, it's not an LLM in a weather model, but it's very similar to an LLM in terms of how it is trained.
Workaccount2•1h ago
Interestingly, while this model is based on a Google Deepmind AI weather model, it's based on a model from 2023 (GraphCast) rather than the WeatherNext 2 model which has grabbed headlines as of late. I'd imagine it takes a while to integrate and test everything, explaining the gap.
sigmar•1h ago
I've been assuming that, unlike graphcast, they have no intention to make weathernext 2 open source.
username223•1h ago
Whatever it is, it seems like it might be roughly competitive with ECMWF, the state of the art when it comes to global weather models: https://www.epic.noaa.gov/ai/eagle-verification/

A quick search didn't turn up anything about the model's skill or resolution, though I'm sure the data exists.

ryuuchin•31m ago
They run at 0.25 degree resolution (same as ECMWF AIFS models).
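Back-of-the-envelope for that resolution, assuming a plain regular lat/lon grid rather than whatever mesh the model uses internally:

```python
# A regular 0.25-degree global lat/lon grid.
lat_points = int(180 / 0.25) + 1   # 721 latitudes, poles included
lon_points = int(360 / 0.25)       # 1440 longitudes
print(lat_points * lon_points)     # 1,038,240 grid points per vertical level
```
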
apawloski•40m ago
I've seen the Microsoft Aurora team make a compelling argument that weather is an interesting contradiction of the AI-energy-waste narrative. Once deployed at scale, inference with these models is actually a sizable energy/compute improvement over classical simulation and forecasting methods. Of course it is energy intensive to train the model, but the usage itself is more energy efficient.
ryuuchin•32m ago
These are available on Weatherbell[1] (which requires a subscription) now except for the HGEFS ensemble model which I'm guessing will probably be added later. AIGFS is on tropical tidbits which should be free for some stuff[5]. I believe some of the research on this is mentioned in these two[2][3] videos from NOAA weather partners site. They also talk about some of the other advances in weather model research.

One of the big benefits of both the single-run and ensemble AIGFS models is their speed and reduced computation cost. Weather modeling is hard, and these models should be used as complementary to deterministic models, since they all have their own strengths and weaknesses. They run at the same 0.25 degree resolution as the ECMWF AIFS models, which were introduced earlier this year and have been successful[4].

[1] https://www.weatherbell.com/

[2] https://www.youtube.com/watch?v=47HDk2BQMjU

[3] https://www.youtube.com/watch?v=DCQBgU0pPME

[4] https://www.ecmwf.int/en/forecasts/dataset/aifs-machine-lear...

[5] https://www.tropicaltidbits.com/analysis/models/

padjo•32m ago
What does AI refer to here? Presumably weather models have been using all sorts of advanced machine learning for decades now, so what’s AI about this that wasn’t AI previously?
tomww•12m ago
They're using a graph neural network. From the article - "The team leveraged Google DeepMind's GraphCast model as an initial foundation and fine-tuned the model using NOAA's own Global Data Assimilation System analyses".

> so what’s AI about this that wasn’t AI previously?

The weather models used today are physics-based numerical models. The machine learning models from DeepMind, ECMWF, Huawei and others are a big shift from the standard numerical approach used over the past few decades.
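A rough sketch of the fine-tuning pattern the article describes: take a pretrained forecast network and keep training it on the agency's own analysis pairs. Everything below is a stand-in (a tiny PyTorch model and random tensors instead of GraphCast weights and GDAS data); the real training code isn't public in this form.

```python
import torch
from torch import nn

# Stand-ins: a tiny network instead of pretrained GraphCast weights,
# and random tensors instead of (analysis at t, analysis at t+6h) pairs.
model = nn.Sequential(nn.Linear(64, 256), nn.GELU(), nn.Linear(256, 64))
analysis_pairs = [(torch.randn(8, 64), torch.randn(8, 64)) for _ in range(10)]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)  # small LR: fine-tune, don't retrain
loss_fn = nn.MSELoss()

model.train()
for state_t, state_t6 in analysis_pairs:
    optimizer.zero_grad()
    pred = model(state_t)            # predicted state six hours ahead
    loss = loss_fn(pred, state_t6)   # fit to the agency's own analyses
    loss.backward()
    optimizer.step()
```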

CalChris•24m ago
Neil Jacobs, Ph.D

This makes me skeptical that it isn’t just politicized Trumpian nonsense.

luc_•16m ago
I wonder if the new models consider land use change and emissions from aggressive datacenter development and model training...