frontpage.

Everything Is a Ralph Loop

https://ghuntley.com/loop/
1•ghuntley•1m ago•0 comments

Little red dots as young supermassive black holes in dense ionized cocoons [pdf]

https://www.nature.com/articles/s41586-025-09900-4
1•thunderbong•8m ago•0 comments

Politics and the English Language (1946) [pdf]

https://bioinfo.uib.es/~joemiro/RecEscr/PoliticsandEngLang.pdf
1•dvrp•8m ago•0 comments

U.S. freezes visas to 75 countries

https://www.kenklippenstein.com/p/trump-freezes-visas-to-75-countries
1•0x54MUR41•11m ago•1 comment

A Data Model for Git

https://jvns.ca/blog/2026/01/08/a-data-model-for-git/
1•vismit2000•12m ago•0 comments

Why is "Am I the asshole" always popular on Reddit?

1•jaskirat1216•12m ago•0 comments

The New Food-Stamp Rules Will Make Your Head Spin

https://www.theatlantic.com/health/2026/01/snap-soda-ban-food-stamps/685637/
1•JumpCrisscross•13m ago•0 comments

Dps

https://engineering.fb.com/2019/08/15/security/zoncolan/
1•JohnCorey•17m ago•1 comment

Show HN: AudiobookHub – Blinkist-style summaries and full classics

https://www.audiobookhub.net/
1•baoyashishui•18m ago•2 comments

Something Is Wrong with Russia's Children

https://www.theatlantic.com/international/2026/01/russia-children-violence-war/685635/
1•JumpCrisscross•23m ago•0 comments

China blocks Nvidia H200 AI chips that US Government cleared for export – report

https://www.theguardian.com/technology/2026/jan/17/china-blocks-nvidia-h200-ai-chips-that-us-gove...
2•sorokod•23m ago•0 comments

Fatberg the size of 4 buses likely birthed poo balls that closed Sydney beaches

https://www.theguardian.com/australia-news/2026/jan/17/fatberg-poo-balls-sydney-beaches-malabar-o...
2•ljf•30m ago•0 comments

YouTube relaxes monetization policy on videos with controversial content

https://apnews.com/article/youtube-monetization-update-policy-controversial-issues-545e27e27e26e0...
1•01-_-•32m ago•0 comments

FestiveEcho

https://github.com/StnkRB/Chrome-Extension-X-AutoComment
1•rahulbootstrap•34m ago•1 comment

ClickHouse valued at $15B as database analytics firm rides AI wave

https://www.reuters.com/technology/database-management-firm-clickhouse-valued-15-billion-amid-ai-...
1•shadow28•34m ago•0 comments

True story of the 1916 hanging of Murderous Mary, a circus elephant

https://www.themoonlitroad.com/murderous-mary-the-elephant/
2•joebig•35m ago•1 comment

Show HN: Rundown transforms docs into executable workflows

https://rundown.cool/
1•tobyhede•36m ago•0 comments

Tyler Cowen's AI Campus

https://arnoldkling.substack.com/p/tyler-cowens-ai-campus
1•samuel246•43m ago•0 comments

Show HN: Local AI that knows when you're burning out

https://www.humonos.com/beta
2•jaskirat1216•48m ago•0 comments

Why Systems Fail Under Load

https://www.youtube.com/watch?v=oO6pBX8_g6o
1•paperplaneflyr•54m ago•0 comments

GPT-5.2 does not follow instructions and ignores my prompts

https://old.reddit.com/r/OpenAI/comments/1mwyz6m/gpt_5_pro_no_following_instructions_and_ignoring/
1•behnamoh•56m ago•2 comments

Ask HN: How are you preventing LLMs from hallucinating in real workflows?

1•Agent_Builder•57m ago•0 comments

Built an app that aggregates Prediction Markets with AI Context

https://saipintel.ai:443/
1•everythingalt•59m ago•1 comment

Vibe coding is a blight on open-source

https://old.reddit.com/r/webdev/comments/1qcxres/vibe_coding_is_a_blight_on_opensource/
5•doppp•1h ago•0 comments

Learning better decision trees – LLMs as Heuristics for Program Synthesis

https://mchav.github.io/learning-better-decision-tree-splits/
1•mchav•1h ago•0 comments

Divorce app to save you lawyers' fees

https://replantlife.com/
1•cheroll•1h ago•1 comment

Meditation and Unconscious: A Buddhist Monk and a Neuroscientist (2022)

https://thereader.mitpress.mit.edu/meditation-and-the-unconscious-buddhism-neuroscience-conversat...
2•arunc•1h ago•0 comments

Do not give up your brain

https://cassidoo.co/post/good-brain/
2•gpi•1h ago•0 comments

Reality Is Breaking the "AI Revolution"

https://www.planetearthandbeyond.co/p/reality-is-breaking-the-ai-revolution
6•handfuloflight•1h ago•0 comments

Teen Jailed After Exploiting Refund Policy for $570k

https://www.vice.com/en/article/teen-jailed-after-exploiting-refund-policy-for-570000/
3•lnguyen•1h ago•1 comment

I trained a 90-day weather AI on a single GPU using 150 years of data

https://github.com/consigcody94/lilith
1•sentinelowl•1h ago

Comments

sentinelowl•1h ago
Hey HN,

I built LILITH, an open source ML weather prediction system that runs on consumer hardware. The model trains in 15 minutes on an RTX 3060, the checkpoint is 22MB, and inference takes under a second.

THE PROBLEM

GraphCast, Pangu-Weather, and similar models are impressive, but they require:

- ERA5 reanalysis data (controlled by ECMWF)
- 80GB+ VRAM for inference
- Institutional-scale compute

Meanwhile, NOAA’s GHCN dataset covers 100K+ weather stations with 150+ years of data, all completely public domain.

THE APPROACH

Instead of requiring gridded reanalysis, LILITH learns directly from sparse station observations:

- Transformer encoder on 30 days of historical data
- Autoregressive decoder for multi-day prediction
- Multi-timescale rollout: 6h steps for days 1-14, daily for days 15-42, weekly for days 43-90
- Climate signal injection (ENSO, MJO) for extended range

Total parameters: 1.87M. You could email the checkpoint.
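The multi-timescale rollout above can be sketched as a simple step schedule. The exact interval boundaries and anchoring the weekly steps at day 90 are my assumptions, since the post only names the three ranges:

```python
def rollout_schedule():
    """Forecast-step offsets in hours for a 90-day rollout:
    6 h steps through day 14, daily through day 42, weekly out to day 90.
    Boundary placement is an assumption -- the post gives only the ranges."""
    six_hourly = list(range(6, 14 * 24 + 1, 6))            # days 1-14
    daily = list(range(15 * 24, 42 * 24 + 1, 24))          # days 15-42
    weekly = sorted(range(90 * 24, 42 * 24, -7 * 24))      # days 43-90, anchored at day 90
    return six_hourly + daily + weekly

steps = rollout_schedule()
```

Under this sketch, each offset corresponds to one autoregressive decoder step, so reaching the full 90-day horizon costs 91 steps instead of the 360 a uniform 6-hour grid would need.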

RESULTS

Trained on 915K sequences from 300 US stations:

- Temperature RMSE: 3.96C
- Temperature MAE: 3.01C
- Climatology baseline: ~7C RMSE

For context, this beats just predicting historical averages, though it is not GraphCast-accurate for short range. The value is accessibility, not beating ECMWF.
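As a quick check on those numbers, the improvement over climatology can be expressed as a skill score (a minimal sketch using the usual 1 - RMSE_model / RMSE_baseline convention; the helper names are mine, not from the repo):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def skill_score(model_rmse, baseline_rmse):
    """Fractional error reduction over a reference forecast:
    1.0 is perfect, 0.0 is no better than the baseline."""
    return 1.0 - model_rmse / baseline_rmse

# Numbers from the post: 3.96 C model RMSE vs ~7 C climatology RMSE,
# i.e. roughly a 43% error reduction over predicting historical averages.
print(round(skill_score(3.96, 7.0), 2))
```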

HONEST LIMITATIONS

- Days 1-7 are worse than operational models
- 90-day “forecasts” are really climate outlooks, not weather predictions
- Currently US stations only
- No ensemble/uncertainty quantification yet

TECH STACK

- PyTorch 2.x with Flash Attention
- FastAPI backend
- Next.js 14 frontend with glassmorphism UI
- Trains on 8GB VRAM with mixed precision

The frontend has interactive 90-day charts, a station command center showing all 300 stations with predicted vs. actual temps, and historical data exploration.

WHY IT MATTERS

Weather prediction has been an institutional monopoly. The data is public, consumer GPUs are powerful enough, and transformer architectures are well understood. There is no reason useful forecasting should be locked behind institutional walls.

Would love feedback on the station-native approach vs requiring ERA5, and whether the multi-timescale rollout makes sense for extended range.