
The 512KB Club

https://512kb.club/
84•lr0•3h ago

Comments

namegulf•3h ago
Not sure how a site can fit in that club?

For example, if someone uses Google Analytics, that alone comes to 430KB (which most people do)

znpy•3h ago
> Not sure how a site can fit in that club?

That's the challenge

namegulf•2h ago
true
undeveloper•2h ago
Then don't use google analytics.
namegulf•2h ago
One option is to use access logs on the server and process stats
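A minimal sketch of that access-log approach, assuming an nginx/Apache "combined" log format (the regex, file contents, and sample lines are illustrative, not from any particular server):

```python
# Count page views per path from a "combined"-format access log,
# as a server-side alternative to client-side analytics.
import re
from collections import Counter

LOG_LINE = re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" (\d{3})')

def page_views(lines):
    """Return a Counter of request paths that got a 2xx response."""
    views = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group(2).startswith("2"):
            views[m.group(1)] += 1
    return views

sample = [
    '1.2.3.4 - - [05/Nov/2025:10:00:00 +0000] "GET /blog/post HTTP/1.1" 200 5120 "-" "Mozilla"',
    '5.6.7.8 - - [05/Nov/2025:10:00:01 +0000] "GET /missing HTTP/1.1" 404 160 "-" "Mozilla"',
]
print(page_views(sample).most_common())
```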
skydhash•2h ago
It’s very easy. 512kb can fit a whole novel in epub format. And HTML is a very verbose language.
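Quick arithmetic backs up the novel claim even for uncompressed HTML (the markup-overhead figure is just an assumption):

```python
# How much prose fits in 512 KB of uncompressed HTML?
budget = 512 * 1024        # bytes
avg_word = 6               # ~5 ASCII characters plus a space
markup_overhead = 0.30     # assume ~30% of bytes go to tags
words = int(budget * (1 - markup_overhead) / avg_word)
print(words)               # roughly a short novel's worth of words
```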
namegulf•2h ago
Just sticking with html, it's easy peasy
01HNNWZ0MV43FF•2h ago
I've never used it. Some browsers even honor that stupid beacon header now too
inetknght•2h ago
> For example, if someone uses Google Analytics, that alone comes to 430KB (which most people do)

Perhaps someone might not use Google Analytics. Perhaps someone might apply 430kb to actual content instead.

xigoi•2h ago
Back in the early internet, nobody had enough bandwidth to transmit 512 kB in a reasonable time, so clearly it has to be possible.
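The claim is easy to quantify with nominal link rates (ignoring protocol overhead and compression):

```python
# Time to transfer 512 KB over typical early-internet links.
size_bits = 512 * 1024 * 8
for name, bps in [("28.8k modem", 28_800), ("56k modem", 56_000), ("ISDN 128k", 128_000)]:
    print(f"{name}: {size_bits / bps:.0f} s")
```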
Levitz•1h ago
Well yes, we had low resolution images, no consideration for different viewports, no non-default fonts and little interactivity beyond links or other queries to the server.
busymom0•1h ago
> no non-default fonts

That's a win!!!

bilekas•3h ago
> Why does any site need to be that huge? It’s crazy.

It's advertising and data tracking.. Every. Single. Time.

PiHole/Adblocker have become essential for traversing the cesspool that is the modern internet.

skydhash•2h ago
While a lot of sites break when you disable JavaScript, browsing is very fluid when you do. I’m also using the HTML version of DDG. There’s only a handful of websites (and the majority are apps) that I’ve enabled JS for.

And one of these days, I will write a viewer for GitHub links that will clone the repo and let me quickly browse it. For something aimed at devs, the platform is horrendous.

tredre3•42m ago
> It's advertising and data tracking.. Every. Single. Time.

Use bootstrap and one image larger than 16x16 and you're near 500KB already.

It's easy to blame the boogeyman but sometimes it's worth looking in the mirror too...

timenotwasted•2h ago
It's a fun way to push for a lighter web, but without a way to distinguish the complexity of the sites on the list it's really not all that useful. The FAQ sort of addresses this: "The whole point of the 512KB Club is to showcase what can be done with 512KB of space. Anyone can come along and make a <10KB site containing 3 lines of CSS and a handful of links to other pages". But without a way for the user to gauge site complexity at a glance, I'm not sure I understand the point. FAQ aside, the first few sites I clicked on were indeed quite light in size, but they also had nothing more than some text and background colors. And any search site is going to land near the top of the list, e.g. https://steamosaic.com/

Complexity would be a subjective metric but without it I'm not sure what you take from this other than a fun little experiment, which is maybe all it's meant to be.

unglaublich•2h ago
So they should invert it like the demoscene.

Set the limit first, and then request folks to join the contest:

What crazy website can _you_ build in 512KB?

timenotwasted•2h ago
"What crazy website can _you_ build in 512KB?" Exactly! That would be super fun and interesting.
adamzwasserman•1h ago
Tag-based organization system with drag-drop, real-time filtering, row/column layouts. ~55KB gzipped.

Built it frustrated with Trello's limitations. The 512KB constraint forced good architecture: server-side rendering, progressive enhancement, shared indexes instead of per-item duplication. Perfect Lighthouse score so far - the real test is keeping it through release.

Extracting patterns into genX framework (genx.software) for release later this month.

unglaublich•2h ago
Building a sub 512KB website is trivial.

Just don't use external trackers, ads, fonts, videos.

Building a sub 512KB website that satisfies all departments of a company of non-trivial size; that is hard.

bilekas•2h ago
> Building a sub 512KB website is trivial.

Even for larger sites, it can be trivial, but I prefer to look at it from a non SPA/state-mgmt point of view.

Not every site needs to be an SPA. Or even a 'react app'. I visit a page, record your metrics on the backend for all I care, you have the request headers etc, just send me the data I need, nothing else.

It doesn't have to be ugly or even lacking some flair; 500KB is a lot of text. Per page request, with out-of-the-box browser caching, there's no excuse. People have forgotten that's all you need.

> People have forgotten that's all you need.

Edit: No they haven't, they just can't monetize optimizations.

benchly•2h ago
I thought that too and assumed my blog fit the criteria. I was wrong; it weighed in at just over 100KB too heavy to get in the club.

My guess is the photos.

bjord•1h ago
are you using webp photos?
daemonologist•1h ago
webp is not much of an upgrade in my experience - jpegli pretty much matches it in quality/size while having better compatibility, and if you don't have the original photo and are working with old crusty jpegs it's often best to just leave them alone rather than re-encoding. jpeg-xl does make a noticeable difference, but it's not widely supported.
unglaublich•1h ago
Exactly, so it's not so much a demonstration of how nicely a website fits in 512K as it is about _just not using any media_. Not very interesting imho.
snovv_crash•1h ago
Literally yesterday there were people defending 15MB websites:

https://news.ycombinator.com/item?id=45798681

zahlman•1h ago
> Building a sub 512KB website that satisfies all departments of a company of non-trivial size; that is hard.

And yet tons of personal blogs likely weigh in well over that mark, despite having no requirements beyond personally imposed ideas about how to share information with the world.

> Just don't use external trackers, ads, fonts, videos.

The Internet is likely full of "hero" images that weigh more than 512KB by themselves. For that matter, `bootstrap.min.css` + `bootstrap.min.js` is over half of that budget already.

Not that people need those things, either. But many have forgotten how to do without. (Or maybe bilekas is right; but I like the idea of making things small because of my aesthetic sense. I don't need a financial incentive for that. One of these days I should really figure out what I actually need for my own blog....)
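A rough sketch of how one might audit a page against that uncompressed budget locally (the helper names and asset list are made up, and this ignores externally hosted resources):

```python
# Sum the on-disk (uncompressed) sizes of a page's resources and
# compare against the club's 512 KB limit.
import os

LIMIT = 512 * 1024  # bytes, uncompressed per the club's rule

def total_size(paths):
    """Total size in bytes of the given files."""
    return sum(os.path.getsize(p) for p in paths)

def within_budget(paths, limit=LIMIT):
    """True if the combined uncompressed size fits the budget."""
    return total_size(paths) <= limit
```

Usage would be something like `within_budget(["index.html", "style.css", "script.js"])` against a local build of the page.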

Amorymeltzer•2h ago
Most thorough discussion here from a few years ago: <https://news.ycombinator.com/item?id=30125633>

Seems like lichess dropped off

opengrass•2h ago
14KB Club > 512KB Club
lloydatkinson•2h ago
There’s basically nothing on that list lol. Is there a list of all these N-KB Club sites?
scatbot•2h ago
I get the appeal of the 512KB Club. Most modern websites are bloated, slow and a privacy nightmare. I even get the nerdy thrill of fitting an entire website into a single IP packet, but honestly, this obsession with raw file size is kinda boring. It just encourages people to micromanage whitespace, skip images, or cut features like accessibility or responsive layouts.

A truly "suckless" website isn't about size. It's one that uses non-intrusive JS, embraces progressive enhancement, prioritizes accessibility, respects visitors' privacy, and looks clean and functional on any device or output medium. If it ends up small: great! But that shouldn't be the point.

wredcoll•2h ago
A rather perfect example of "correlation is not causation". But being "suckless" is a lot harder to measure than just running `length(string)`.
OptionX•28m ago
Or to be even more suckless, it would require the user to patch in whatever feature they need.
adamzwasserman•2h ago
The 512KB limit isn't just minimalism - it forces architectural discipline.

I built a Trello alternative (frustrated with limitations: wanted rows and decent performance). Came in at ~55KB gzipped by following patterns, some of which I'm open sourcing as genX (genx.software - releasing this month):

- Server renders complete HTML (not JSON that needs client-side parsing)
- JavaScript progressively enhances (doesn't recreate what's in the DOM)
- Shared data structures (one index for all items, not one per item)
- Use native browser features (the DOM is already a data structure - article coming)

Most sites ship megabytes because modern tooling treats size as a rounding error. The 512KB constraint makes you think about what's expensive and get creative. Got rewarded with a perfect Lighthouse score in dev - striving to maintain it through release.

Would love feedback from this community when it's out.
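The first two patterns above can be sketched in a few lines of stdlib Python (the markup and handler are illustrative, not genX code): the server ships complete, usable HTML, and the inline script only adds behavior on top.

```python
# "Server renders complete HTML" + progressive enhancement: the page
# is readable with zero JavaScript; script only enhances it.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!doctype html>
<title>Board</title>
<ul id="cards">
  <li>Write docs</li>
  <li>Fix bug #42</li>
</ul>
<script>
  // Enhancement only: add drag handles if JS runs.
  // Without JS the list above is already complete.
  document.querySelectorAll('#cards li').forEach(li => li.draggable = true);
</script>
"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send the fully rendered page, not JSON for a client to assemble.
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

# To serve: HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```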

continuational•1h ago
Seems like we can join the club! https://www.firefly-lang.org/ is 218 kB uncompressed.
retrac•1h ago
I use an Intel Atom netbook from 2010 as my test system. It has 1 GB of RAM and an in-order x86 processor. CPU Benchmark gives it 120 Mop/s integer and 42 MiB/s for AES. (For comparison, my usual laptop, which is also nearly obsolete with an i5-8350u gives 22,000 Mop/s and 2000 MiB/s respectively.)

The netbook can load Firefox in just a few seconds. And Hacker News loads almost instantly, just as on a modern machine. (Hit enter and the page is rendered before you can blink.)

The same machine can also play back 720p H.264 video smoothly.

And yet, if I go to Youtube or just about any other modern site, it takes literally a minute to load and render, none of the UI elements are responsive, and the site is unusable for playing videos. Why? I'm not asking for anything the hardware isn't capable of doing.

If my own work isn't snappy on the Atom I consider it a bug. There are a lot of people using smartphones and tablets with processors in the same class.

zeusk•1h ago
> And yet, if I go to Youtube or just about any other modern site, it takes literally a minute to load and render, none of the UI elements are responsive, and the site is unusable for playing videos. Why? I'm not asking for anything the hardware isn't capable of doing.

But the website and web renderer are definitely not optimized for a netbook from 2010; even modern smartphones are better at rendering pages and video than your Atom (or even 8350U) computers.

nasretdinov•1h ago
> even modern smartphones are better

That's an understatement if I've ever seen one! For web rendering single-threaded performance is what mostly matters and smartphones got crazy good single-core performance these days. The latest iPhone has faster single core than even most laptops

zeusk•1h ago
Yes, but parent comment definitely implied they weren't talking about people running on the latest and best out there. Even the middle-grade smartphones today are leaps and bounds better than the atom from 2010.
zahlman•1h ago
> Your total UNCOMPRESSED web resources must not exceed 512KB.

I only see domains listed. Does this refer to the main page only, or the entire site?

paulddraper•1h ago
> Your total UNCOMPRESSED web resources must not exceed 512KB

JavaScript gets all the hate for size, but images easily surpass even your most bloated frameworks.

Which is why the websites on this list largely don't use media.

---

The problem with JavaScript isn't so much the network size; it's the execution time.

Narishma•51m ago
Would help if there was a short description of what the websites are about, instead of just a list of random URLs.
HuwFulcher•9m ago
While a fun idea, arbitrary limits like this just aren’t necessary. Yes, it’s all well and good in the name of reducing trackers, etc., but what if I want to have an image-heavy site? Why does that get perceived as a bad thing?
viccis•5m ago
It calls out the NYT at the beginning, but am I supposed to be impressed that a bunch of mostly obscure minimalist blogs are a few megabytes smaller than the biggest online news site (by subscribers) in the world?

What are we doing here? And to brag about this while including image media in the size is just onanistic.
