frontpage.

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•1m ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
1•helloplanets•3m ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•11m ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•13m ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•14m ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•15m ago•0 comments

Sony BMG copy protection rootkit scandal

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal
1•basilikum•17m ago•0 comments

The Future of Systems

https://novlabs.ai/mission/
2•tekbog•18m ago•1 comments

NASA now allowing astronauts to bring their smartphones on space missions

https://twitter.com/NASAAdmin/status/2019259382962307393
2•gbugniot•22m ago•0 comments

Claude Code Is the Inflection Point

https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point
3•throwaw12•24m ago•1 comments

Show HN: MicroClaw – Agentic AI Assistant for Telegram, Built in Rust

https://github.com/microclaw/microclaw
1•everettjf•24m ago•2 comments

Show HN: Omni-BLAS – 4x faster matrix multiplication via Monte Carlo sampling

https://github.com/AleatorAI/OMNI-BLAS
1•LowSpecEng•25m ago•1 comments

The AI-Ready Software Developer: Conclusion – Same Game, Different Dice

https://codemanship.wordpress.com/2026/01/05/the-ai-ready-software-developer-conclusion-same-game...
1•lifeisstillgood•27m ago•0 comments

AI Agent Automates Google Stock Analysis from Financial Reports

https://pardusai.org/view/54c6646b9e273bbe103b76256a91a7f30da624062a8a6eeb16febfe403efd078
1•JasonHEIN•30m ago•0 comments

Voxtral Realtime 4B Pure C Implementation

https://github.com/antirez/voxtral.c
2•andreabat•33m ago•1 comments

I Was Trapped in Chinese Mafia Crypto Slavery [video]

https://www.youtube.com/watch?v=zOcNaWmmn0A
2•mgh2•39m ago•0 comments

U.S. CBP Reported Employee Arrests (FY2020 – FYTD)

https://www.cbp.gov/newsroom/stats/reported-employee-arrests
1•ludicrousdispla•41m ago•0 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•46m ago•1 comments

Show HN: SVGV – A Real-Time Vector Video Format for Budget Hardware

https://github.com/thealidev/VectorVision-SVGV
1•thealidev•47m ago•0 comments

Study of 150 developers shows AI generated code no harder to maintain long term

https://www.youtube.com/watch?v=b9EbCb5A408
1•lifeisstillgood•48m ago•0 comments

Spotify now requires premium accounts for developer mode API access

https://www.neowin.net/news/spotify-now-requires-premium-accounts-for-developer-mode-api-access/
1•bundie•50m ago•0 comments

When Albert Einstein Moved to Princeton

https://twitter.com/Math_files/status/2020017485815456224
1•keepamovin•52m ago•0 comments

Agents.md as a Dark Signal

https://joshmock.com/post/2026-agents-md-as-a-dark-signal/
2•birdculture•54m ago•0 comments

System time, clocks, and their syncing in macOS

https://eclecticlight.co/2025/05/21/system-time-clocks-and-their-syncing-in-macos/
1•fanf2•55m ago•0 comments

McCLIM and 7GUIs – Part 1: The Counter

https://turtleware.eu/posts/McCLIM-and-7GUIs---Part-1-The-Counter.html
2•ramenbytes•58m ago•0 comments

So what's the next word, then? Almost-no-math intro to transformer models

https://matthias-kainer.de/blog/posts/so-whats-the-next-word-then-/
1•oesimania•59m ago•0 comments

Ed Zitron: The Hater's Guide to Microsoft

https://bsky.app/profile/edzitron.com/post/3me7ibeym2c2n
2•vintagedave•1h ago•1 comments

UK infants ill after drinking contaminated baby formula of Nestle and Danone

https://www.bbc.com/news/articles/c931rxnwn3lo
1•__natty__•1h ago•0 comments

Show HN: Android-based audio player for seniors – Homer Audio Player

https://homeraudioplayer.app
3•cinusek•1h ago•2 comments

Starter Template for Ory Kratos

https://github.com/Samuelk0nrad/docker-ory
1•samuel_0xK•1h ago•0 comments

The 512KB Club

https://512kb.club/
161•lr0•3mo ago

Comments

namegulf•3mo ago
Not sure how a site can fit in that club?

For example, if someone uses Google Analytics (which most people do), that alone comes to 430kb.

znpy•3mo ago
> Not sure how a site can fit in that club?

That's the challenge

namegulf•3mo ago
true
undeveloper•3mo ago
Then don't use google analytics.
namegulf•3mo ago
One option is to use access logs on the server and process stats
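For illustration, a minimal sketch of that approach, assuming Node.js and the nginx/Apache "combined" log format; the log path is hypothetical:

```ts
// hits.ts - minimal sketch of server-side stats from access logs.
// Assumes the nginx/Apache "combined" log format; the log path is hypothetical.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const hits = new Map<string, number>();
const rl = createInterface({
  input: createReadStream("/var/log/nginx/access.log"),
});

for await (const line of rl) {
  // combined format: ip - - [date] "METHOD /path HTTP/x.y" status bytes ...
  const m = line.match(/"(?:GET|HEAD) (\S+) HTTP/);
  if (m) hits.set(m[1], (hits.get(m[1]) ?? 0) + 1);
}

// Print the twenty most-requested paths.
for (const [path, n] of [...hits].sort((a, b) => b[1] - a[1]).slice(0, 20)) {
  console.log(`${String(n).padStart(6)}  ${path}`);
}
```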
skydhash•3mo ago
It’s very easy. 512kb can fit a whole novel in epub format. And HTML is a very verbose language.
namegulf•3mo ago
Just sticking with html, it's easy peasy
01HNNWZ0MV43FF•3mo ago
I've never used it. Some browsers even honor that stupid beacon header now too
inetknght•3mo ago
> For example, if someone uses Google Analytics (which most people do), that alone comes to 430kb.

Perhaps someone might not use Google Analytics. Perhaps someone might apply 430kb to actual content instead.

xigoi•3mo ago
Back in the early internet, nobody had enough bandwidth to transmit 512 kB in a reasonable time, so clearly it has to be possible.
Levitz•3mo ago
Well yes, we had low resolution images, no consideration for different viewports, no non-default fonts and little interactivity beyond links or other queries to the server.
busymom0•3mo ago
> no non-default fonts

That's a win!!!

xigoi•3mo ago
Responsiveness is a few hundred bytes of CSS. Modern formats for fonts (woff2) and images (webp) let you squeeze them into a couple hundred kilobytes, which still leaves plenty of room for scripts.
baobun•3mo ago
There are plenty of analytics options at <1% that size.

https://github.com/Destiner/astro-analytics

bilekas•3mo ago
> Why does any site need to be that huge? It’s crazy.

It's advertising and data tracking.. Every. Single. Time.

PiHole/Adblocker have become essential for traversing the cesspool that is the modern internet.

skydhash•3mo ago
While a lot of sites break when you disable JavaScript, browsing is very fluid when you do. I'm also using the HTML version of DDG. There's only a handful of websites (and the majority are apps) that I've enabled JS for.

And one of these days, I will write a viewer for GitHub links that will clone the repo and let me quickly browse it. For something that is aimed at devs, the platform is horrendous.
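For illustration, a minimal sketch of the kind of helper described above (no such tool exists yet), assuming Node.js with git on the PATH; the usage string is hypothetical:

```ts
// repo-view.ts - minimal sketch of the viewer described above (not an existing tool).
// Assumes Node.js with `git` on the PATH; the usage string is hypothetical.
import { execFileSync } from "node:child_process";
import { mkdtempSync, readdirSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const repoUrl = process.argv[2];
if (!repoUrl) throw new Error("usage: repo-view <github-repo-url>");

// Shallow clone into a throwaway directory; --depth 1 keeps the download small.
const dest = mkdtempSync(join(tmpdir(), "repo-"));
execFileSync("git", ["clone", "--depth", "1", repoUrl, dest], { stdio: "inherit" });

// Any local browser/editor works from here; a plain recursive listing as a stand-in.
for (const entry of readdirSync(dest, { recursive: true })) {
  console.log(entry);
}
```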

tredre3•3mo ago
> It's advertising and data tracking.. Every. Single. Time.

Use bootstrap and one image larger than 16x16 and you're near 500KB already.

It's easy to blame the boogeyman but sometimes it's worth looking in the mirror too...

bilekas•3mo ago
I liked bootstrap; in my world (non front-end) it was a standardised way to decide on mobile screens etc. I am old so bear with me, but that was a whole thing. There was no \SPEC\ saying these are the things that need to be established and done; it was decided by browsers in the end.

Now, browsers are built mostly by those who ran (today it's a mess) the web. HTML schema and spec... I've never met anyone who's contributed to it in the last 10 years.

The best developers and engineers I do know don't consider it anymore. It's been done, in our eyes. The big companies took over.

timenotwasted•3mo ago
It's a fun way to push for a lighter web, but without a way to distinguish the complexity of the sites on the list it's really not all that useful. It's kind of addressed in the FAQ ("The whole point of the 512KB Club is to showcase what can be done with 512KB of space. Anyone can come along and make a <10KB site containing 3 lines of CSS and a handful of links to other pages"), but without a way for the user to judge site complexity at a glance I'm not sure I understand the point. Regardless of the FAQ, the first few sites I clicked on, while quite light in size, also had nothing more than some text and background colors. Also, any search site is going to be near the top of the list, e.g. https://steamosaic.com/

Complexity would be a subjective metric but without it I'm not sure what you take from this other than a fun little experiment, which is maybe all it's meant to be.

unglaublich•3mo ago
So they should invert it like the demoscene.

Set the limit first, and then request folks to join the contest:

What crazy website can _you_ build in 512KB?

timenotwasted•3mo ago
"What crazy website can _you_ build in 512KB?" Exactly! That would be super fun and interesting.
adamzwasserman•3mo ago
Tag-based organization system with drag-drop, real-time filtering, row/column layouts. ~55KB gzipped.

I built it out of frustration with Trello's limitations. The 512KB constraint forced good architecture: server-side rendering, progressive enhancement, shared indexes instead of per-item duplication. Perfect Lighthouse score so far - the real test is keeping it through release.

Extracting patterns into genX framework (genx.software) for release later this month.

unglaublich•3mo ago
Building a sub 512KB website is trivial.

Just don't use external trackers, ads, fonts, videos.

Building a sub 512KB website that satisfies all departments of a company of non-trivial size; that is hard.

bilekas•3mo ago
> Building a sub 512KB website is trivial.

Even for larger sites, it can be trivial, but I prefer to look at it from a non SPA/state-mgmt point of view.

Not every site needs to be an SPA. Or even a 'react app'. I visit a page, record your metrics on the backend for all I care, you have the request headers etc, just send me the data I need, nothing else.

It doesn't have to be ugly or even lacking some flair, 500KB is a lot of text. Per page request, with ootb browser caching, there's no excuse. People have forgotten that's all you need.

> People have forgotten that's all you need.

Edit: No, they haven't; they just can't monetize optimizations.

EMM_386•3mo ago
You don't even need a framework for a SPA.

I have a SPA that is just vanilla web components and is clean, small, fast and tidy. No outside frameworks, just my own small amount of transpiled TypeScript.

I prefer to write them that way because it meets my needs for the particular site and is very responsive. But I've also done PHP, ASP.Net, Rails and other server-side work for many years. Best tool for the job, and sometimes they are very different.

benchly•3mo ago
I thought that too and assumed my blog fit the criteria. I was wrong; it weighed in at just over 100KB too heavy to get into the club.

My guess is the photos.

bjord•3mo ago
are you using webp photos?
daemonologist•3mo ago
webp is not much of an upgrade in my experience - jpegli pretty much matches it in quality/size while having better compatibility, and if you don't have the original photo and are working with old crusty jpegs it's often best to just leave them alone rather than re-encoding. jpeg-xl does make a noticeable difference, but it's not widely supported.
gruturo•3mo ago
Webp is utter garbage. Awful quality loss, which they're still in denial of, for minimal size gains over a jpeg encoded with modern software (which is also way way more compatible - software from 35 years ago can open it). But I'm sure it scores well in whatever flawed perceptual benchmark they automated.

If only google didn't oppose jxl - but they'd have to implicitly admit that webp is garbage and they don't like doing that.

acdha•3mo ago
WebP produced good file size reductions because they recompressed JPEGs and ignored the loss in detail. I benchmarked it the day it launched and it was never once competitive enough to be worth the cost of using it. If browsers had supported JPEG-2000 in the 2000s, it’d have stomped WebP on every benchmark – even a tuned JPEG encoder did surprisingly well given the age of that format.

HEIC, AVIF, JXL, etc. are worth the trouble.

theandrewbailey•3mo ago
WebP isn't enough of an improvement to justify the overhead of supporting it for me. AVIF is twice as efficient as JPEG, and has 97+% support across browsers. Use <picture> for JPEG fallback.

https://caniuse.com/?search=avif

https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
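For illustration, a minimal sketch of that setup, assuming the sharp npm package; the file names are made up:

```ts
// avif-fallback.ts - minimal sketch of AVIF with a JPEG fallback.
// Assumes the `sharp` npm package; photo.jpg / photo.avif are made-up names.
import sharp from "sharp";

// Encode an AVIF variant next to the original JPEG.
await sharp("photo.jpg").avif({ quality: 50 }).toFile("photo.avif");

// Browsers that support AVIF take the <source>; everything else gets the JPEG.
const markup = `
<picture>
  <source srcset="photo.avif" type="image/avif">
  <img src="photo.jpg" alt="..." width="800" height="600">
</picture>`;
console.log(markup);
```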

unglaublich•3mo ago
Exactly, so it's not so much a demonstration of how nice a website fits in 512K as it's about _just not using any media_. Not very interesting imho.
NooneAtAll3•3mo ago
maybe lower the resolution?

at least to test your guess

est•3mo ago
I ran a pretty lightweight blog, but sometimes several .jpg would easily cost >512KB
theandrewbailey•3mo ago
Same here. You can join me in the https://1mb.club/
benchly•3mo ago
Hah! Maybe for now, but I'll probably pass that with a few more posts I have in the works. They're related to some projects I have going (that I will eventually finish), which means more photos.

What I really need to do is standardize my format and size, just to make things a bit more visually tidy. I'm not all that interested in keeping below a certain size, more that I'm not doing anything I consider unnecessary for the function and feel of the site. Fairly certain whatever readership I have is either related to me or people I play D&D with, so it's really just a fun thing for me to do once in a while.

snovv_crash•3mo ago
Literally yesterday there were people defending 15MB websites:

https://news.ycombinator.com/item?id=45798681

zahlman•3mo ago
> Building a sub 512KB website that satisfies all departments of a company of non-trivial size; that is hard.

And yet tons of personal blogs likely weigh in well over that mark, despite having no requirements beyond personally imposed ideas about how to share information with the world.

> Just don't use external trackers, ads, fonts, videos.

The Internet is likely full of "hero" images that weigh more than 512KB by themselves. For that matter, `bootstrap.min.css` + `bootstrap.min.js` is over half of that budget already.

Not that people need those things, either. But many have forgotten how to do without. (Or maybe bilekas is right; but I like the idea of making things small because of my aesthetic sense. I don't need a financial incentive for that. One of these days I should really figure out what I actually need for my own blog....)

cnnlives73•3mo ago
> Building a sub 512KB website that satisfies all departments of a company of non-trivial size; that is hard.

I’m all for minimalism and have a much lower target than 512KB for some of my work. But I think the goal of having the entire site in 512KB is a little strange; someone might have a 511KB SaaS SPA app while others struggle to have 51 10KB pages. That’s not the same.

Amorymeltzer•3mo ago
Most thorough discussion here from a few years ago: <https://news.ycombinator.com/item?id=30125633>

Seems like lichess dropped off

opengrass•3mo ago
14KB Club > 512KB Club
lloydatkinson•3mo ago
There’s basically nothing on that list lol. Is there a list of all these N-KB Club sites?
Linux-Fan•3mo ago
I don't know of any list, but I know at least of the following “clubs” pages:

size-related

- <https://14kbclub.com/> - only learned about it today but I am not sure if my site would qualify (it is only ~12 KiB, but does multiple requests...)

- <https://250kb.club/>

- <https://512kb.club> - my site got removed as “ultra minimal” :(

- <https://1mb.club/>

not specifically size-related

- <https://no-js.club/members/>

- <https://xhtml.club/>

- <https://textonly.website/> - my site got removed (I guess because it has a logo and this makes it not text-only...)

There also used to be a 10 KB club, and per its rules my site would have qualified except for the requirement to be featured on HN or otherwise be a “noteworthy site”, if I recall correctly. However, the 10KB club seems to have been offline for some time already...

In general the issue with these kinds of pages is mostly that they only check _one_ page (typically the homepage, but sometimes I see people submit a special “reduced version” of their homepage, too...). Of course, if _all_ pages were relevant, I think even my (pretty minimalist) page wouldn't qualify, because some pages have high-resolution images, I guess...

scatbot•3mo ago
I get the appeal of the 512KB Club. Most modern websites are bloated, slow and a privacy nightmare. I even get the nerdy thrill of fitting an entire website into a single IP packet, but honestly, this obsession with raw file size is kinda boring. It just encourages people to micromanage whitespace, skip images or cut features like accessibility or responsive layouts.

A truly "suckless" website isn't about size. It's one that uses non-intrusive JS, embraces progressive enhancement, prioritizes accessibility, respects visitors' privacy and looks clean and functional on any device or output medium. If it ends up small: great! But that shouldn't be the point.

wredcoll•3mo ago
A rather perfect example of "correlation is not causation". But being "suckless" is a lot harder to measure than just running `length(string)`.
OptionX•3mo ago
Or, to be even more suckless, it would require the user to patch in whatever feature they need.
ciupicri•3mo ago
Size matters too if your bandwidth is limited.
adamzwasserman•3mo ago
The 512KB limit isn't just minimalism - it forces architectural discipline.

I built a Trello alternative (frustrated with limitations: wanted rows and decent performance). Came in at ~55KB gzipped by following patterns, some of which I'm open sourcing as genX (genx.software - releasing this month):

- Server renders complete HTML (not JSON that needs client-side parsing)
- JavaScript progressively enhances (doesn't recreate what's in the DOM)
- Shared data structures (one index for all items, not one per item)
- Use native browser features (DOM is already a data structure - article coming)

Most sites ship megabytes because modern tooling treats size as a rounding error. The 512KB constraint makes you think about what's expensive and get creative. Got rewarded with a perfect Lighthouse score in dev - striving to maintain it through release.

Would love feedback from this community when it's out.
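Not adamzwasserman's actual code, but a minimal sketch of the server-renders-HTML / JS-only-enhances / shared-index pattern he describes; the selectors and data attributes here are invented:

```ts
// enhance.ts - minimal sketch of progressive enhancement over server-rendered HTML.
// Not the genX code itself; ids, attributes, and data shapes here are hypothetical.

// One shared index for every item, instead of per-item state objects.
const byTag = new Map<string, HTMLElement[]>();

for (const el of document.querySelectorAll<HTMLElement>("[data-tags]")) {
  for (const tag of (el.dataset.tags ?? "").split(" ")) {
    const bucket = byTag.get(tag) ?? [];
    bucket.push(el);
    byTag.set(tag, bucket);
  }
}

// The list is already rendered and usable without JS; this only adds filtering.
document.querySelector<HTMLInputElement>("#filter")?.addEventListener("input", (e) => {
  const q = (e.target as HTMLInputElement).value.trim();
  for (const el of document.querySelectorAll<HTMLElement>("[data-tags]")) {
    el.hidden = q !== "" && !byTag.get(q)?.includes(el);
  }
});
```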

kelvinjps10•3mo ago
I'd like to see a 512KB club but for apps; most of these are blogs, and that's easier to do than an app. Link your app when you finish it, pls.
adamzwasserman•3mo ago
Will do!
catapart•3mo ago
Any interest in honorable mentions? I've got a (pretty basic) kanban-style taskboard manager app clocking in at just under 715kb.[0] When I finish the project and switch to using the minified code, I think I'd still be a hair over 512, so it's a fully lost cause at making the club, but I hope you can forgive the shameless plug due to shared appreciation for small apps!

[0] https://catapart.github.io/magnit-ceapp-taskboard-manager/de...

continuational•3mo ago
Seems like we can join the club! https://www.firefly-lang.org/ is 218 kB uncompressed.
retrac•3mo ago
I use an Intel Atom netbook from 2010 as my test system. It has 1 GB of RAM and an in-order x86 processor. CPU Benchmark gives it 120 Mop/s integer and 42 MiB/s for AES. (For comparison, my usual laptop, which is also nearly obsolete with an i5-8350u gives 22,000 Mop/s and 2000 MiB/s respectively.)

The netbook can load Firefox in just a few seconds. And Hacker News loads almost as instantly as on a modern machine. (Hit enter and the page is rendered before you can blink.)

The same machine can also play back 720p H.264 video smoothly.

And yet, if I go to Youtube or just about any other modern site, it takes literally a minute to load and render, none of the UI elements are responsive, and the site is unusable for playing videos. Why? I'm not asking for anything the hardware isn't capable of doing.

If my own work isn't snappy on the Atom I consider it a bug. There are a lot of people using smartphones and tablets with processors in the same class.

zeusk•3mo ago
> And yet, if I go to Youtube or just about any other modern site, it takes literally a minute to load and render, none of the UI elements are responsive, and the site is unusable for playing videos. Why? I'm not asking for anything the hardware isn't capable of doing.

but the website and web renderer are definitely not optimized for a netbook from 2010 - even modern smartphones are better at rendering pages and video than your atom (or even 8350u) computers.

nasretdinov•3mo ago
> even modern smartphones are better

That's an understatement if I've ever seen one! For web rendering, single-threaded performance is what matters most, and smartphones have gotten crazy good single-core performance these days. The latest iPhone has a faster single core than most laptops.

zeusk•3mo ago
Yes, but parent comment definitely implied they weren't talking about people running on the latest and best out there. Even the middle-grade smartphones today are leaps and bounds better than the atom from 2010.
aj_hackman•3mo ago
If it goes down you can always upgrade to a Raspberry Pi 3B+.
zozbot234•3mo ago
Youtube serves AV1 video these days, which needs CPU rendering on older machines. It might become usable if you switch it to 144p resolution. (For reference, such low-resolution video was very common in the mid-1990s, so it's not wildly out of place on a machine from 2010.)
radiator•3mo ago
Perhaps, but running on a 2010 Atom, you don't even get to press the play button of a video within one minute. That was the point, I believe.
zahlman•3mo ago
> Your total UNCOMPRESSED web resources must not exceed 512KB.

I only see domains listed. Does this refer to the main page only, or the entire site?

paulddraper•3mo ago
> Your total UNCOMPRESSED web resources must not exceed 512KB

JavaScript gets all the hate for size, but images easily surpass even your most bloated frameworks.

Which is why the websites on this list largely don't use media.

---

The problem with JavaScript isn't so much the network size; it's the execution time.

Narishma•3mo ago
Would help if there was a short description of what the websites are about, instead of just a list of random URLs.
HuwFulcher•3mo ago
While a fun idea, arbitrary limits like this just aren't necessary. Yes, it's all well and good in the name of reducing trackers, etc., but what if I want to have an image-heavy site? Why does that get perceived as a bad thing?
viccis•3mo ago
It calls out the NYT at the beginning, but am I supposed to be impressed that a bunch of mostly obscure minimalist blogs are a few megabytes smaller than the biggest online news site (by subscribers) in the world?

What are we doing here? And to brag about this while including image media in the size is just onanistic.

raldu•3mo ago
So, “Let’s build carbon-titanium-foldable bicycles instead of bloated modern cars, and still get from A to B?”

How many mainstream online platform users care about the difference in KB in their experience, anyway?

The sites in the list are hobbyist clubs with a technical point of view, which wouldn't make sense for a mass media outlet with millions of daily visitors and real interdepartmental complexity and compliance issues to deal with.

aj_hackman•3mo ago
> Let’s build carbon-titanium-foldable bicycles instead of bloated modern cars, and still get from A to B?

Yeah, that sounds awesome.

Nobody _needs_ to run a 4-minute mile, or win a chess tournament, or climb a mountain, or make the most efficient computer program. They're still worthwhile human endeavors.

Dylan16807•3mo ago
> So, “Let’s build carbon-titanium-foldable bicycles instead of bloated modern cars, and still get from A to B?”

That's when you fit the core of your website into 14KB so it can be sent in a single round trip (TCP's initial congestion window is typically ten segments of roughly 1,460 bytes each, about 14KB).

512KB is a lot. You can fit a reasonable "car" into it.

welfare•3mo ago
As an engineering challenge, I love it.

Other than that, I would've understood this notion better in the 90's when we were all on dialup. Maybe my perception is skewed from growing up watching pictures load in real time on websites?

Now, even with outdated hardware on an OK connection, even larger sites like WaPo (3MB) load what feels like instantly (within 5-10 seconds). If it loaded in 2 seconds or 1 second, I really don't know how that would impact my life in any way.

As long as a site isn't sluggish while you browse around.

MrMetric•3mo ago
My mobile phone's data connection isn't free. I'd prefer it not be wasted on sloppily-made websites.
morcus•3mo ago
Most of the time, and to a lot of people, it doesn't matter. I have a fast mobile data plan and fast home Internet. But even I have encountered the following circumstances where I wish sites were smaller:

- on an underground subway with slow and spotty connection

- on slow airplane WiFi

- on slow hotel WiFi

- on a slow mobile network while traveling internationally

rodolphoarruda•3mo ago
I like this website. It's very entertaining to me, and a bit nostalgic too. And those minimalist websites also help us remember the importance of building things to withstand the effects of time. Most of them are good candidates to stay online for the next 15 or 20 Internet years to come (almost an eternity in human terms).
graypegg•3mo ago
> Your total UNCOMPRESSED web resources must not exceed 512KB.

I would be interested to know how they define web resources. HN would only fit this description if we don't count every possible REST resource you could request, but instead just the images (3 svgs), CSS (news.css), and JS (hn.js).

The second you count any possible response to `https://news.ycombinator.com/item?...` in the total, we've blown the 512kb cap... and that's where the actual useful content lies.

Feels like regular ol' REST-and-forms webapps aren't quite the target of this list though, so who knows.

genezeta•3mo ago
> I would be interested to know how they define web resources.

They explain things in the FAQ. You're supposed to do a "Cloudflare URL Scan" and read the "Total bytes". For HN this is 47kB [1], which, yes, is just the 6 requests needed for / and nothing more.

[1] https://radar.cloudflare.com/scan/4c2b759c-b690-44f0-b108-b9...

NooneAtAll3•3mo ago
> Cloudflare URL Scan

> Cloudflare

any chance for a non-monopoly version?

genezeta•3mo ago
While you might not get the exact same numbers (^1), you can get a very similar result in your browser's devtools, in the network tab, by doing a clean reload of the page. It will give you a total (both compressed/transferred bytes, and uncompressed).

---

(^1) If the page, as HN does, has some headers or additional content for logged in visitors, the numbers will generally be a bit different. But the difference will usually be small.

moebrowne•3mo ago
There is some movement on using a headless browser instead of Cloudflare.

See https://github.com/kevquirk/512kb.club/issues/1187 and https://github.com/kevquirk/512kb.club/pull/1830
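For illustration, a minimal sketch of such a check, assuming the puppeteer npm package; the limit and default URL are just placeholders:

```ts
// weigh.ts - minimal sketch of a headless-browser page-weight check.
// Assumes the `puppeteer` npm package; the limit and default URL are placeholders.
import puppeteer from "puppeteer";

const LIMIT = 512 * 1024; // the club's cap, in bytes (uncompressed)
const url = process.argv[2] ?? "https://example.com/";

const browser = await puppeteer.launch();
const page = await browser.newPage();

let total = 0;
page.on("response", async (res) => {
  try {
    total += (await res.buffer()).length; // decompressed body size
  } catch {
    // some responses (redirects, aborted requests) have no body
  }
});

await page.goto(url, { waitUntil: "networkidle0" });
await browser.close();

console.log(`${url}: ${(total / 1024).toFixed(1)} KB uncompressed`);
process.exit(total <= LIMIT ? 0 : 1);
```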

defraudbah•3mo ago
The very first website has either 404 links or pages with over a megabyte of total payload. The idea is good, but I don't buy a "fast" website that only serves text with CSS from the 70s.
yakireev•3mo ago
You posted your comment on one of those, didn't you?
defraudbah•3mo ago
yes, the one about your mom, it didn't fit the screen though
squarefoot•3mo ago
> The 512KB limit isn't just minimalism - it forces architectural discipline.

True. I skimmed the biggest sites in that list, and they still are extremely fast. It's not just that size limit that makes the difference, but rather knowing that there is one and therefore forcing oneself to reason and use the right tools without cramming unneeded features.

It would be worth adding some information on the page about the best tools for creating small yet functionally complete and pleasant-looking static sites. A few years ago I'd have said Hugo (https://gohugo.io/), but I haven't checked for a while and there could be better ones. Also ultra-cheap hosting options comparable to Neocities (.org) but located in the EU.

1vuio0pswjnm7•3mo ago
Assuming most of these sites use Javascript, perhaps the size of memory use should also be considered

I use a text-only HTML viewer, no Javascript interpreter. This is either a 2M or 1.3M static binary

The size of the web page does not slow it down much, and I have never managed to crash it in over 15 years of use, unlike a popular browser

I routinely load concatenated HTML files much larger than those found on the web. For example, on a severely underpowered computer, loading a 16M stored HTML file into the text-only client's cache takes about 8 seconds

I can lazily write custom command-line HTML filters that are much faster than Python or Javascript to extract and transform any web page into SQL or CSV. These filters are each a ~40K static binary

As an experiment I sloppily crammed 63 different web page styles into a single filter. The result was a 1.6M static binary

I use this filter every day for command line search

I'm a hobbyist, an "end user", not a developer

1vuio0pswjnm7•3mo ago
It appears the structure of the HTML being rendered affects loading time

For example, another 7.4 MB HTML file that is basically just a list of URLs loads in about 1.41s

NooneAtAll3•3mo ago
Is there a way of enforcing memory limits on websites from browser (user) side?

These clubs have little effect if there's no incentive on the demand side

tonymet•3mo ago
Easy way: a Chrome extension that pops up a banner when the heap exceeds the limit.

Hard way: a custom Chrome build that blocks websites from allocating heap beyond the limit.
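A minimal sketch of the easy way, as a hypothetical extension content script; performance.memory is a non-standard Chromium-only API, and the 100 MB limit is arbitrary:

```ts
// content-script.ts - sketch of the "easy way": a content script that warns
// when the page's JS heap passes a limit. Uses Chrome's non-standard
// performance.memory, so Chromium-only; the 100 MB limit is arbitrary.
const HEAP_LIMIT = 100 * 1024 * 1024;

setInterval(() => {
  const mem = (performance as any).memory; // not in the standard Performance type
  if (!mem || mem.usedJSHeapSize <= HEAP_LIMIT) return;
  if (document.getElementById("heap-warning")) return; // only one banner

  const banner = document.createElement("div");
  banner.id = "heap-warning";
  banner.textContent =
    `This page is using ${Math.round(mem.usedJSHeapSize / 1048576)} MB of JS heap`;
  banner.style.cssText =
    "position:fixed;top:0;left:0;right:0;padding:4px;text-align:center;" +
    "background:#c00;color:#fff;z-index:2147483647";
  document.body.append(banner);
}, 5000);
```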

est•3mo ago
Many of the listed sites are way over 512KB, e.g. golang.cafe.

I hope the club does a routine check with headless browsers or something.

mariodiana•3mo ago
I'm nostalgic for an old World Wide Web (which never really existed, thanks to GeoCities and such), and wish that we could form a sect of "Puritans," break away from the High Church, and sail away to some top-level domain of our own where we'll consider any outbound links heretical.
snvzz•3mo ago
Phantasy Star for Master System is 512KB.

A recent retranslation romhack exists[0] and it's pretty good.

0. https://github.com/maxim-zhao/psrp

casey2•3mo ago
> why do large sites

You answered your own question: they wouldn't be large if they were small. Many users means many features, which to some looks like bloat. The reality is that among the millions of users there are hundreds of thousands of frontends and extensions and other little programs that together use every little aspect of the site, not to mention all the ads and trackers that pay the bills for sites like the NYT.
Jotalea•3mo ago
I could get my future site [0] into this club, being ~434KB, but that would mean having to remove the blog posts or make them load manually, which I don't see as useful.

[0] https://jotalea.com.ar/pico/

nymanjon•3mo ago
I added one site, a mostly static recipe site that my family uses. It includes VanJS to pick random recipes, and you can save them as you go along to plan what you're having for dinner that night. It also has a filter to find recipes by name and another to filter by type. Mainly for personal use, but it shows what you can do without a whole lot of code.[1]

I also added a question for my soccer app. Cloudflare doesn't know how to work with service-worker-driven applications :-) This one puts the back end in the service worker and uses HTMZ-BE to make it feel like you are using an app. So, basically, a front-end MPA with nice interactivity. Super lightweight for what it does and easy to use.[2]

[1]: https://github.com/jon49/recipes

[2]: https://github.com/jon49/Soccer