To put this into another context, today there was a post about Slack's 404 page weighing in at 50 MB.
Evidently, the entire concept of size & communication efficiency has been abandoned.
I don't really pine for the days of the PDP-8, when programmers had to make sure that almost every routine took fewer than 128 words, or the days of System/360, when you had to decide whether the fastest way to clear a register was to subtract it from itself or exclusive-or it with itself. We wasted a lot of time trying to get around stringent limitations of the technology just to do anything at all.
I just looked at the Activity Monitor on my MacBook. Emacs is using 115MB, Thunderbird is at 900MB, Chrome is at something like 2GB (I lost track of all the Renderer processes), and a Freecell game is using 164MB. Freecell, which ran just fine on Windows 95 in 8MB!
I'm quite happy with a video game taking a few gigabytes of memory, with all the art and sound assets it wants to keep loaded. But I really wonder whether we've lost something by not making more of an effort to use resources more frugally.
On the desktop we definitely lost responsiveness. Many webpages, even on the speediest, fastest computer of them all, are dog slow compared to what they should be.
Some pages are fine, but the number of pigs out there is just plain insane.
I like my desktop to be lean and ultra low latency: I'll have tens of windows (including several browsers) and it's super snappy. Switching from one virtual workspace to another is too quick to see what's happening (it takes milliseconds and I do it with a keyboard shortcut: reaching for the mouse is a waste of time).
I know what it means to have a system that feels like it's responding instantly as I do have such a system... Except when I browse the web!
And it's really only some (well, a lot of) sites: people who know what they're doing still come up with amazingly fast websites. But it's the turds: those shipping every package under the sun and calling thousands of micro-services, wasting all the memory available because they know jack shit about computer science, that make the Web a painful experience.
And although I use LLMs daily, I see a big overlap between those having the mindset required to produce such turds and those thinking LLMs are already perfectly fine today to replace devs, so I'm not exactly thrilled about the immediate future of the web.
P.S.: before playing apologist for such turd-sites, remember you're commenting on a very lean website. So there's that.
Remember, there's a gigabit pathway between server and browser, so use as much of the bandwidth as you need.
I'll bite. What do you think we've lost? What would the benefit be of using resources more frugally?
Disclosure: I'm an embedded systems programmer. I frequently find myself in the position where I have to be very careful with my usage of CPU cycles and memory resources. I still think we'd all be better off with infinitely fast, infinitely resourced computers. IMO, austerity offers no benefit.
(Remember Bill Atkinson's famous response, quoted here, to the question of how much code he'd written that week: -2000. He had reworked QuickDraw so that it was faster and better, with a net loss of 2000 lines of code.) Of course the classic Mac had its own constraints.
Yes, by several orders of magnitude. I couldn't enter or display Japanese on my Atari 800, nor my Apple II, nor my C64 (sorry, only 45 years ago). I couldn't display 200+ large 24-bit images with ease (here's 100: https://www.flickr.com/groups/worldofarchitecture/pool/). Or try this: https://scrolldit.com/
I couldn't play 16 videos simultaneously while downloading stuff in the background and playing a game. I could go on and on, but my computer today is vastly more usable than any of my computers 40 years ago, which could only effectively run one app at a time, and where I had to run QEMM and edit my config.sys and autoexec.bat to try to optimize my EMS and XMS memory.
I love that I can display a video as simply as:
<video src="url-to-video"></video>
Emphasis mine, tying in with how the article opened with the story about the designer who believed accessibility and "good design" are at odds (I'm screaming inside).
-- Feynman
uBlock Filters, EasyList, EasyPrivacy, Online Malicious URL Blocklist, Peter Lowe's Ad and tracking server list, EasyList - Annoyances
For the tech-inclined: Codeberg/GitLab/GitHub Pages, or Cloudflare Pages
But, damn, that was some fun stuff. Really challenging to get the graphical results we wanted and keep it under budget (15 KB in the early days).
It's really satisfying.
Suggesting that an application should stay within a 128 KB limit is akin to saying "I enjoy playing games in polygon mode." Battlezone was impressive in the 90s, but today it wouldn't meet user expectations.
In my opinion, initial load time is a better measure of performance. It combines both the initial application size and the time to interactivity.
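(A rough sketch of how you might eyeball those numbers with the browser's built-in Performance APIs; treating long tasks as a stand-in for time to interactivity is my own simplification, not a standard metric.)

    // Rough, illustrative load metrics pulled from the Performance APIs.
    const [nav] = performance.getEntriesByType('navigation');
    console.log('HTML transfer size (bytes):', nav.transferSize);
    console.log('DOMContentLoaded (ms):', nav.domContentLoadedEventEnd);
    console.log('load event (ms):', nav.loadEventEnd);

    // Long tasks that block the main thread delay interactivity; observing
    // them is only a crude stand-in for a real time-to-interactive metric.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.log(`Long task: ${Math.round(entry.duration)} ms`);
      }
    }).observe({ type: 'longtask', buffered: true });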
Achieving this is much more complex. There are many strategies to reduce initial load size and improve time to interactivity, such as lazy loading, using a second browser process to run code, or minimizing requests altogether. However, due to this complexity, it's also much easier to make mistakes.
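As an illustration, here is a minimal sketch of two of those strategies: lazy loading via dynamic import, and pushing heavy work off the main thread with a Web Worker (which is, loosely, the "second browser process" idea). The file and function names ('./chart.js', renderChart, 'crunch.js') are made up.

    // Lazy loading: the chart code stays out of the initial bundle and is
    // only fetched when the user actually asks for it.
    document.querySelector('#show-chart').addEventListener('click', async () => {
      const { renderChart } = await import('./chart.js');
      renderChart(document.querySelector('#chart-container'));
    });

    // A Web Worker keeps heavy computation off the main thread so the page
    // stays responsive while it runs.
    const worker = new Worker('crunch.js');
    worker.postMessage({ rows: 100000 });
    worker.onmessage = (event) => {
      document.querySelector('#result').textContent = event.data.summary;
    };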
Another reason this is often not done well is that it requires cross-team collaboration and cross-domain knowledge. It necessitates both frontend and backend adjustments, as well as optimisation at the request and response levels. And, like accessibility, it is often a non-functional requirement that is hard for a lot of teams to track.
Why even make it "reactive"? Just make your site static server-rendered pages, or just static pages. Is it because additional content loading is something users expect?
"Write your site in plain javascript and html. Don't use a framework. Write some minimal css. Bamo. Well under 128kb." ???
Some years ago I made a website again. Screw best practices: I used my systems engineering skills and the browser's debugger. I had written game engines with soft realtime physics simulations and global illumination over the network; I knew what computers could do. This website would render within 1 frame at 60 FPS, without any layout recalculation, garbage-collection events, web requests that couldn't be parallelized because of dependencies, etc.
I showed it to friends. They complained it didn't work; they didn't realize that once they clicked, the site displayed the next content instantly (without any weird JS tricks). This was a site with a fully responsive and complex-looking design. The fact that users are SO used to terrible UX made me realize that I had been right about this industry all along, ever since I was a child.
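As an aside, the "no layout recalculation" part is easy to illustrate: interleaving DOM reads and writes forces the browser to reflow over and over, while batching them lets a whole update fit in one frame. A generic sketch, not the parent commenter's actual code:

    // Forces a reflow on every iteration: read, write, read, write...
    function resizeSlow(boxes) {
      for (const box of boxes) {
        const width = box.parentElement.offsetWidth; // read (forces layout)
        box.style.width = `${width / 2}px`;          // write (invalidates layout)
      }
    }

    // Layout is computed once: all reads first, then all writes.
    function resizeFast(boxes) {
      const widths = boxes.map((box) => box.parentElement.offsetWidth); // reads
      boxes.forEach((box, i) => {
        box.style.width = `${widths[i] / 2}px`;                         // writes
      });
    }

    // Usage: pass an array of elements, e.g.
    // resizeFast([...document.querySelectorAll('.box')]);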
I'm reminded of The Website Obesity Crisis, [0] where the author mentions reading an article about web bloat, then noticing that the page itself was not exactly a shining example of lightweight design. He even calls out Medium specifically.
[0] https://idlewords.com/talks/website_obesity.htm (discussed at https://news.ycombinator.com/item?id=34466910)