frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


Let your Coding Agent debug the browser session with Chrome DevTools MCP

https://developer.chrome.com/blog/chrome-devtools-mcp-debug-your-browser-session
204•xnx•3h ago•75 comments

The 49MB Web Page

https://thatshubham.com/blog/news-audit
159•kermatt•2h ago•90 comments

//go:fix inline and the source-level inliner

https://go.dev/blog/inliner
71•commotionfever•4d ago•16 comments

LLM Architecture Gallery

https://sebastianraschka.com/llm-architecture-gallery/
129•tzury•6h ago•6 comments

Separating the Wayland compositor and window manager

https://isaacfreund.com/blog/river-window-management/
170•dpassens•7h ago•74 comments

What makes Intel Optane stand out (2023)

https://blog.zuthof.nl/2023/06/02/what-makes-intel-optane-stand-out/
160•walterbell•7h ago•105 comments

C++26: The Oxford Variadic Comma

https://www.sandordargo.com/blog/2026/03/11/cpp26-oxford-variadic-comma
84•ingve•4d ago•39 comments

Bus travel from Lima to Rio de Janeiro

https://kenschutte.com/lima-to-rio-by-bus/
83•ks2048•4d ago•28 comments

Glassworm Is Back: A New Wave of Invisible Unicode Attacks Hits Repositories

https://www.aikido.dev/blog/glassworm-returns-unicode-attack-github-npm-vscode
181•robinhouston•9h ago•108 comments

Learning athletic humanoid tennis skills from imperfect human motion data

https://zzk273.github.io/LATENT/
101•danielmorozoff•6h ago•15 comments

In Memoriam: John W. Addison, my PhD advisor

https://billwadge.com/2026/03/15/in-memoriam-john-w-addison-jr-my-phd-advisor/
73•herodotus•6h ago•4 comments

A Visual Introduction to Machine Learning (2015)

https://r2d3.us/visual-intro-to-machine-learning-part-1/
293•vismit2000•11h ago•26 comments

Show HN: GDSL – 800 line kernel: Lisp subset in 500, C subset in 1300

https://firthemouse.github.io/
48•FirTheMouse•6h ago•13 comments

Show HN: Signet – Autonomous wildfire tracking from satellite and weather data

https://signet.watch
100•mapldx•10h ago•27 comments

Autoresearch Hub

http://autoresearchhub.com/
25•EvgeniyZh•1d ago•9 comments

Show HN: What if your synthesizer was powered by APL (or a dumb K clone)?

https://octetta.github.io/k-synth/
69•octetta•9h ago•28 comments

Hollywood Enters Oscars Weekend in Existential Crisis

https://www.theculturenewspaper.com/hollywood-enters-oscars-weekend-in-existential-crisis/
101•RickJWagner•9h ago•325 comments

Ask HN: How is AI-assisted coding going for you professionally?

145•svara•6h ago•221 comments

Office.eu launches as Europe's sovereign office platform

https://office.eu/media/pressrelease-20260304
225•campuscodi•3h ago•120 comments

Chasing the Ivory-Billed Woodpecker (2023)

https://gardenandgun.com/feature/chasing-the-ivory-billed-woodpecker/
3•NaOH•4d ago•0 comments

IBM, sonic delay lines, and the history of the 80×24 display (2019)

https://www.righto.com/2019/11/ibm-sonic-delay-lines-and-history-of.html
70•rbanffy•11h ago•26 comments

Generating All 32-Bit Primes (Part I)

https://hnlyman.github.io/pages/prime32_I.html
73•hnlyman•10h ago•22 comments

Grandparents are glued to their phones, families are worried [video]

https://www.bbc.com/reel/video/p0n61dg3/grandparents-are-glued-to-their-phones-families-are-worried
159•tartoran•4h ago•108 comments

$96 3D-printed rocket that recalculates its mid-air trajectory using a $5 sensor

https://github.com/novatic14/MANPADS-System-Launcher-and-Rocket
352•ZacnyLos•11h ago•328 comments

Kniterate Notes

https://soup.agnescameron.info//2026/03/07/kniterate-notes.html
51•surprisetalk•5d ago•10 comments

Animated 'Firefly' Reboot in Development from Nathan Fillion, 20th TV

https://www.hollywoodreporter.com/tv/tv-news/animated-firefly-reboot-in-development-nathan-fillio...
82•Amorymeltzer•4h ago•14 comments

Why Mathematica does not simplify sinh(arccosh(x))

https://www.johndcook.com/blog/2026/03/10/sinh-arccosh/
141•ibobev•4d ago•54 comments

The 100 hour gap between a vibecoded prototype and a working product

https://kanfa.macbudkowski.com/vibecoding-cryptosaurus
216•kiwieater•10h ago•287 comments

Measure of Justice: Covering the Cerîde-I Adliye Covers (2017)

https://www.denizcemonduygu.com/2017/05/measure-of-justice/
4•benbreen•3d ago•0 comments

Stop Sloppypasta

https://stopsloppypasta.ai/
15•namnnumbr•4h ago•1 comment

The 49MB Web Page

https://thatshubham.com/blog/news-audit
156•kermatt•2h ago

Comments

Bratmon•1h ago
Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.
ronsor•1h ago
Well, I'm going to block the ads anyway (or just leave), so if they're trying to find profitable ads, they may need to revise their strategy.
jdross•1h ago
“I’m going to either steal your work in a way you don’t consent to, or not consume it” isn’t really great. The alternative is paywalls
zoklet-enjoyer•1h ago
Much of their work consists of poorly sourced articles, sensationalism, disinformation, and bias to sway the audience.
xigoi•44m ago
Steal? Their server gave me some HTML and it’s up to my user agent to present it however I want.
decimalenough•1h ago
I'm pretty sure people would read more and click on more ads if they didn't have to endure waiting for 49 MB of crap and then navigating a pop-up obstacle course for each article.
Bratmon•38m ago
100,000 people clicking at $0.01 CPM is way worse for them than 10,000 people clicking at $2 CPM.
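Reading CPM literally as revenue per thousand impressions, the arithmetic behind this claim can be sketched in a few lines of Python; the visitor counts and rates are the hypothetical ones from the comment, not real ad-market data:

```python
def ad_revenue(impressions: int, cpm_usd: float) -> float:
    """Revenue from display ads: CPM is dollars per 1,000 impressions."""
    return impressions / 1000 * cpm_usd

# 100,000 visitors on low-value ads vs 10,000 on high-value ads
low_value = ad_revenue(100_000, 0.01)   # $1.00
high_value = ad_revenue(10_000, 2.00)   # $20.00
print(f"${low_value:.2f} vs ${high_value:.2f}")
```

Whatever real ad auctions pay, the shape of the argument holds: a smaller audience on better-paying ads can out-earn a much larger one.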
bsjshshsb•1h ago
49MB or homelessness? There are surely other options.
hilbert42•1h ago
Solution, see my post. ;-)
Bratmon•36m ago
If you can think of any, then congratulations! You've saved journalism!

You should probably tell someone so the knowledge doesn't die with you.

neya•1h ago
This argument would be valid if journalism were actually journalism, instead of ripping off trending stories from HN and Reddit, rehashing them with sloppy AI, calling it a day, and burying 4 lines of text inside 400 ads.
pibaker•1h ago
I don't like the state of journalism either, but you realize this is a vicious cycle, no? People not paying for news (by buying newspapers or, more importantly, paying for classified ads) leads to low-quality online reporting, which leads to people not wanting to pay for online news.
curtisblaine•45m ago
I never understand this type of comment. People don't pay for news, so newspapers (which, by the way, have paywalls) are forced to degrade their service. It seems strange to me. If I have a restaurant and people don't want to pay for my food, making even worse food with worse service doesn't seem like a good solution. If I write books and people don't buy them, writing worse books doesn't make my sales better. Why are journalists different? They sell a service for money like everyone else, but for some reason they have a special status and it's totally understandable that they respond to bad sales with a worse product. And somehow it's our fault as customers: for some reason we should keep buying newspapers even if we don't think they're worth it, to save them from themselves.
apublicfrog•28m ago
Using your analogy, if every restaurant in town had a problem where most people wanted to come in and get food for free (and it was an expectation in the industry) and people refused to go in and pay, everyone would be upset they could no longer go out to eat when there were none left. If nobody is interested in paying for their meal, you can't be shocked the ingredient and chef quality drops in turn.
scared_together•1h ago
In the case of the New York Times, they have subscriptions and many are willing to pay for their work - but their subscriptions are not ad-free.
curtisblaine•59m ago
> Journalists shouldn't try to make their living by finding profitable ads

I mean, they can absolutely try. That doesn't mean they should succeed.

decimalenough•1h ago
This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
Cyphase•1h ago
RSS and feed readers still exist! All hope is not lost.
fsflover•1h ago
People should stop lamenting Google Reader and start using RSS. There are numerous threads about it on HN, e.g., https://news.ycombinator.com/item?id=45459233
bergheim•1h ago
Why on earth do you have to rely on Alphabet, an ad company, to read RSS? There are many other options that are not made by an ad company.

Google Reader was never the answer. It's such a shame that people, even here, don't realize that relying on Google meant its interests were at odds with yours - you weren't part of the equation at all.

Well, except for your data. You didn't give them enough data. So they shut up shop. Gmail though, amirite? :D

Yeah, I wonder why Gmail was not one of the shut-down products /s

bot403•1h ago
Why lament it? I've been using Inoreader for over a decade after Google Reader went away. And I gladly pay for it year after year.
h4ch1•1h ago
This rubbish also exists disproportionately on recipe pages/cooking websites.

You have 20 ads scattered around, an autoplaying video of some random recipe/ad, 2-3 popups asking you to subscribe or buy some affiliate product, and then the author's life story and a story ABOUT the recipe before I'm able to see the actual recipe in a proper format.

It's second nature for me to open all these websites in reader mode at this point.

jopsen•1h ago
Good sites do exist. It's just that they drown.
h4ch1•1h ago
True, these ad heavy cooking sites also dabble extensively in SEOmaxxing their way to the top.
ray023•1h ago
I think it's a GOOD thing, actually. Because all these publications are dying anyway. And even if you filter out all the ad and surveillance trash, you're left with trash propaganda and brain-rot content. Why even make the effort of filtering out the actual text of some "journalist" at these propaganda outlets? It's not even worth it.

If people tune out only because how horrible the sites are, good.

hilbert42•1h ago
These days the NYT is in a race to the bottom. I no longer even bother to bypass ads let alone read the news stories because of its page bloat and other annoyances. It's just not worth the effort.

Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.

We'll simply copy the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but with easier access.

I no longer think about it, as by now my actions are automatic. Rarely do I find an important story that's limited to only one website; generally dozens have the story, and because of syndication the alternative site one selects often even has identical text and images.

My default browsing is with JavaScript defaulted to "off" and it's rare that I have to enable it (which I can do with just one click).

I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads - they just vanish - and any that slip through are then trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)

Disabling JavaScript also makes pages blindingly fast as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.

Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair then it'd be a different matter but they don't. As they act like sleazebags they deserve to be treated as such.

CalRobert•1h ago
Do you think youtube will continue to make it possible to use alternate clients, or eventually go the way of e.g. Netflix with DRM so you're forced to use their client and watch ads?
curtisblaine•1h ago
Big tech will slowly enforce "secure browsing" and "secure OS" in a way that will make it impossible to browse the web without a signed executable approved by them. DRM is just a temporary stopgap.
hilbert42•50m ago
If Google were just starting YouTube today then DRM would likely be enforced through a dedicated app. The trouble for Google is that millions watch YouTube through web browsers, many of whom aren't even using a Google account, let alone subscribing to a particular YouTube channel. Viewership would drop dramatically.

Only several days ago I watched the presenter of RobWords whinging about wanting more subscribers, noting that many more people just watch his presentations than watch and also subscribe.

The other problem YouTube has is that, unlike Netflix et al. with their high-ranking commercial content, it hosts millions of small presenters who don't use advertising and/or just want to tell the world at large their particular stories. Enforced DRM would ruin that ecosystem altogether.

alpinisme•1h ago
What does playing fair mean in this context? It would be one thing if you were a paid subscriber complaining that even paying sucks so you left, but it sounds like you’re not.
hilbert42•1h ago
I'd like to answer that in detail but it's impractical to do so here as it'd take pages. As a starter, though: they could begin by not violating users' privacy.

Another quick point: my observation is that the worse the ad problem, the lower the quality of the content. Cory Doctorow's "enshittification" encapsulates the problem.

zahlman•57m ago
If you have enough detail for a blog post I'd heartily encourage you to submit it.
curtisblaine•1h ago
You're right, it means nothing. But it cuts two ways. These sites are sending me bytes and I choose which bytes I visualize (via an ad blocker). Any expectation the website has about how I consume the content has no meaning and it's entirely their problem.
Aurornis•37m ago
It is strange to hear these threats about avoiding websites from people who are not subscribers and also definitely using an ad blocker.

News sites aren’t publishing their content for the warm fuzzy feeling of seeing their visitor count go up. They’re running businesses. If you’re dead set on not paying and not seeing ads, it’s actually better for them that you don’t visit the site at all.

appreciatorBus•1h ago
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.

No.

"savvy" web surfers are a rounding error in global audience terms. The vast majority of web users, whether paying subscribers to a site like NYT or not, have no idea what a megabyte is, nor what JavaScript is, nor why they might want to care about either. The only consideration is whether the site has content they want to consume and whether or not it loads. It's true that a double-digit % are using ad blockers, but they aren't doing this out of deep concerns about JavaScript complexity.

Do what you have to do, but no one at the NYT is losing any sleep over people like us.

keane•1h ago
It’s hard to beat https://lite.cnn.com and https://text.npr.org (I imagine their own employees likely use these as well) or https://newsminimalist.com
TheMode•40m ago
https://lite.cnn.com seems to load 200KB of CSS
gxs•16m ago
I’m honestly dumbfounded that these exist

In the past some sites had light versions, but I haven't come across one in over 10 years

Makes me wonder if this isn’t just some rogue employee maintaining this without anyone else realizing it

It's the light version, but ironically I would happily pay these ad networks $20 a month to just serve these lite pages and not track me. They don't make anywhere close to that from me in a year

Sadly, here's how it would go: they'd do it, it'd be successful, they'd IPO, after a few years they'd need growth, they'd introduce a new tier with ads, and eventually you'd somehow wind up watching ads again

Aurornis•49m ago
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.

They know this. They also know that web surfers like you would never actually buy a subscription and you have an ad blocker running to deny any revenue generation opportunities.

Visitors like you are a tiny minority who were never going to contribute revenue anyway. You’re doing them a very tiny favor by staying away instead of incrementally increasing their hosting bills.

jkestner•41m ago
> They also know that web surfers like you would never actually buy a subscription

I subscribe, and yet they still bombard me with ads. Fuck that. One reason I don’t use apps is that I can’t block ads.

emodendroket•22m ago
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.

Seems like a gross overestimation of how much facility people have with computers but they don't want random article readers anyway; they want subscribers who use the app or whatever.

napolux•1h ago
and the NYT web team was praised as one of the best in the world some (many?) years ago.
gnabgib•1h ago
Some of them are good (formerly Rich Harris - Svelte[0]), some of them should stop podcasting.

[0]: https://svelte.dev/

keane•1h ago
previously: nytlabs.com https://web.archive.org/web/20191025052129/http://nytlabs.co...

now: https://rd.nytimes.com

mvrckhckr•1h ago
Only major media can get away with this kind of bloat. For a normal website, Google would never include you in the SERPs even if your page were a fraction of that size.
galphanet•1h ago
This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers are not able to get a simple workflow straight without loading MBs of useless JavaScript and interrupting the user journey multiple times. Give me back a command-line terminal like Amadeus; that would be perfect.

How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?

bigfatkitten•1h ago
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?

Or for developers to pad their CV.

niccl•1h ago
Sadly, I think the only answer is some other form of payment than ad clicks. I've no idea what that could be, though.
userbinator•1h ago
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?

Loudly oppose the trend-chasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would want to reduce the amount they spend on server/bandwidth costs and on "development and maintenance" too.

Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.

ngruhn•1h ago
> Don't get me started on airlines websites

You can't beat China Southern. They have the most dog-shit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:

- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.

- sometimes fields only have a placeholder that you can't fully read because the field isn't wide enough ("Please enter the correct...") and the placeholder disappears once you start typing.

- date picker is randomly in Chinese

- makes you go through a multi-step seat selection process only to tell you at the end that seat selection is no longer possible.

- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?

PunchyHamster•1h ago
Our developers once managed to rack up around 750MB per open website.

They put in a ticket with ops saying the server was slow and could we look at it. So we looked. Every single video on a page with a long video list pre-loaded part of itself. The only reason the site didn't run like shit for them is that the office had direct fiber to our datacenter a few blocks away.

We really shouldn't allow web developers more than 128kbit of connection speed; anything more and they just make nonsense out of it.

ceejayoz•1h ago
Same for fancy computers. Dev on a fast one if you like, but test things out on a Chromebook.
mananaysiempre•1h ago
“Craptop duty”[1]. (Third time in three years I’m posting an essentially identical comment, hah.)

[1] https://css-tricks.com/test-your-product-on-a-crappy-laptop/

tom1337•41m ago
I now wonder if it'd be a good idea to move our end-to-end tests to a pretty slow VM instead of a beefy 8-core, 32GB RAM machine and check which timeouts get triggered, because our app may be unoptimized for slower environments...
Joel_Mckay•1h ago
Based on the damage rate for company laptop screens, one can usually be sure anything high-end will be out of your own pocket. =3
drcongo•41m ago
Music producers often have some shitty speakers known as grot boxes that they use to make sure their mix will sound as good as it can on consumer audio, not just on their extremely expensive studio monitors. Chromebooks are perfectly analogous. As a side note, today I learned that Grotbox is now an actual brand: https://grotbox.com
sublinear•1h ago
I'm pretty damn sure those videos were put on the page because someone in marketing wanted them. I'm pretty sure QA then complained the videos loaded too slowly, so the preloading was added. Then the upper management responsible for the mess shrugged their shoulders and let it ship.

You're not insightful for noticing a website is dog slow or that there is a ton of data being served (almost none of which is actually the code). Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.

From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.

arccy•1h ago
Sounds just like a "helpless" dev that shifts blame to anyone but themselves.
sublinear•1h ago
Do you have a suggestion how else to handle the situation I described?
jiggawatts•13m ago
There’s a magic word that can be used in scenarios like this: “No.”

Failing that, interpret the requirements.

Nobody can watch a bunch of videos at once that don't even show up until you scroll! That's a nonsense requirement, and the dev's failure to push back or redirect in a more viable direction is a sign of their incompetence, not of the non-technical manager who saw YouTube's interface and assumed that's normal and doable.

It is! You’d have to know about lazy loading and CDNs, but neither is black magic.

Joel_Mckay•1h ago
In general, how people communicate internally and with the public is important.

https://en.wikipedia.org/wiki/Conway's_law

Have a wonderful day =3

zahlman•1h ago
"Developers" here clearly refers to the entire organization responsible. The internal politics of the foo.com providers are not relevant to Foo users.
sublinear•47m ago
I agree except for your definition of "developers". I see this all the time and can't understand why the blame can't just fall on the business as a whole instead of singling out "developers". In fact, the only time I ever hear "developers" used that way is from a gamer without a job.

The blame clearly lies with the contradictory requirements provided by a broader business too divorced from implementation details to know it's asking for something dumb. Developers do not decide those.

hobs•1h ago
From the perspective of the devs, they have a responsibility to say when something literally won't fly anywhere, ever; claiming the business is responsible for every bad decision is a complete abdication of your responsibilities.
sublinear•1h ago
Why don't you tell your boss or team something like that and see how well it flies.

The responsibility of the devs is to deliver what was asked. They can and probably do make notes of the results. So does QA. So do the other stakeholders. On their respective teams they get the same BS from everyone who isn't pleased with the outcome.

Ultimately things are on a deadline and the devs must meet requirements where the priority is not performance. It says nothing about their ability to write performant code. It says nothing about whether that performant code is even possible in a browser while meeting the approval of the dozens of people with their own agendas. It says everything about where you work.

jkestner•44m ago
Maybe everyone’s got a different situation, but when a different department tried to put ActiveX avatars all over their site, though it offended me from a UX perspective, I was able to get higher ups to reject it by pointing out that it would shut out 20% of their customers.

We always have discussions here about how you have to learn to communicate your value to clients in a language they understand. The same goes for internal communications.

hobs•6m ago
I didn't say anything about their development abilities, what I am pointing to is their professional responsibility. If a doctor is asked by a client to cut off their arm and they say no, and the client fires them, did the doctor err? (No) This doesn't comment on their ability to do surgery.
toast0•6m ago
> The responsibility of the devs is to deliver what was asked.

Software development isn't factory work. And factory workers are expected to notice problems and escalate them.

Anyway, they're paying me far too much to have me turn off my brain and just check the boxes they want checked in all situations. Sometimes, checking boxes because they need to be checked is the thing to do, but usually it's not.

xigoi•55m ago
> Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.

If a bridge engineer is asked to build a bridge that would collapse under its own weight, they will refuse. Why should it be different for software engineers?

sublinear•52m ago
It's a website, not a bridge. Based on the description given, it's not a critical website either. If it were, the requirements would have specified it must be built differently.

You're not even arguing with me, BTW. You're arguing against the entire premise of running a business. Priorities are not necessarily going to be what you value most.

nikanj•50m ago
Because bridge engineers can be sued if the bridge kills people
Joel_Mckay•1h ago
If you want to see context-aware pre-fetching done right, go to mcmaster.com ...

There are good reasons to have a small, cheap development staging server, as the rate-limited connection implicitly trains people on what not to include. =3

vunderba•28m ago
PSA for those who aren’t aware: Chromium/Firefox-based browsers have a Network tab in the developer tools where you can dial down your bandwidth to simulate a slower 3G or 4G connection.

Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
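As a rough illustration of what those throttle settings mean for a page this size, here's a back-of-the-envelope calculation in Python. The preset bandwidth figures are approximate (they vary by browser and version), and real loads are slower still because of latency and TCP slow start:

```python
def transfer_seconds(size_bytes: int, bits_per_second: float) -> float:
    """Idealized transfer time: payload size over raw bandwidth,
    ignoring latency, handshakes, and TCP slow start."""
    return size_bytes * 8 / bits_per_second

PAGE_BYTES = 49 * 1024 * 1024  # the 49MB page from the article

# Approximate downlink rates for common throttle presets
for label, bps in [("Slow 3G (~400 kbps)", 400_000),
                   ("Fast 3G (~1.6 Mbps)", 1_600_000),
                   ("Cable (~50 Mbps)", 50_000_000)]:
    print(f"{label}: {transfer_seconds(PAGE_BYTES, bps):.0f} s")
```

On the slow-3G preset the full 49MB takes on the order of 17 minutes to transfer; even at 50 Mbps it's roughly 8 seconds of raw download before everything has arrived.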

KronisLV•9m ago
I once spent around an hour optimizing a feature because it felt slow - turns out that the slower simulated connection had just stayed enabled after a restart (can’t remember if it was just the browser or the OS, but I previously needed it and then later just forgot to turn it off). Good times, useful feature though!
solarkraft•5m ago
Working as intended!
Crowberry•1h ago
I hate this trend of active distraction. Most blogs have a popup asking you to subscribe as soon as you start scrolling.

It’s as if everyone designed their website around the KPI of irritating your visitors and getting them to leave ASAP.

throwatdem12311•1h ago
A 49MB web page? Try a 45MB GraphQL response.
zahlman•1h ago
This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report a few times that and I don't know why); the total image size is less than a meg, and then there's just 65.6KB of HTML and 15.5KB of CSS.

And it works without JavaScript... but there does appear to be some tracking stuff: a deferred call out to Cloudflare (a hit counter, I think?) and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. NoScript catches all of this and I didn't feel like allowing it in order to weigh it.
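Summing the figures reported in this comment gives a sense of scale against the 49MB page; treating "less than a meg" of images as a flat 1MB upper bound is an assumption, not a measured value:

```python
# Asset sizes from the comment above, in KB; the image total is an upper bound
assets_kb = {
    "newsbanner.webp": 87.1,
    "other images (upper bound)": 1024 - 87.1,  # "less than a meg" of images total
    "HTML": 65.6,
    "CSS": 15.5,
}
total_kb = sum(assets_kb.values())
print(f"~{total_kb / 1024:.1f} MB total; the 49MB page is roughly "
      f"{49 * 1024 / total_kb:.0f}x heavier")
```

So even counting images at their upper bound, the blog weighs in around 1.1MB, and the news page under discussion is about 45 times heavier.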

cjs_ac•1h ago
My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.

The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.

The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.

The lengths that news websites go to in order to extract data from their readers to sell to data brokers are just a last-ditch attempt to remain profitable.
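To put that 2005-era quota next to the page in the headline, a minimal check of how many loads of the 49MB page a 400MB monthly allowance would have covered:

```python
quota_mb = 400  # the circa-2005 monthly broadband quota mentioned above
page_mb = 49    # the page from the article

loads_per_month = quota_mb // page_mb
print(f"{loads_per_month} full page loads per month")  # 8
```

Loading this one page nine times would have blown through the family's entire monthly allowance.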

lousken•58m ago
rule #1 is to always give your JS devs only Core 2 Quad CPUs + 16GB of RAM

they won't be able to complain about low memory, but their experience will be terrible every time they try to shove something horrible into the codebase

userbinator•56m ago
I also use and like the comparison in units of Windows 95 installs (~40MB), which is rather ironic in that Win95 itself was widely considered bloated when it was released.

While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.

gxs•6m ago
I would love for someone more knowledgeable in this space than I to chime in on the economics of this industry

Are the few cents you get from antagonizing users really worth it?

I suspect the answer is simple and that most users don’t give a shit

I think it has to do a lot with when you came of age - I’m in my late 30s, I got my first tech job at 14 as a sys admin for a large school district, and every single developer, admin, etc that I knew was already going on about the free internet. As a result, I’ve never had a tolerance for anything but the most reasonable advertisements

I think that ideology is necessary to care enough and be motivated enough to really get rid of ads, how fucking awful the websites are alone should be enough but for most people it isn’t

the_snooze•53m ago
It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
BoneShard•50m ago
it's still engineering, just for different constraints - cost & speed.
nayroclade•8m ago
There are plenty of web devs who care about performance and engineering quality. But caring about such things when you work on something like a news site is impossible: these sites make their money through user tracking, and it's literally your job to stuff in as many third-party trackers as management tells you to. Any dev who says no on the basis that it'll slow the site down will get fired as quickly as a chef who gets a shift job at McDonald's and tries to argue for better cuisine.
shevy-java•47m ago
uBlock Origin helps mitigate things at least a little bit here.
mrb•24m ago
Let's play a fun prediction game: what do HN readers think the page size of NYTimes.com will be in 10 years? Or 20 years?

Want to bet 100 MB? 1 GB? Is it unthinkable?

20 years ago, a 49 MB home page was unthinkable.
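One way to frame the bet is to pick a doubling period and compound it. The 3-year doubling time below is purely hypothetical, not a measured trend:

```python
def extrapolate(size_mb: float, years: float, doubling_years: float) -> float:
    """Compound page-weight growth: size doubles every `doubling_years` years.
    A what-if calculation, not a forecast."""
    return size_mb * 2 ** (years / doubling_years)

# If a 49MB page kept doubling every ~3 years (hypothetical rate):
print(f"{extrapolate(49, 10, 3):.0f} MB in 10 years")         # ~494 MB
print(f"{extrapolate(49, 20, 3) / 1024:.1f} GB in 20 years")  # ~4.9 GB
```

Under that assumption, the 1 GB home page arrives somewhere between the 10- and 20-year marks.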

dizzy9•6m ago
I remember in 2008, when Wizards of the Coast re-launched the official Dungeons & Dragons website to coincide with the announcement of the fourth edition rules. The site was something in the region of 4 MB, plus a 20 MB embedded video file. A huge number of people were refreshing the site to see what the announcement was, and it was completely slammed. Nobody could watch the trailer until they uploaded it to YouTube later.

4 MB was an absurd size for a website in 2008. It's still an absurd size for a website.