
Show HN: Draw a fish and watch it swim with the others

https://drawafish.com
480•hallak•3d ago•155 comments

Corporation for Public Broadcasting Ceasing Operations

https://cpb.org/pressroom/Corporation-Public-Broadcasting-Addresses-Operations-Following-Loss-Federal-Funding
276•coloneltcb•1h ago•235 comments

Supporting the BEAM community with free CI/CD security audits

https://www.erlang-solutions.com/blog/supporting-the-beam-community-with-free-ci-cd-security-audits/
44•todsacerdoti•2h ago•6 comments

Gemini 2.5 Deep Think

https://blog.google/products/gemini/gemini-2-5-deep-think/
287•meetpateltech•7h ago•143 comments

Ask HN: Who is hiring? (August 2025)

70•whoishiring•3h ago•101 comments

Fast (2019)

https://patrickcollison.com/fast
57•samuel246•3h ago•8 comments

At 17, Hannah Cairo solved a major math mystery

https://www.quantamagazine.org/at-17-hannah-cairo-solved-a-major-math-mystery-20250801/
78•baruchel•2h ago•32 comments

Hyrum's Law

https://www.hyrumslaw.com
55•andsoitis•3h ago•21 comments

Coverage Cat (YC S22) Is Hiring a Senior, Staff, or Principal Engineer

https://www.coveragecat.com/careers/engineering/software-engineer
1•botacode•1h ago

I couldn't submit a PR, so I got hired and fixed it myself

https://www.skeptrune.com/posts/doing-the-little-things/
23•skeptrune•1h ago•5 comments

Ask HN: Who wants to be hired? (August 2025)

29•whoishiring•3h ago•75 comments

Pseudo, a Common Lisp macro for pseudocode expressions

http://funcall.blogspot.com/2025/07/pseudo.html
27•reikonomusha•3d ago•0 comments

Show HN: An interactive dashboard to explore NYC rentals data

https://leaseswap.nyc/analytics
54•giulioco•2d ago•34 comments

Google Shifts goo.gl Policy: Inactive Links Deactivated, Active Links Preserved

https://blog.google/technology/developers/googl-link-shortening-update/
59•shuuji3•58m ago•43 comments

Make Your Own Backup System – Part 2: Forging the FreeBSD Backup Stronghold

https://it-notes.dragas.net/2025/07/29/make-your-own-backup-system-part-2-forging-the-freebsd-backup-stronghold/
35•todsacerdoti•3d ago•0 comments

Launch HN: Societies.io (YC W25) – AI simulations of your target audience

41•p-sharpe•6h ago•31 comments

Show HN: Pontoon – Open-source customer data syncs

https://github.com/pontoon-data/Pontoon
26•alexdriedger•3h ago•6 comments

Yes in My Bamako Yard

https://asteriskmag.com/issues/11/yes-in-my-bamako-yard
50•surprisetalk•4d ago•6 comments

Ergonomic keyboarding with the Svalboard: a half-year retrospective

https://twey.io/hci/svalboard/
17•Twey•2h ago•3 comments

Every satellite orbiting earth and who owns them (2023)

https://dewesoft.com/blog/every-satellite-orbiting-earth-and-who-owns-them
234•jonbaer•13h ago•106 comments

Rollercoaster Tycoon (Or, MicroProse's Last Hurrah)

https://www.filfre.net/2025/08/rollercoaster-tycoon-or-microproses-last-hurrah/
71•cybersoyuz•2h ago•20 comments

Replacing tmux in my dev workflow

https://bower.sh/you-might-not-need-tmux
194•elashri•9h ago•217 comments

Our Farewell from Google Play

https://secuso.aifb.kit.edu/english/2809.php
152•shakna•9h ago•54 comments

Slow

https://michaelnotebook.com/slow/index.html
903•calvinfo•23h ago•214 comments

OpenAI Leaks 120B Open Model on Hugging Face

https://twitter.com/main_horse/status/1951201925778776530
80•skadamat•2h ago•36 comments

The untold impact of cancellation

https://pretty.direct/impact
246•cbeach•6h ago•262 comments

Fakes, Nazis, and Fake Nazis

https://airmail.news/issues/2025-7-26/fakes-nazis-and-fake-nazis
14•prismatic•3d ago•0 comments

Live coding interviews measure stress, not coding skills

https://hadid.dev/posts/living-coding/
397•mustaphah•5h ago•406 comments

Long Term Support

https://www.sqlite.org/lts.html
133•rishikeshs•4h ago•43 comments

The anti-abundance critique on housing is wrong

https://www.derekthompson.org/p/the-anti-abundance-critique-on-housing
469•rbanffy•21h ago•717 comments

The Chrome Speculation Rules API allows the browser to preload and prerender

https://www.docuseal.com/blog/make-any-website-load-faster-with-6-lines-html
103•amadeuspagel•1d ago
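
For reference, the technique the article describes boils down to a single `speculationrules` script tag. A minimal sketch (the wildcard pattern and eagerness level are illustrative, not necessarily the article's exact six lines):

```javascript
// Speculation rules can be declared inline as
// <script type="speculationrules">{ ... }</script>, or injected from JS
// after feature-detecting support, as below.
const rules = {
  prerender: [
    { where: { href_matches: "/*" }, eagerness: "moderate" }
  ]
};

if (typeof HTMLScriptElement !== "undefined" &&
    HTMLScriptElement.supports?.("speculationrules")) {
  const script = document.createElement("script");
  script.type = "speculationrules";
  script.textContent = JSON.stringify(rules);
  document.head.append(script);
}
```

Browsers without the API simply ignore the rules, which is why they can also be shipped unconditionally as inline HTML.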

Comments

game_the0ry•1d ago
Wonder if this will replace how Next.js and Nuxt do optimistic pre-fetching when users hover on links.

Also brings up the questions:

- should the browser do this by default?

- if yes, would that result in too many unnecessary requests (more $$)?

Either way, good to know.

babanooey21•1d ago
It probably won't replace Next.js/Nuxt's pre-fetching, as such websites function as SPAs, using internal JavaScript pushState navigation, which has become standard for those frameworks.

However, Next.js pre-fetching can't perform pre-rendering on hover, which can cause a noticeable lag during navigation. The native Chrome API allows not only pre-fetching, but also pre-rendering, enabling instant page navigation.

exasperaited•1d ago
Link prefetching is generally something you would want a website to retain control over, because it can distort stats, cause resource starvation, and even (when web developers are idiots) cause things like deletions (when a clickable link has a destructive outcome).

I am reminded of the infamous time when DHH had to have it explained to him that GET requests shouldn’t have side effects, after the Rails scaffolding generated CRUD deletes on GET requests.

https://dhh.dk/arc/000454.html

Google were not doing anything wrong here, and DHH was merely trying to deflect blame for the incompetence of the Rails design.

But the fact remains, alas, that this kind of pattern of mistakes is so common, prefetching by default has risks.

radicaldreamer•1d ago
Imagine mousing over a delete account button and having your browser render that page and execute JS in the background.
dlcarrier•1d ago
Is it typical to count a single bracket as a line?
technojunkie•1d ago
This article is a good reminder of another one from late last year: https://csswizardry.com/2024/12/a-layered-approach-to-specul...

In the above article, Harry gives a more nuanced and specific method using data attributes to target specific anchors in the document, one reason being you don't need to prerender login or logout pages.

<a href data-prefetch>Prefetched Link</a>

<a href data-prefetch=prerender>Prerendered Link</a>

<a href data-prefetch=false>Untouched Link</a>
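
A sketch of how those data attributes can map onto a single rule set via the API's `selector_matches` condition (the `data-prefetch` attribute names follow Harry's convention; they are not part of the browser API):

```javascript
// Opt-in speculation: only links tagged with data-prefetch participate,
// data-prefetch=prerender upgrades them to a full prerender, and
// data-prefetch=false opts a link out explicitly.
const rules = {
  prefetch: [{
    where: {
      and: [
        { selector_matches: "a[data-prefetch]" },
        { not: { selector_matches: "a[data-prefetch=false]" } }
      ]
    },
    eagerness: "moderate"
  }],
  prerender: [{
    where: { selector_matches: "a[data-prefetch=prerender]" },
    eagerness: "moderate"
  }]
};

if (typeof HTMLScriptElement !== "undefined" &&
    HTMLScriptElement.supports?.("speculationrules")) {
  const script = document.createElement("script");
  script.type = "speculationrules";
  script.textContent = JSON.stringify(rules);
  document.head.append(script);
}
```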

meindnoch•1d ago
Ah, another web standard™ from Chrome. Just what we needed!
rafram•1d ago
Like it or not, the current web standardization process requires implementations to be shipped in multiple browsers before something can enter the spec.
epolanski•1d ago
In canary or experimental flags.
bornfreddy•23h ago
Multiple browser engines or just multiple browsers?
leptons•1d ago
Internet Explorer first implemented XMLHttpRequest, and then it became a standard. Without browser makers innovating we'd be waiting a long, long time for the W3C to make any progress, if any at all.
giantrobot•1d ago
A majority of Chrome's "standard" pushes have the alternate use of better browser fingerprinting. Ones of end users might use WebMIDI to work with musical devices but every scammy AdTech company will use it to fingerprint users. Same with most of their other "standards". It's almost like Google is an AdTech firm that happens to make a web browser.
leptons•23h ago
I honestly don't care about fingerprinting. It's too late to worry about that, because there will always be a way to do it, to some extent. And me using a browser with WebMIDI only means I'm one of the 3.4 billion people using Chrome. There are better ways to "fingerprint" people online. Detecting browser APIs is not a particularly good one.
giantrobot•20h ago
It's not the presence of WebMIDI that's the problem. It's the devices it can enumerate. Same with their other Web* APIs that want to enumerate devices outside the browser's normal sandboxing.
leptons•12h ago
So what? You need to give access to each website that wants to interact with WebMIDI. It doesn't just let any website poll the devices. It's no different than the location API or any other API - the user has to explicitly grant access to each website that wants to use the API. If you don't trust the site, don't give it access. It's really as simple as that.

Sorry chicken little, the sky is not falling.

HaZeust•1d ago
Off-topic (semi) but I'm a big fan of Docuseal - I use them for my client-contractor agreements without any issue. Pricing is unbeatable as well, other contract-signing services have completely lost the plot.
t1234s•1d ago
Don't features like this waste bandwidth and battery on mobile devices?
babanooey21•1d ago
It preloads pages on mouse hover over an <a href> link. On mobile there are no mouse hover events, but the page can be preloaded on the "touchstart" event, which almost always results in a page visit.
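
A rough sketch of that touchstart idea for browsers without the Speculation Rules API: inject a plain `<link rel="prefetch">` when a link is touched, well before the tap completes. The same-origin and data-saver guards here are my own assumptions, not from the comment:

```javascript
// Prefetch a URL by injecting <link rel="prefetch"> into <head>.
function prefetch(url) {
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.append(link);
}

// Decide whether a touched element is a link worth prefetching.
function shouldPrefetch(anchor, loc, saveData) {
  if (!anchor || !anchor.href) return false;
  const url = new URL(anchor.href, loc.href);
  // Only prefetch same-origin navigations, and respect data saver.
  return url.origin === loc.origin && !saveData;
}

if (typeof document !== "undefined") {
  document.addEventListener("touchstart", (event) => {
    const anchor = event.target.closest?.("a[href]");
    if (shouldPrefetch(anchor, location, navigator.connection?.saveData)) {
      prefetch(anchor.href);
    }
  }, { passive: true });
}
```

This is roughly what libraries like instant.page do under the hood on touch devices.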
radicaldreamer•1d ago
Not to mention laptops! Loads of people use those on battery power
youngtaff•19h ago
Putting aside the lack of hover on mobile (there are other ways to trigger it), it's not clear it will waste battery on mobile. I'm not sure if it's still the case, but mobile radios go to sleep, and waking them used a non-trivial amount of energy, so preloading a page was more efficient than letting the radio sleep and then waking it.

Need someone who's more informed than I am to review whether that's still the case.

MrJohz•19h ago
The point of this proposal is to bring this feature under the control of the browser, which is probably better placed to decide when preloading/prerendering should happen. It's already possible to do something very similar using JS, but it's difficult to control globally when that happens across different sites. Whereas when this feature is built into the browser, then the browser can automatically disable prerendering when the user (for example) has low battery, or is on a metered internet connection, or if the user simply doesn't want prerendering to happen at all.

So in theory, this should actually reduce bandwidth/battery wastage, by giving more control to the browser and user, rather than individual websites.
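
The three eagerness levels side by side, as one rule set might use them (the URL patterns are placeholders; the trigger descriptions in the comments reflect current Chrome behavior, which may change):

```javascript
// The page states its intent; the browser picks the concrete trigger for
// each level. In current Chrome, roughly: conservative fires on
// pointer/touch down, moderate on a short hover (plus the conservative
// triggers), and eager almost as soon as the rule matches a link.
const rules = {
  prerender: [
    { where: { href_matches: "/docs/*" }, eagerness: "conservative" },
    { where: { href_matches: "/blog/*" }, eagerness: "moderate" }
  ],
  prefetch: [
    { where: { href_matches: "/*" }, eagerness: "eager" }
  ]
};
```

Because the triggers live in the browser rather than the page, new intent signals (like keyboard focus) can be added later without any site changes.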

ozgrakkurt•1d ago
Or just put in some effort to make things actually more efficient and don’t waste resources on the user’s machine.
ashwinsundar•1d ago
Those aren’t mutually exclusive goals. You can serve efficient pages AND enable pre-fetch/pre-render. Let’s strive for sub-50ms load times
tonyhart7•1d ago
Yeah, but it's a "fake" sub-50ms load when you load it up front, before it shows.
dlivingston•1d ago
I guess you could call it fake or cheating, but ahead-of-time preparation of resources and state is used all the time. Speculative execution [0], DNS prefetching [1], shader pre-compilation, ... .

[0]: https://en.wikipedia.org/wiki/Speculative_execution

[1]: https://www.chromium.org/developers/design-documents/dns-pre...

tonyhart7•1d ago
Also, DNS isn't changing every second the way your website might.

Yeah, but you only "count" it when it shows, though?

I'm not saying it's not "valid", but if you only count it once it shows, aren't we missing part of why we need a "cache" in the first place?

dietr1ch•1d ago
Every layer underneath tries really hard to cheat and keep things usable/fast.

This includes libraries, kernels, CPUs, devices, drivers and controllers. The higher level at which you cheat, the higher the benefits.

jmull•1d ago
Not mutually exclusive, but they compete for resources.

Prefetch/prerender use server resources, which costs money. Moderate eagerness isn’t bad, but also has a small window of effect (e.g. very slow pages will still be almost as slow, unless all your users languidly hover their mouse over each link for a while).

Creating efficient pages takes time from a competent developer, which costs money upfront, but saves server resources over time.

I don’t have anything against prefetch/render, but it’s a small thing compared to efficient pages (at which point you usually don’t need it).

ashwinsundar•23h ago
> Creating efficient pages takes time from a competent developer, which costs money upfront, but saves server resources over time.

Not trying to be a contrarian just for the sake of it, but I don't think this has to be true. Choice of technology or framework also influences how easy it is to create an efficient page, and that's a free choice one can make*

* Unless you are being forced to make framework/language/tech decisions by someone else, in which case carry on with this claim. But please don't suggest it's a universal claim

dietr1ch•1d ago
Idk, if you are starting from prerender/prefetch `where href_matches "/*"` maybe you are wasting resources like you are swinging at a piñata in a different room.

This approach will just force the pre-loader/renderer/fetcher to be cautious and just prepare a couple of items (in document order unless you randomise or figure out a ranking metric) and have low hit ratios.

I think existing preloading/rendering on hover works really well on desktop, but I'm not aware of an equivalent for mobile. Maybe you can just preload visible links as there's fewer of them? But tradeoffs on mobile are beyond just latency, so it might not be worth it.

jameslk•1d ago
“Just” is doing a lot of heavy lifting there. There’s a lot that has to go on between the backend and frontend to make modern websites with all their dynamic moving pieces, tons of video/imagery, and heavy marketing/analytics scripts run on a single thread (yes I’m aware things can load/run on other threads but the main thread is the bottleneck). Browsers are just guessing how it will all come together on every page load using heuristics to orchestrate downloading and running all the resources. Often those heuristics are wrong, but they’re the best you can get when you have such an open ended thing as the web and all the legacy it carries with it

There’s an entire field called web performance engineering, with web performance engineers as a title at many tech companies, because shaving milliseconds here and there is both very difficult but easily pays those salaries at scale

giantrobot•1d ago
> There’s a lot that has to go on between the backend and frontend to make modern websites with all their dynamic moving pieces, tons of video/imagery, and heavy marketing/analytics scripts run on a single thread

So there's a lot going on...with absolutely terrible sites that do everything they can to be user-hostile? The poor dears! We may need to break out the electron microscope to image the teeny tiny violin I will play for them.

All of that crap is not only unnecessary it's pretty much all under the category of anti-features. It's hard to read shit moving around a page or having a video playing underneath. Giant images and autoplaying video are a waste of my resources on the client side. They drain my battery and eat into my data caps.

The easiest web performance engineering anyone can do is fire any and all marketers, managers, or executives that ask for autoplaying videos and bouncing fade-in star wipe animations as a user scrolls a page.

jameslk•22h ago
> The easiest web performance engineering anyone can do is fire any and all marketers, managers, or executives

Your solution to web performance issues is to fire people?

giantrobot•20h ago
When they're the cause of the web performance problems it isn't the worst idea. The individual IC trying to get a site to load in a reasonable amount of time isn't pushing for yet another tracking framework loaded from yet another CDN or just a few more auto-playing videos in a pop-over window that can only be dismissed with a sub-pixel button.
cs02rm0•1d ago
There's only so much efficiency you can squeeze out, though, if, say, you're using AWS Lambda. I can see this helping mitigate those cold start times.
pradn•1d ago
You can't avoid sending the user photos in a news article, for example. So the best you can do is start fetching/rendering the page 200ms early.
madduci•1d ago
The lines are more than six for Firefox, since the API is not supported.
rafram•1d ago
Hasn't been implemented yet, but Mozilla supports this proposal and plans to implement it:

https://bugzilla.mozilla.org/show_bug.cgi?id=1969396

jameslk•1d ago
If server resources and scalability are a concern, speculative fetching will add more load to those resources, which may or may not be used. Same deal on the end user’s device. That’s the trade off. Also, this is basically a Blink-only feature so far

The article provides a script that tries to replicate pre-rendering that speculation rules do for Safari and Firefox, but this is only pre-fetching. It doesn’t do the full pre-render. Rendering is often half the battle when it comes to web performance

Another limitation is that if the page is at all dynamic, such as a shopping cart, speculation rules will have the same struggles as caching does: you may serve a stale response

duxup•1d ago
Definitely a balancing act where you consider how much work you might trigger.

But I can think of a few places I would use this for quality-of-life type enhancements for specific clients, etc.

pyman•23h ago
I've seen some sites, like Amazon, calculate the probability of a user clicking a link and preload the page. This is called predictive preloading (similar to speculative fetching). It means they load or prepare certain pages or assets before you actually click, based on what you're most likely to do next.

What I like about this is that it's not a guess like the browser does, it's based on probability and real user behaviour. The downside is the implementation cost.

Just wondering if this is something you do too.

jameslk•21h ago
For a while, there was a library built to do this: https://github.com/guess-js/guess

You can do this with speculation rules too. Your speculation rules are just prescriptive of what you think the user will navigate to next based on your own analytics data (or other heuristics)

Ultimately the pros/cons are similar. You just end up with potentially better (or worse) predictions. I suspect it isn’t much better than simple heuristics such as whether a cursor is hovering over a link or a link is in a viewport. You’d probably have to have a lot of data to keep your guesses accurate

Keep in mind that this will just help with the network load piece, not so much for the rendering piece. Often rendering is actually what is slowing down most heavy frontends. Especially when the largest above-the-fold content you want to display is an image or video

rafram•1d ago
> This includes fetching all sub-resources like CSS, JavaScript, and images, and executing the JavaScript.

So not necessarily any website, because that could cause issues if one of the prerendered pages runs side-effectful JavaScript.

echoangle•19h ago
Then it’s a badly designed website. GET requests (and arguably the JS delivered with a GET-requested HTML page) should be side-effect free. Side effects should come from explicit user interaction.
rafram•19h ago
> and arguably the JS delivered with a GET-requested HTML page

That's pretty hard to achieve.

echoangle•12h ago
Why? What side effects does loading and executing the JS of a normal website have? Except for analytics, I don’t see any.
cs02rm0•1d ago
Annoyingly, Brave has disabled support for this.

https://github.com/brave/brave-browser/issues/41164

slig•1d ago
Does this basically replace the need for `instant.page`?
babanooey21•1d ago
It does, but it's currently supported only in Chromium-based browsers. Also, with pre-rendering on hover, pages are displayed instantly, unlike with instant.page, where rendering happens on link click and might take a few hundred ms before the page is displayed.

Update: Actually, instant.page also uses the Speculation Rules API where it's supported

petters•1d ago
Try navigating this site to get an idea of how it feels: https://instant.page/
pyman•1d ago
Fast on desktop, decent on mid-tier mobile, and slow on low-end devices.
tbeseda•1d ago
I mean that's generally the curve of performance across those devices where "fast", "decent", and "slow" are relative.
pyman•23h ago
That's right. I was just pointing out that speed can vary depending on a bunch of things:

https://www.gsma.com/r/wp-content/uploads/2023/10/The-State-...

accrual•1d ago
This is cool but man, it feels like we're pushing more and more complexity into the browser to build webpages that work like desktop apps.

Just reading "Chrome Speculation Rules API" makes my skin crawl a bit. We already have speculative CPU instructions, now we need to speculate which pages to preload in order to help mitigate the performance issues of loading megabytes of app in the browser?

I understand the benefits and maybe this is just me yelling at clouds, but it feels crazy coming from what the web used to be.

theZilber•1d ago
It is less about the performance issues of loading megabytes in the browser (which is also an issue). It is about those cases where a fetch request may take a noticeable amount of time simply because of server distance, or because the server needs to perform some work (SSR) to create the page (sometimes from data fetched from an external API).

If you have a desktop app, it will also have to do the same work by fetching all the data it needs from the server, and it might sometimes cache some of the data locally (like the user profile, etc.). This allows developers to load the data on user intent (hover, and some other configurable logic) instead of when the application is loaded (slow preload) or when the user clicks (slow response).

Even if the target page is 1 byte, the network latency alone makes things feel sluggish. This allows low-effort, fast UI with a good, opinionated API.

One of the reasons I can identify Svelte sites within 5 seconds of visiting a page is that they preload on hover, and navigating between pages feels instant. This is great, and fighting against it seems unreasonable.

But I agree that in other cases, where megabytes of data need to be fetched upon navigating, using these features will probably cause more harm than good, unless applied with additional intelligent logic (if these features allow such extension).

Edit: I addressed preloading; prerendering is a whole new set of issues that I am less experienced with. Making web apps became easier, but them having slow rendering times and other issues is, unfortunately, a case of unmitigated tech debt that comes from making web application building more accessible.

bobro•1d ago
Can anyone give me a sense of how much load time this will really save? How much friction is too much?
pyman•23h ago
It depends on how heavy the assets are and the user's connection.
youngtaff•19h ago
On the right site it can make navigation feel instant
CyberDildonics•1d ago
Instead of this clickbait title, it should have just been about using preloading instead of making your page load fast in the first place.

When you go to an old page with a modern computer and internet connection, it loads instantly.

myflash13•1d ago
The main issue I had with TurboLinks and similar hacks was that it broke scripts like the Stripe SDK which expected to be loaded exactly once. If you preloaded a page with the Stripe SDK and then navigated away, your browser console would become polluted with errors. I'm assuming this doesn't happen with a browser-native preloader because execution contexts are fully isolated (I would hope).
pelagicAustral•1d ago
Man, the amount of headaches Turbo gives me... I have ended up with apps polluted with "data-turbo=false" for this exact same reason... But I also admit that when it works, it's a really nice thing to have
nickromano•1d ago
TurboLinks only replaces the <body> so you can put any scripts you'd like loaded exactly once into the <head> tag. You can use <script async> to keep it from blocking.
myflash13•23h ago
yeah but I needed it loaded exactly once only on certain pages and not on others.
bberenberg•1d ago
As someone who uses Docuseal, please don’t focus on this and add UX improvements for end users. For example, filters for who has signed things.
jgalt212•1d ago
How does this not mess up your Apache logs? They just show what Chrome is guessing, not what content your users are consuming.
zersiax•1d ago
From the article I'd assume this wouldn't work in any way for mobile given no hover, not for screen reader users because a website often has no idea where a screen reader's cursor is, and potentially not for keyboard users (haven't checked if keyboard focus triggers this prefetch/prerender or literally just mouse hover), so ... limited applicability, I'd say.
Imustaskforhelp•1d ago
Maybe it's the fact that it's really easy to add something like this, and this (I think), or something that accomplishes the same thing (but in a better way?), is used by some major meta-frameworks like Next.js etc.

I guess it has limited applicability, but maybe it's the small gains that add up to victories. I really may be going off on a tangent, but I always used to think that hardware is boring, that there aren't many optimizations since it's all transistors with AND, OR, NOT, but then I read about all the crazy stuff like the L1 cache and the marvelous machinery that computers are. It blew my mind to shreds. Compilers are some madman's work too; the amount of optimization is just bonkers, all for tiny gains, but those tiny performance boosts across the whole stack make everything run so fast. It's so cool.

MrJohz•19h ago
Part of the design of the feature is that the website doesn't have to specify "on hover" or "on focus", but instead they can just write "eagerness: moderate" (or "conservative", or "eager") and let the browser decide what the exact triggers are. If it turns out that keyboard focus is a useful predictor of whether a link is likely to be clicked, then browsers can add that as a trigger at one of the indicator levels, and websites will automatically get the new behaviour "for free".

Currently, "eagerness: conservative" activates on mouse/pointer down (as opposed to mouse/pointer up, which is when navigation normally happens), which will work for mobile devices as well. And "eagerness: moderate" includes the more conservative triggers, which means that even on devices with no hover functionality, prerendering can still kick in slightly before the navigation occurs.

Imustaskforhelp•1d ago
So I was watching a YouTube short which said that this is sort of how Instagram approached a similar problem.

The Instagram founders had worked at Google, and they found that if you had typed in your username, you had an 80% (or some similarly high) chance of creating the account, since the barrier of friction had been crossed and it's all easier from there: why do all that effort and leave now, when you're already invested?

So the Instagram founders made it so that whenever you upload a photo, it silently uploads in the background while you spend time writing a caption for the image. By the time you're done, the picture is already loaded into the database, and that's how Instagram was so fast compared to its peers while using the same technology.

If someone scraps the picture/story and doesn't post it, they just delete it from the system.

I would link to the YouTube short, since it explained this more clearly than I have, but it was really nice how things are so connected that what I watched on YouTube is helping on HN.

drabbiticus•1d ago
Can someone explain how this works with links that cause changes? (i.e. changing the amount of an item in a cart, or removing an item from a cart)

I assume you would have to tailor the prefetch/prerender targets to avoid these types of links? In other words, take some care with these specific wildcard targets in the link depending on your site?
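
One way to take that care, sketched below: keep the wildcard but carve out state-changing URLs with the rule syntax's `not` condition. The cart/logout patterns and the opt-out attribute are hypothetical examples, not from the article:

```javascript
// Prerender broadly, but never for URLs (or links) that mutate state.
// Links that change a cart on a plain GET are arguably broken anyway,
// but excluding them here is cheap insurance.
const rules = {
  prerender: [{
    where: {
      and: [
        { href_matches: "/*" },
        { not: { href_matches: "/cart/*" } },
        { not: { href_matches: "/logout" } },
        { not: { selector_matches: "a[data-no-speculation]" } }
      ]
    },
    eagerness: "moderate"
  }]
};
```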

deanebarker•1d ago
How does this affect analytics on the hosting site? Will they get phantom requests for pages that might not ever be viewed?
mpyne•23h ago
Yes, but in concept it's always been true that they could get page views that wouldn't be viewed, whether due to bot scraping or even sometimes because a human clicks and then just... doesn't read it.
youngtaff•19h ago
Analytics can detect prerendered pages and choose how they report them - GA ignores them I believe
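
The detection mechanism here is `document.prerendering` plus the `prerenderingchange` event. A sketch of an analytics snippet that defers the pageview until a prerendered page is actually shown (the `/analytics` endpoint is a placeholder):

```javascript
// Report a pageview, but if the page is being prerendered, wait until it
// is actually displayed before counting it.
function reportPageview(doc, send) {
  if (doc.prerendering) {
    // Fires once when the prerendered page becomes the visible page.
    doc.addEventListener("prerenderingchange", send, { once: true });
    return "deferred";
  }
  send();
  return "sent";
}

if (typeof document !== "undefined") {
  reportPageview(document, () => {
    navigator.sendBeacon?.("/analytics",
      JSON.stringify({ page: location.pathname }));
  });
}
```

A prerender that is discarded without ever being shown then never triggers `send`, so it never counts as a view.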
autoexec•22h ago
We train users to hover over links to see where they lead before clicking, because some websites link to malicious content/domains. Now I guess some of those users will end up silently browsing to, and executing code from, those sites every time they do that.

Seems like a great way to track users too. Will hovering over ads count as a click-through? Should users have to worry about where their mouse rests on a page, or what it passes over?

MrJohz•19h ago
In practice, this is almost entirely going to be used for internal links within a domain - you are not going to want to prerender domains you don't control, because you can't be sure they'll be prerender-safe. And I suspect most internal navigation will be obvious to the user - it's typically clear when I'm clicking links in a nav menu, or different product pages on a shopping site. So I suspect your first issue will not come up in practice - users will typically not need to check the sorts of links that will be prerendered.

Tracking is a legitimate concern, but quite frankly that's already happening, and at a much finer, more granular level than anything this feature can provide. Theoretically, this gives the possibility to add slightly more tracking for users that disable JS, but given the small proportion of such users, and the technical hoops you'd need to jump through to get useful tracking out of this, it's almost certainly not worth it.

G_o_D•16h ago
https://github.com/GoogleChromeLabs/quicklink

https://criticalcss.com/

Been using these for years
corentin88•10h ago
Too much JavaScript for me. Why not offer something as simple as `<a href="…" rel="prefetch">`?