In the above article, Harry gives a more nuanced and specific method using data attributes to target specific anchors in the document, one reason being you don't need to prerender login or logout pages.
<a href data-prefetch>Prefetched Link</a>
<a href data-prefetch=prerender>Prerendered Link</a>
<a href data-prefetch=false>Untouched Link</a>
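Under the hood, those attributes can be mapped onto document rules in the Speculation Rules API. Something along these lines (a rough sketch, not necessarily the exact rules from Harry's article; the selectors and eagerness values are assumptions):

<!-- Sketch only: selectors and eagerness are illustrative, not quoted from the article -->
<script type="speculationrules">
{
  "prerender": [{
    "where": { "selector_matches": "a[data-prefetch=prerender]" },
    "eagerness": "moderate"
  }],
  "prefetch": [{
    "where": {
      "and": [
        { "selector_matches": "a[data-prefetch]" },
        { "not": { "selector_matches": ["a[data-prefetch=prerender]", "a[data-prefetch=false]"] } }
      ]
    },
    "eagerness": "moderate"
  }]
}
</script>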
Sorry, Chicken Little, the sky is not falling.
It's not clear it will waste battery on mobile. I'm not sure if it's still the case, but mobile radios go to sleep, and waking them used a non-trivial amount of energy, so preloading a page was more efficient than letting the radio go to sleep and then waking it.
Someone more informed than I am would need to review whether that's still the case.
So in theory, this should actually reduce bandwidth/battery wastage, by giving more control to the browser and user, rather than individual websites.
[0]: https://en.wikipedia.org/wiki/Speculative_execution
[1]: https://www.chromium.org/developers/design-documents/dns-pre...
Yeah, but you only "count" it when it shows, though?
I'm not saying it's not "valid", but if you only count it when it shows, aren't we missing part of why we need a "cache" in the first place?
This includes libraries, kernels, CPUs, devices, drivers and controllers. The higher the level at which you cheat, the greater the benefits.
Prefetch/prerender use server resources, which costs money. Moderate eagerness isn’t bad, but also has a small window of effect (e.g. very slow pages will still be almost as slow, unless all your users languidly hover their mouse over each link for a while).
Creating efficient pages takes time from a competent developer, which costs money upfront, but saves server resources over time.
I don’t have anything against prefetch/render, but it’s a small thing compared to efficient pages (at which point you usually don’t need it).
Not trying to be a contrarian just for the sake of it, but I don't think this has to be true. Choice of technology or framework also influences how easy it is to create an efficient page, and that's a free choice one can make*
* Unless you are being forced to make framework/language/tech decisions by someone else, in which case carry on with this claim. But please don't suggest it's a universal claim
This approach will just force the pre-loader/renderer/fetcher to be cautious and prepare only a couple of items (in document order, unless you randomise or figure out a ranking metric), with low hit ratios.
I think existing preloading/rendering on hover works really well on desktop, but I'm not aware of an equivalent for mobile. Maybe you can just preload visible links as there's fewer of them? But tradeoffs on mobile are beyond just latency, so it might not be worth it.
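For the mobile/viewport case, one rough possibility (a sketch only, not something the article proposes) is to inject <link rel="prefetch"> for same-origin links as they scroll into view:

<script>
  // Sketch only: prefetch same-origin links once they enter the viewport.
  // Browsers may ignore or deprioritise <link rel="prefetch"> hints.
  const seen = new Set();
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const url = new URL(entry.target.href, location.href);
      if (url.origin !== location.origin || seen.has(url.href)) continue;
      seen.add(url.href);
      const link = document.createElement('link');
      link.rel = 'prefetch';
      link.href = url.href;
      document.head.appendChild(link);
      observer.unobserve(entry.target);
    }
  });
  document.querySelectorAll('a[href]').forEach((a) => observer.observe(a));
</script>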
There's an entire field called web performance engineering, with "web performance engineer" as a job title at many tech companies, because shaving milliseconds here and there is very difficult but, at scale, easily pays those salaries.
So there's a lot going on...with absolutely terrible sites that do everything they can to be user-hostile? The poor dears! We may need to break out the electron microscope to image the teeny tiny violin I will play for them.
All of that crap is not only unnecessary it's pretty much all under the category of anti-features. It's hard to read shit moving around a page or having a video playing underneath. Giant images and autoplaying video are a waste of my resources on the client side. They drain my battery and eat into my data caps.
The easiest web performance engineering anyone can do is fire any and all marketers, managers, or executives that ask for autoplaying videos and bouncing fade-in star wipe animations as a user scrolls a page.
Your solution to web performance issues is to fire people?
The article provides a script that, for Safari and Firefox, tries to replicate what speculation rules do, but it only pre-fetches; it doesn't do the full pre-render. Rendering is often half the battle when it comes to web performance.
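For context, the general shape of such a fallback is roughly this (not the article's actual script; it only warms the HTTP cache and does not prerender):

<script>
  // Rough hover-triggered prefetch fallback for browsers without the
  // Speculation Rules API. Prefetch only -- no prerendering happens here.
  const prefetched = new Set();
  document.addEventListener('mouseover', (event) => {
    if (!(event.target instanceof Element)) return;
    const anchor = event.target.closest('a[href]');
    if (!anchor) return;
    const url = new URL(anchor.href, location.href);
    if (url.origin !== location.origin || prefetched.has(url.href)) return;
    prefetched.add(url.href);
    const link = document.createElement('link');
    link.rel = 'prefetch';
    link.href = url.href;
    document.head.appendChild(link);
  });
</script>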
Another limitation is that if the page is at all dynamic, such as a shopping cart, speculation rules will have the same struggles as caching does: you may serve a stale response
But I can think of a few places where I would use this for quality-of-life enhancements for specific clients, etc.
What I like about this is that it's not a guess like the browser does, it's based on probability and real user behaviour. The downside is the implementation cost.
Just wondering if this is something you do too.
You can do this with speculation rules too. Your speculation rules are just prescriptive of what you think the user will navigate to next based on your own analytics data (or other heuristics)
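For example, an analytics-driven build step could emit plain list rules for the most likely next pages (the URLs here are made up):

<!-- Sketch: the URLs would come from your own analytics, not be hard-coded like this -->
<script type="speculationrules">
{
  "prerender": [{
    "source": "list",
    "urls": ["/pricing", "/docs/getting-started"]
  }]
}
</script>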
Ultimately the pros/cons are similar. You just end up with potentially better (or worse) predictions. I suspect it isn’t much better than simple heuristics such as whether a cursor is hovering over a link or a link is in a viewport. You’d probably have to have a lot of data to keep your guesses accurate
Keep in mind that this will just help with the network load piece, not so much for the rendering piece. Often rendering is actually what is slowing down most heavy frontends. Especially when the largest above-the-fold content you want to display is an image or video
So not necessarily any website, because that could cause issues if one of the prerendered pages runs side-effectful JavaScript.
That's pretty hard to achieve.
Update: Actually, instant.page also uses the Speculation Rules API where it's supported.
https://www.gsma.com/r/wp-content/uploads/2023/10/The-State-...
Just reading "Chrome Speculation Rules API" makes my skin crawl a bit. We already have speculative CPU instructions, now we need to speculate which pages to preload in order to help mitigate the performance issues of loading megabytes of app in the browser?
I understand the benefits and maybe this is just me yelling at clouds, but it feels crazy coming from what the web used to be.
If you have a desktop app, it will also have to do the same work by fetching all the data it needs from the server, and it might sometimes cache some of the data locally (like the user profile etc...). This allows developers to load the data on user intent (hover, and some other configurable logic) instead of when the application is loaded (slow preload) or when the user clicks (slow response).
Even if the target page is 1 byte, the network latency alone makes things feel sluggish. This allows a low-effort, fast UI with a good, opinionated API.
One of the reasons I can identify Svelte sites within 5 seconds of visiting a page is that they preload on hover, and navigating between pages feels instant. This is great, and fighting against it seems unreasonable.
But I agree that in other cases, where megabytes of data need to be fetched upon navigating, using these features will probably cause more harm than good, unless applied with additional intelligent logic (if these features allow such extension).
Edit: I addressed preloading; regarding prerendering, that's a whole new set of issues which I am less experienced with. Making web apps became easier, but unfortunately their slow rendering times and other issues are a case of unmitigated tech debt that comes from making web application building more accessible.
When you go to an old page with a modern computer and internet connection, it loads instantly.
I guess it has limited applicability, but maybe it's the small gains that add up to victories. I may be going off on a tangent, but I always used to think hardware was boring, that there weren't too many optimizations, that it was all transistors with AND, OR, NOT. Then I read about all the crazy stuff like L1 caches and the marvellous machinery that computers really are, and it blew my mind to shreds. Compilers are some madman's work too; the amount of optimization is just bonkers, all for tiny gains, but those tiny performance boosts across the whole stack make everything run so fast. It's so cool.
Currently, "eagerness: conservative" activates on mouse/pointer down (as opposed to mouse/pointer up, which is when navigation normally happens), which will work for mobile devices as well. And "eagerness: moderate" includes the more conservative triggers, which means that even on devices with no hover functionality, prerendering can still kick in slightly before the navigation occurs.
So the Instagram founders worked at Google, and they found that if you had already typed your username, you had an 80% (or some similarly high) chance of creating the account, since (I think) the friction barrier had been crossed and everything is easier from that point on: why bail now, why do all that effort and then leave, I'm invested in this and I'm going to use it.
So the Insta founders basically made it so that whenever you upload a photo, it silently uploads in the background; you then usually spend some time writing a caption for the image, and in that time the picture gets loaded into the database. That's how it was so fast compared to its peers while using the same technology.
If someone scraps the picture/story and doesn't post it, they just delete it from the system.
I will link to the YouTube short, since that explained it better than I can, but it's really nice how connected things are: what I watched on YouTube is helping on HN.
I assume you would have to tailor the prefetch/prerender targets to avoid these types of links? In other words, take some care with these specific wildcard targets in the link depending on your site?
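Presumably something like the following would do it, carving out exclusions with "not" conditions (the excluded paths and class name are made up):

<!-- Illustrative exclusions: adjust the patterns to your own risky URLs -->
<script type="speculationrules">
{
  "prefetch": [{
    "where": {
      "and": [
        { "href_matches": "/*" },
        { "not": { "href_matches": "/logout" } },
        { "not": { "href_matches": "/*\\?*" } },
        { "not": { "selector_matches": ".no-prefetch" } }
      ]
    },
    "eagerness": "moderate"
  }]
}
</script>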
Seems like a great way to track users too. Will hovering over ads count as a click through? Should users have to worry about where their mouse rests on a page or what it passes over?
Tracking is a legitimate concern, but quite frankly that's already happening, and at a much finer, more granular level than anything this feature can provide. Theoretically, this gives the possibility to add slightly more tracking for users that disable JS, but given the small proportion of such users, and the technical hoops you'd need to jump through to get useful tracking out of this, it's almost certainly not worth it.
game_the0ry•1d ago
Also brings up the questions:
- should browser do this by default?
- if yes, would that result in too many unnecessary requests (more $$)?
Either way, good to know.
babanooey21•1d ago
However, Next.js pre-fetching can't perform pre-rendering on hover, which can cause a noticeable lag during navigation. The native Chrome API allows not only pre-fetching, but also pre-rendering, enabling instant page navigation.
exasperaited•1d ago
I am reminded of the infamous time when DHH had to have it explained to him that GET requests shouldn’t have side effects, after the Rails scaffolding generated CRUD deletes on GET requests.
https://dhh.dk/arc/000454.html
Google were not doing anything wrong here, and DHH was merely trying to deflect blame for the incompetence of the Rails design.
But the fact remains, alas, that this kind of pattern of mistakes is so common that prefetching by default has risks.
radicaldreamer•1d ago