Show HN: BotMode checks if your site renders correctly for Googlebot

https://pagegym.com/botmode
1•razcoj•1h ago
Hi guys,

I made this tool on top of the main PageGym analyzer. It respects the robots.txt files of all the resources on your site the same way Googlebot would, and it can also enforce the 2 MB limit for HTML and resources that Google recently revealed through its documentation.

The idea: if you depend on scripts or other resources loaded from a third-party location, access to them is governed by that third party's robots.txt. So if they're blocked, and they happen to be required for rendering parts of your content, your page might not get indexed correctly.
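To make that concrete, here's a rough TypeScript sketch of the kind of check involved. This is not PageGym's actual code, and the naive prefix matcher below ignores wildcards, `$` anchors, Allow/Disallow precedence, and group-specificity rules that a real robots.txt parser handles:

```typescript
// Check whether Googlebot may fetch a third-party resource, based on the
// robots.txt of the resource's own origin (which the third party controls,
// not the embedding site). Naive sketch; assumes Node 18+ global fetch.

async function isAllowedForGooglebot(resourceUrl: string): Promise<boolean> {
  const url = new URL(resourceUrl);
  const robotsUrl = `${url.origin}/robots.txt`;

  const res = await fetch(robotsUrl);
  if (!res.ok) return true; // no robots.txt (or fetch error): treat as allowed

  const lines = (await res.text()).split("\n");
  let applies = false; // are we inside a group that applies to Googlebot?
  const disallows: string[] = [];

  for (const raw of lines) {
    const line = raw.split("#")[0].trim(); // strip comments
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    if (/^user-agent$/i.test(field.trim())) {
      applies = /googlebot/i.test(value) || value === "*";
    } else if (applies && /^disallow$/i.test(field.trim()) && value) {
      disallows.push(value);
    }
  }
  // Naive prefix match of Disallow rules against the resource path.
  return !disallows.some((rule) => url.pathname.startsWith(rule));
}

// Example: a script your page depends on, hosted on a hypothetical CDN.
isAllowedForGooglebot("https://cdn.example.com/widget/app.js")
  .then((ok) => console.log(ok ? "fetchable" : "blocked for Googlebot"));
```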

The same thing can happen if you're hitting the 2 MB limit for HTML or scripts. Yes, quite a few websites seem to have this issue: https://pagegym.com/blog/2mb-googlebot-limit#stats

For the 2 MB limit, it first issues a warning for any violations (this check is actually part of the main tool). If you have any, you can turn on Trim mode in the reanalysis options to actually truncate the files.
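As a rough illustration of what a 2 MB cutoff looks like (again, just a sketch under my own assumptions, not how Trim mode is implemented internally): stream the response and stop reading at the limit, keeping only the bytes Googlebot would see.

```typescript
// Sketch of a 2 MB per-file cutoff. Illustrative only; not Trim mode's
// actual implementation. Assumes Node 18+ (global fetch / web streams).

const LIMIT = 2 * 1024 * 1024; // 2 MB per fetched file

async function fetchTruncated(
  url: string,
): Promise<{ bytes: Uint8Array; truncated: boolean }> {
  const res = await fetch(url);
  if (!res.body) return { bytes: new Uint8Array(0), truncated: false };

  const reader = res.body.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;

  while (total < LIMIT) {
    const { done, value } = await reader.read();
    if (done) break;
    const room = LIMIT - total; // bytes still allowed under the cap
    chunks.push(value.subarray(0, room)); // keep only what fits
    total += Math.min(value.length, room);
  }

  const truncated = total >= LIMIT; // hit the cap (flags exact 2 MB files too)
  if (truncated) await reader.cancel(); // discard the rest of the body

  // Reassemble the kept chunks into one buffer.
  const bytes = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    bytes.set(c, offset);
    offset += c.length;
  }
  return { bytes, truncated };
}

// Example: flag an oversized script bundle at a hypothetical URL.
fetchTruncated("https://example.com/bundle.js").then(({ truncated }) =>
  console.log(truncated ? "over the 2 MB limit" : "within limit"),
);
```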

Note: the reported timings are affected by the robots.txt fetches, so don't rely on them. If you want a proper load-behaviour analysis, use the main tool.