Thus, regardless of how well one optimizes site delivery (static site, minification, CDN, caching, etc.), a stampede of bot crawling eventually amounts to a DDoS; even if it doesn't take down the infrastructure, it can leave a deep hole in one's budget.
For example, on one of the sites I manage I see daily peaks of ~300 requests per second measured at the backend, for a site that already employs heavy caching (client-side, CDN, and even server-side). This wasn't the case a few months back, and the site didn't suddenly jump in popularity.
Very easy to bypass for sure, but custom enough to protect you from the horde of generic bots =p
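(To make "custom" concrete: below is a minimal sketch of such a homegrown check, assuming a Go net/http stack; the cookie name and challenge markup are invented for illustration. The idea is that generic scrapers which don't execute JavaScript never acquire the cookie, so they never get past the challenge page.)

    // Minimal sketch of a custom bot gate: a JS-set cookie check.
    package main

    import (
        "log"
        "net/http"
    )

    // challenge serves a tiny page whose inline script sets a cookie and
    // reloads; clients that don't run JavaScript just loop here.
    func challenge(w http.ResponseWriter, r *http.Request) {
        w.Header().Set("Content-Type", "text/html")
        w.Write([]byte(`<script>document.cookie = "not_a_bot=1; path=/"; location.reload()</script>`))
    }

    // withCustomCheck wraps a handler and demands the cookie first.
    func withCustomCheck(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if c, err := r.Cookie("not_a_bot"); err != nil || c.Value != "1" {
                challenge(w, r)
                return
            }
            next.ServeHTTP(w, r)
        })
    }

    func main() {
        mux := http.NewServeMux()
        mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            w.Write([]byte("hello, human\n"))
        })
        log.Fatal(http.ListenAndServe(":8080", withCustomCheck(mux)))
    }

Trivial for a targeted attacker to bypass, but precisely because it's nonstandard, off-the-shelf crawlers don't bother implementing it.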
Back then, the "select all boxes with traffic lights" kind of challenge was already ambiguous enough; now they've even started generating AI images for it, and to be honest, I can't get it right even 40% of the time...
... And the actual bots are able to do that better than me. What an absurd time to live in.
ciprian_craciun•5h ago
I agree that it's sad to see many online bookstores moving from "selling" to "renting". But that's a completely different problem.
As a personal note, I know the pain of not being able to access scientific papers because they were behind paywalls; I had to search for drafts just to be able to read them. But that model was already well in place circa 2010, so it's an old tactic applied to a new field: books (and others).