"Why did you choose Go for your backend?"
"Well, the lightning strikes it implements deliver critical elec-damage against the customers that get past the ice blast from our DATA CACHE-powered FUNC, striking them in their frozen-status for x3 before they can get to our vital CPURAM resources. The most important thing to have in a cloud-based deployment is a diverse mix of elemental attacks and crit-stacking bonuses."
At my previous job, I became responsible for handling >350,000 requests/second, so I dug deep.
Just to push against this a bit: Redis can have a very low memory cost and is very easy to run (I give it 5 MB). I have a small single server with a few instances of my API, which lets me cache pretty much everything I need.
Of course Redis can do other things than just act as a k/v cache, and at scale you just want to offload some load from your main SQL server. But for "small" use cases my conclusion was that Redis doesn't really add anything. OTOH it's not especially difficult to run either, so it's also not a big problem to use it, but by and large it seems superfluous.
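For the "small" case above, the alternative to Redis is usually just an in-process cache. A minimal sketch in Python of what that might look like (hypothetical `TTLCache` class; unlike Redis it has no eviction policy, no size cap, and isn't shared between processes, which is exactly the trade-off being discussed):

```python
import time

class TTLCache:
    """Tiny in-process key/value cache with per-entry expiry.

    Illustrative only: no eviction policy or size cap, and unlike
    Redis it is not shared across processes or API instances.
    """
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Entry expired: drop it and report a miss.
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=30)
cache.set("user:42", {"name": "Ada"})
print(cache.get("user:42"))  # {'name': 'Ada'}
```

The moment you run several API instances and need them to share cached state (or need persistence, pub/sub, etc.), this stops being enough and Redis starts paying for itself.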
I must admit I had no idea what goes into running a CDN. First I had a look at DO's Spaces object storage, which has CDN support. But apparently it isn't quite the right tool for serving a static website; it's aimed at serving large files. For example, I couldn't make it serve CSS files with the correct MIME type without some voodoo, so I had to conclude I wasn't doing the right thing.
Then I looked at DO's App Platform. But that seemed like overkill for just sharing some static content. It wants me to rely on an external service like GitHub to reliably serve the content. I already rely on DO; I don't want to additionally rely on something else too. It seems I could also use DO's Docker registry. What? To serve static content on a CDN I need to create a whole container running a whole server? And what do I need to take into consideration when I want to update the content once per day, simultaneously for all users? That's easy when it's all on my single VPS (with caching disabled for that URL), but I actually have no idea what happens with the Docker image once it's live on the "app platform". This is getting way more complex than I was hoping for. Should I go back to the Spaces solution?
Right now I'm in limbo. On one hand I want to be prepared in case I get lucky and my thing goes "viral". On the other hand my tiny VPS is running at 2% CPU usage with already quite a few users. And if I do get lucky, I could afford to double my VPS capacity. But what about protection from DDoS? Anything else I should worry about? Why is everyone else using a CDN?
And I don't even have caching! An article like this puts my problem to shame. I just want to serve a couple of plain web files and I can't decide what I should do. This article really shows how quickly the problem starts ballooning.
Configuration depends a lot on the specifics of your stack. For Svelte, you can use an adapter (https://svelte.dev/docs/kit/adapters) that will handle pointing to the CDN for you.
Cloudflare's offering is free, bunny.net is also probably going to be free for you if you don't have much traffic. CDNs are great insurance for static sites, because they can handle all the traffic you could possibly get without breaking a sweat.
If you want to be fancier, you could use a private GitHub repo + Cloudflare Pages, enabling you to prepare puzzles in advance.
You can get quite far with both services' free offerings or Cloudflare's $5/mo tier.
Full disclosure: I work for neither company, but I have regularly been a happy paid customer of both products.
Varnish Cache has supported that since version 4.1, released 2015-09-30:
> Varnish will now use the stale-while-revalidate defined in RFC5861 to set object grace time.
https://varnish-cache.org/docs/trunk/whats-new/changes-4.1.h...
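For context, stale-while-revalidate (RFC 5861) lets a cache serve a recently-expired response immediately while it refetches in the background; Varnish maps this onto its "grace" period. A minimal Python sketch of the decision logic (hypothetical `classify` function, not Varnish's actual implementation):

```python
def classify(age, max_age, swr):
    """Classify a cached response per RFC 5861 stale-while-revalidate.

    age:     seconds since the response was stored
    max_age: Cache-Control max-age
    swr:     stale-while-revalidate window (Varnish's "grace" time)
    """
    if age <= max_age:
        return "fresh"        # serve from cache, no revalidation needed
    if age <= max_age + swr:
        return "stale-serve"  # serve stale now, revalidate in background
    return "miss"             # too stale: fetch synchronously from origin

# e.g. for: Cache-Control: max-age=600, stale-while-revalidate=30
print(classify(500, 600, 30))  # fresh
print(classify(620, 600, 30))  # stale-serve
print(classify(700, 600, 30))  # miss
```

The point of the middle branch is that the client never waits on the origin: only requests arriving after the grace window has also expired pay the full fetch latency.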
ec109685•12h ago
That math works until it doesn't. If for some reason a response takes longer than 15 seconds, the cache will open the floodgates and won't return any cached response until the cache is full.
Similarly, errors need to be accounted for. If the server returns a 500, is it configured to be cached? Is stale-if-error configured so the error state doesn't take the server down?
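stale-if-error (also from RFC 5861) is the companion directive: when the origin starts failing, the cache keeps serving the last good copy instead of fanning the errors out to users. A hedged sketch of that fallback logic (hypothetical `respond` function, not any particular cache's implementation):

```python
def respond(cached, cached_age, max_age, sie, fetch):
    """Serve from origin, falling back to a stale cached copy on 5xx,
    in the spirit of RFC 5861 stale-if-error.

    cached:     last stored body, or None
    cached_age: seconds since it was stored
    max_age:    freshness lifetime (Cache-Control max-age)
    sie:        stale-if-error window in seconds
    fetch:      callable returning (status, body) from the origin
    """
    if cached is not None and cached_age <= max_age:
        return 200, cached  # still fresh: no origin hit at all
    status, body = fetch()
    if status < 500:
        return status, body
    # Origin is erroring: serve stale if still inside the window.
    if cached is not None and cached_age <= max_age + sie:
        return 200, cached
    return status, body

# Origin down, stale copy 100s old, max-age=60, stale-if-error=300:
print(respond("old page", 100, 60, 300, lambda: (503, "oops")))
# -> (200, 'old page')
```

This also illustrates the commenter's point: if you cache the 500 itself, or the stale window is too short, the "protection" disappears exactly when the origin needs it most.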