Is this true? He mentions the provider is AWS; surely some sort of spending threshold can be set?
In addition, it's a pay-per-use platform
1. High egress costs
2. No hard spending limits
Both of these were problems for the author. I don't mean to "blame the victim" but the choice of AWS here had a predictable outcome. Static documentation is the easiest content to host and AWS is the most expensive way to host it.
Sure, but it's unlikely you actually have to place a CDN in front of your manual; it's mostly text with few images. People default to using CDNs way too quickly today.
People simply do not understand how expen$ive AWS is, and how little value it actually has for most people.
You can just drag and drop a folder and have a static site hosted in a few minutes.
A lot of other people also pick it for very narrow use cases where it wouldn't have taken that much more time to learn and do it themselves; they end up paying a lot of money and aren't happy either.
It's pretty nice for mid-size startups to completely ignore performance and capacity planning and be able to quickly churn out features while accumulating tech debt and hoping they make it long enough to pay the tech debt back
Personally, software engineering for me is mostly about trying to avoid accidental complexity. People obsessing over "web scale" and "distributed architecture" before they've even figured out whether anyone actually wants to use the platform/product/tool they've built tend to add a lot of complexity.
You're welcome to set up the CDN with a CLI...
That's not really true if you care about reliability. You need 2 nodes in case one goes down/gets rebooted/etc, and then you need a way to direct traffic away from bad nodes (via DNS or a load balancer or etc).
You'll end up building half of a crappy CDN to try to make that work, and it's way more complicated than chucking CloudFlare in front of static assets.
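The "direct traffic away from bad nodes" half is the part people underestimate. A minimal sketch of that logic (names and structure are my own illustration, not any particular load balancer's API): round-robin across nodes, skipping ones a health check has marked bad.

```python
import itertools

class Balancer:
    """Toy round-robin picker that skips nodes marked unhealthy."""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.healthy = {n: True for n in self.nodes}
        self._cycle = itertools.cycle(self.nodes)

    def mark(self, node, healthy):
        # In real life this is driven by periodic health checks.
        self.healthy[node] = healthy

    def pick(self):
        # Try each node at most once per call.
        for _ in range(len(self.nodes)):
            node = next(self._cycle)
            if self.healthy[node]:
                return node
        raise RuntimeError("no healthy nodes")
```

And that's before you've dealt with the health checks themselves, flap damping, or draining connections, which is the point: a managed CDN gives you all of this for free.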
I would be with you if this was something complicated to cache where you're server-side templating responses and can't just globally cache things, but for static HTML/CSS/JS/images it's basically 0 configuration.
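"Basically 0 configuration" in practice means one cache-header rule. A sketch of the whole policy for a static docs site (the max-age values are illustrative choices, not canonical ones): cache assets essentially forever, keep HTML short-lived so doc updates propagate.

```python
import mimetypes

# Illustrative values: long-lived for immutable assets (CSS/JS/images),
# short for HTML so updated docs show up quickly.
STATIC_MAX_AGE = 60 * 60 * 24 * 365   # 1 year
HTML_MAX_AGE = 60 * 5                 # 5 minutes

def cache_control(path: str) -> str:
    ctype, _ = mimetypes.guess_type(path)
    if ctype == "text/html" or ctype is None:
        return f"public, max-age={HTML_MAX_AGE}"
    return f"public, max-age={STATIC_MAX_AGE}, immutable"
```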
While reliability is always some concern, we are talking about a website containing docs for a nerdy tool used by a minuscule percentage of developers. No one will complain if it goes down for 1h daily.
But, honestly, for this: just use GitHub Pages. It's OSS and GitHub is already used. They can use a separate repository to host the docs if repo size from assets (e.g. videos) is a concern.
After initial setup it was smooth sailing. Other more reasonable setups would also have been smooth sailing, but... they weren't set up yet. I was uneasy about the possibility of a surprise bill happening, as it eventually did, but until the brain-dead LLM leeches came along, that just never happened. After a decade of it not happening, I wasn't that concerned anymore, but I guess when it comes to the AI bots, I had my head in the sand a bit. I still thought something like a 500% bill might happen, not 5000%.
Once it did happen, I immediately shut my sites down, and within the hour the account was no more. On the way out I saw that you can now actually set a "spending limit"; it still had a [new] next to it. I tried setting it up, but could only quickly figure out how to set up a notification. It might be possible to set an actual spending limit, but not in a few minutes -- probably got to read some documentation for that.
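For context, the feature in question is (as far as I can tell) AWS Budgets, and out of the box it alerts rather than hard-caps spend, which matches the author's experience. A sketch of what the alert definition looks like via boto3; the account ID, amount, and email are hypothetical placeholders:

```python
# Assumption: the "spending limit" is AWS Budgets, which by default
# only *notifies* when a threshold is crossed -- it does not cut off
# spending. All concrete values below are made-up placeholders.
budget = {
    "BudgetName": "monthly-cost-alert",
    "BudgetLimit": {"Amount": "10", "Unit": "USD"},
    "TimeUnit": "MONTHLY",
    "BudgetType": "COST",
}
notification = {
    "NotificationType": "ACTUAL",
    "ComparisonOperator": "GREATER_THAN",
    "Threshold": 80.0,            # percent of the budget limit
    "ThresholdType": "PERCENTAGE",
}
# import boto3
# boto3.client("budgets").create_budget(
#     AccountId="123456789012",
#     Budget=budget,
#     NotificationsWithSubscribers=[{
#         "Notification": notification,
#         "Subscribers": [{"SubscriptionType": "EMAIL",
#                          "Address": "you@example.com"}],
#     }],
# )
```

Actually stopping spend requires wiring that alert to a budget *action* or a Lambda that disables things, which is exactly the "got to read some documentation" part.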
But even if this were a one click setting, it wouldn't have made a difference at this point. You do this once and I am gone. Also, I wanted to move away from Amazon anyway, so really, this was the kick in the pants that I needed.
For now I am using Github Pages for the very static parts, and the free hosting provided by my email provider for the slightly less static manuals generated with Github Actions. It would have made sense to use Github for both (not least so that Microsoft could cover the cost of the bots they have unleashed), but I wanted to avoid the complexity of committing to the same pages repository from the CI pipelines of multiple package repositories.
I donated a bit of money to help tarsius offset the cost of AWS LLM abuse, well deserved for the value I've gotten from his tools.
Yesterday evening I saw that I had a few new sponsors and was wondering where they had come from.
So in the end something good came of it. The one-time donations covered the bill, and I also got a few new monthly sponsors. (Well, unless you also take into account the hours it took me to move to new hosting; then it's way, way below minimum wage, but as a maintainer of free software, I am used to that by now.)
Sooo... I guess I should take the opportunity and do a bit of marketing. I am still making a living maintaining Magit et al., so please consider sponsoring my day to day work too. Thanks!
Here's a nice repo with details on how to support them!
https://github.com/tarsius/elisp-maintainers
Also worth pointing out that the author of Magit has made the unusual choice to make a living off developing Emacs packages. I've been happy to pitch in my own hard earned cash in return for the awesomeness that Magit is!
macintux•2mo ago
No wonder small businesses just put their information on Facebook instead of trying to manage a website.
embedding-shape•2mo ago
Are people not using fail2ban and similar at all anymore? It used to be standard practice until, I guess, people started using PaaS instead and "running web applications" became a different role from "developing web applications".
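For anyone who hasn't seen it, fail2ban's core idea is small: ban a client after `maxretry` offending hits within a `findtime` window. A toy sketch of that logic (parameter names borrowed from fail2ban's jail config, values illustrative; real fail2ban tails log files and inserts firewall rules):

```python
import time
from collections import defaultdict, deque

class Jail:
    """Toy fail2ban-style counter: ban an IP after `maxretry`
    failures within `findtime` seconds."""

    def __init__(self, maxretry=5, findtime=600):
        self.maxretry = maxretry
        self.findtime = findtime
        self.failures = defaultdict(deque)
        self.banned = set()

    def failure(self, ip, now=None):
        now = time.time() if now is None else now
        q = self.failures[ip]
        q.append(now)
        # Forget failures that fell out of the window.
        while q and now - q[0] > self.findtime:
            q.popleft()
        if len(q) >= self.maxretry:
            self.banned.add(ip)   # real fail2ban would add a firewall rule

    def is_banned(self, ip):
        return ip in self.banned
```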
nijave•2mo ago
Even harder with IPv6 considering things like privacy extensions where the IPs intentionally and automatically rotate
kstrauser•2mo ago
I went as far as blocking every AS that fetched a tripwire URL, but ended up blocking a huge chunk of the Internet, to the point that I asked myself whether it’d be easier to allowlist IPs, which is a horrid way to run a website.
But I did block IPv6 addresses as /48 networks, figuring that was a reasonable prefixlen for an individual attacker.
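The /48 approach is easy to do with Python's stdlib `ipaddress` module; the addresses below are documentation-range examples, not real attackers. Collapse each offender to its /48 and test later hits against those networks:

```python
import ipaddress

def to_block(addr: str) -> ipaddress.IPv6Network:
    # strict=False lets us derive the /48 from a host address.
    return ipaddress.ip_network(f"{addr}/48", strict=False)

# Example offender from the IPv6 documentation range (2001:db8::/32).
blocked = {to_block("2001:db8:abcd::1")}

def is_blocked(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in blocked)
```

Blocking the whole /48 sidesteps the privacy-extension rotation mentioned above, since the rotating addresses stay inside the same delegated prefix.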
nijave•2mo ago
There are also solutions for generating static content sites, instead of a "dynamic" CMS where everything is stored in a DB.
If it's new, I'd say the easiest option is to start with a content hosting system that has built-in caching (assuming one exists for what you're trying to deploy).