> GitHub Pages is not intended for or allowed to be used as a free web-hosting service to run your online business, e-commerce site, or any other website that is primarily directed at either facilitating commercial transactions or providing commercial software as a service (SaaS).
Not finding a similar mention for Cloudflare... commercial sites are fine there?
So a small company could host a static landing page with generic info and "contact us" etc., and that would be fine, I think?
It also mentions that breaking the rules will result in getting a "polite email suggesting ways to reduce load on Github".
Curious though how it handles a surge in requests, like from being on the front page of HN. But many open source projects host their docs with GitHub Pages, and some of those get a lot of traffic, so I'm sure it's not an issue.
curl -i https://simonw.github.io/
I get this:

HTTP/2 200
server: GitHub.com
content-type: text/html; charset=utf-8
permissions-policy: interest-cohort=()
last-modified: Wed, 16 Nov 2022 21:38:29 GMT
access-control-allow-origin: *
etag: "63755855-299"
expires: Wed, 23 Apr 2025 18:20:50 GMT
cache-control: max-age=600
x-proxy-cache: MISS
x-github-request-id: 3D02:22250F:11BEDCA:123BE7A:68092D2A
accept-ranges: bytes
age: 0
date: Wed, 23 Apr 2025 18:10:50 GMT
via: 1.1 varnish
x-served-by: cache-pao-kpao1770029-PAO
x-cache: MISS
x-cache-hits: 0
x-timer: S1745431851.518299,VS0,VE110
vary: Accept-Encoding
x-fastly-request-id: 0df3187f050552dfde088fae8a6a83e0dde187f5
content-length: 665
The x-fastly-request-id is the giveaway.

I however use CF workers a lot to deploy single-purpose webapps for my personal use.
GitHub has an automated action that then uses Pelican (a Python-based static site generator) to convert it to HTML and deploy it to my VPS, where it is served by Caddy.
Makes it very easy to have a WYSIWYG interface, the blog pages look basically identical to Obsidian.
https://mordenstar.com/blog/obsidian-to-pelican-exporter
Pelican static site generator:
I really wanted to use animated WebP, but iOS decoding is SUPER unreliable, often resulting in choppy, inconsistent framerates.
One thing I don't do, but I know is more common, is using <picture> elements to serve scaled assets depending on whether the user is coming from mobile or desktop.
Depending on what you use for your blog, you might look and see if the SSG has plugins for media optimization. By the time I figured that out, I had already handrolled my own. :p
A friendly tip: you don't have to populate the metadata yourself; `hugo new <dir>/<post_name>.md` creates the file with the metadata filled in.
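For anyone new to Hugo, a minimal sketch of that tip (the section and filename here are just examples):

```shell
# Hugo generates the file with front matter (title, date, draft: true)
# taken from your archetype template -- no need to type it by hand.
hugo new posts/my-new-post.md
```

The generated fields come from `archetypes/default.md` (or a per-section archetype), so you can customize what gets pre-filled.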
No new editors to learn and one gets instant access to copilot etc.
[1] https://simonwillison.net/2023/Apr/2/calculator-for-words/#c...
I didn't know about Cloudflare pages, thanks for sharing!
I use Jekyll, Github pages and Cloudflare. I use hackmd for editing but Obsidian will work as well.
I tried some generators but it was so much more complicated than writing a style sheet and some pages. Maybe for some more complex use-case, okay, I get it, but the author's blog is so minimal and simple.
edit: today I learned people have very strong opinions about static site generators. Good valid reasons, y'all. Maybe my use case really is too simplistic!
The goal of generators is to reduce the friction of taking your notes/articles/etc. and wrapping them in thematically consistent HTML/CSS for your site. Once you've got it tuned/setup for your blog, they're pretty easy to use.
Obviously, if in your use case you find static site generators more complicated, you can stick with raw HTML.
Obsidian is a nice middle ground between WYSIWYG and plain text - it doesn't send markup characters into the ether but at the same time does show you something close to the final text in real time.
Closest thing we've had to WordPerfect's Reveal Codes in decades.
It takes me nine characters plus URL and alt text in markdown using Hugo. I would be surprised if you get it right on the third try without consulting MDN, researching, doing image conversions, and fiddling with the <picture> tag and its srcset attribute.
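For comparison, a hand-written responsive image roughly along those lines might look like this (the filenames and sizes are illustrative, not from any site mentioned here):

```html
<!-- Serve a WebP where supported, with width-based selection;
     the JPEG <img> is the fallback for older browsers -->
<picture>
  <source type="image/webp"
          srcset="photo-480w.webp 480w, photo-1024w.webp 1024w"
          sizes="(max-width: 600px) 480px, 1024px">
  <img src="photo-1024w.jpg" alt="Description of the photo"
       width="1024" height="768" loading="lazy">
</picture>
```

versus the markdown `![Description of the photo](photo.jpg)`, which the generator expands for you.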
Compare https://fabiensanglard.net/fastdoom/ to https://fabiensanglard.net/fluide/.
Or at least, you will have as you realize things like you only wanted to do that on certain categories or whathaveyou.
You can in principle just write HTML with no script support, but that itself becomes an exercise in conspicuous asceticism. It is not unreasonable to want things like "the last 5 posts" on every page of a blog, or an RSS feed, or any number of other very basic, simple things that are impractical with an editor and raw HTML.
- the ability to update every page on my site at-once in a uniform fashion (want to change the page layout or add a link in the footer, either you're manually editing a hundred HTML files or a couple lines in a single template file)
- Syntax highlighting. My blog has code blocks so I would have to manually invoke a syntax highlighter for each code block (and every time I update the code I'd have to do it again).
- Auto-invoking graphviz and gnuplot. Similar to the code blocks, I write graphviz dot and gnuplot blocks as code in my blog posts and have my static site generator render them to images. If I was manually writing the HTML, I'd either end up committing the built binaries (rendered images) to git (which, of course, is bad practice but ultimately inconsequential when talking about small SVGs for the few charts/graphs on my blog), or I'd need a "build" step where a script invokes graphviz/gnuplot to output the images - which is the first step on the road to a static site generator.
- Avoiding name collisions with internal anchor links (the kind with the `#` symbol that scrolls the page). I use these for footnotes and code block references. I could manually assign names for these for each blog post, but when combining multiple blog posts into a single list-view for the home page, there is the risk that I might reuse the same `#fn1` anchor across multiple blog posts. Using a static site generator, I don't need to concern myself with naming them at all and I have my static site generator assign unique names so there are no conflicts.
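As a sketch of what such a hand-rolled "build step" might look like for the graphviz case before it grows into a full static site generator (the `posts` path is illustrative, and it assumes graphviz is installed):

```shell
# Render every .dot file under posts/ to an SVG next to it
find posts -name '*.dot' | while read -r f; do
  dot -Tsvg "$f" -o "${f%.dot}.svg"
done
```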
On my end I ended up building an entire custom thing that bastardizes SvelteKit to produce a static website that I then upload to GitHub Pages, but I think over-engineering a personal website is always good fun - and hey, I can tweak lots of silly aspects like how my post names get turned into /nice-and-readable-urls.
Out of curiosity, what's the advantage of using Cloudflare Pages over GitHub Pages? Both seem to require a GitHub repository.
A fun hobby of mine is Googling “how I built this blog with [next.js/Gatsby/etc etc]” and going to page 10 of the Google results.
It’s just hundreds of developer blogs with no blog posts other than the ones announcing how the blog was built.
"Weird Dude Who Writes Raw HTML"
My most recent example:
Is there a name for this phenomenon where this actually turns out to be true? Closest I can think of is "During a gold rush, sell shovels".
I have a 7,000 word blog post on how and why I switched which I didn't publish yet because I wanted to wait 6+ months with Hugo to make sure I ironed out all of the kinks. I guess I'm the anti-case for this one! Maybe I should post this one next week. It's been quite stable.
Why???
I actually prefer reading like this, even on desktop. It feels like it causes less eye strain. Though I might prefer if it was closer to 65rem.
Yes that was the intention.
https://matthewbilyeu.com/blog/2025-04-06/blogging-via-email
How is Obsidian for correcting this? Years ago I would have used something like Grammarly to solve it, but I'd rather have something built in if possible, and make it as brainless as possible.
Jekyll is slow for large content (my blog is huge), Hugo is fast. But I want to stay as mobile and lean as possible. I've tried and with a few changes in the template, I can move from Jekyll to Hugo in a weekend. I've also tried to stay as decoupled as possible from any templating engine (Jekyll), and hence rely mostly on pure HTML + CSS, while Jekyll is used more as the task runner. All the posts are separated by "YYYY" as folders and none of the posts have frontmatter.
I can also move between Github Pages, CloudFlare Pages, Netlify, or Vercel within hours if not minutes by just pointing the DNS to the ones that I want. I did this when Github Pages went down quite a few times about 3 years ago.
I almost went Markdown > Pandoc + Make but not worth it right now.
What hit me ( hard ) wasn't the blogging set up, it was this:
>And if anything’s unclear, LLMs like ChatGPT or Claude are great for filling in the gaps.
For people like me who grew up before the Internet was a thing: if we didn't understand something, we either had to go to the library to look it up or find someone to help. Then it was Encarta. When the Internet arrived I could look things up faster, or more importantly, if I was stuck anywhere I could ask on IRC or forums for help.
I am sensing that a large part of learning on forums and online will be gone. "Read the manual" becomes "ask the LLMs"?
And I am sensing (no offence) the generation growing up with LLMs may be far worse off. We already have a generation of people who think Googling the answer is 99% of the work, when in fact the answer is not important. It is the process of getting to that answer that is largely missing from today's education.
Sorry about the off topic message. But that sentence really hit me differently.
It's easy to fret about, "How will the next generation survive in the world I grew up in, without the skills I developed?"
But the answer is, they won't. Just like you don't need the same skills a caveman had because you live in a thoroughly transformed world, the kids of today won't need the same skills you had because they'll live in a thoroughly transformed world.
Ofc some good or important things will always be lost from one generation to the next. But that's okay. Still, humanity marches onward.
I do see the "I asked ChatGPT" response more and more, and initially had a similar feeling, but I think it's still early days for LLMs. Will they be around in 10 years and ubiquitous like search engines? Who knows. But undoubtedly they will get better and more accurate over time. Just like the early internet, which had a lot of bad information on it and got better over time.
This might also be a divide between different types of people. Personally I am very curious and want to really understand how something works so I get tons of information that won't help me solve a problem, but I understand the tool or part better. I would guess you might be in the same basket. But there are also people who just want the answer to solve the problem. They don't care how it works they just want it to work. And that's fine. It takes all kinds. Not everyone needs to have a masters in CS to use a computer or program one. Best we can do is try and nurture curiosity among other people and help them figure out ways to learn more and more.
(I would use Obsidian Publish, but it rendered far too slowly on some pages. I do use their excellent sync service though.)
It is just centralizing the web. You can do a lot with a $4 droplet if a single board computer isn't your cup of tea. Not "buying" into iCloud/Cloudflare is alone worth that cost. It's also a much more meaningful stack to learn.
Nothing against the post/author, I just feel the creativity to "exploit" features of the giants that are put in place just to undermine alternatives is misplaced.
I do almost the same, but instead I use Astro.
I use Obsidian, with a GitHub submodule for the content, and host it all as a static page. I wrote about that here if anyone is interested: https://bryanhogan.com/blog/obsidian-astro-submodule
The only thing I want is to implement a gui for adding and editing posts.
Haven't heard of dev containers like that before, but cool to see that they can be used like that.
It’s funny because we could ostensibly switch to any git host, but it’s really only GitHub and GitLab, huh? And Cloudflare is hard to beat.
One thing I don't see the author mention that is part of what I plan to do with Obsidian is use Syncthing (which I already use for other things) so I can work on a post when I'm not at my laptop. Probably just to write down ideas/notes and then fully work it out when I get to my laptop.
If the blog author is here, curious if they commit drafts to their repo or not. I personally don't commit drafts. Besides using `draft: true` in the front matter, I gitignore any markdown file where the filename starts with the word "draft". When I'm ready to publish, I rename the file.
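For anyone wanting the same setup, the ignore rule is a one-liner (the pattern is my guess at the convention described; adjust to your content directory):

```
# .gitignore -- skip markdown files whose names start with "draft"
**/draft*.md
```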
Yeah, I do commit drafts. My repo’s private, so I don’t mind keeping everything versioned there, including posts still marked as draft: true.
I've been doing it for 20+ years (xitami and thttpd before nginx), and it not only has an infinite lifetime (because it's .html and files) but also has no attack surface or maintenance required.
All that static site generator and remote corporate services stuff will break within a year if left untouched. And re: security, running a static nginx server from home is about 10,000x less of a security risk than opening a random page in $browser with JS enabled. nginx RCEs are less than once a decade. And DoS re: home IP? A vastly over-stated risk for most human people. In my 20+ years I've never had a problem, and even if I ever do, who cares? It's not like my personal website needs nine 9s of uptime. It can be down for days, weeks, with no issue. It's just for fun.
That’s part of why I prefer hosting the static output somewhere external. Not perfect, but it lets me step away from the setup for months and still have it running.
As for IP, when it changes you can just copy the new IP and stop sending links with the old IP to friends. It's not a big deal. But a domain is nice (either some dyndns subdomain or a real tld with free DNS hosting (and dyndns updates) by zoneedit or the like).
I know you have acknowledged the decision to entrust nginx with all of your personal data and tax records and bank statements and legal documents and browser history and GitHub credentials and ssh private keys and so on.
But it's still madness. You are one oversight, accident, or bug away from total pwnage.
.
├── .devcontainer
│   └── devcontainer.json
├── content
│   └── posts
│       ├── 1718983133-post-1.md
│       ├── 1720321560-post-2.md
│       └── 1740070985-post-3.md
├── go.mod
├── go.sum
├── hugo.toml
└── README.md
3 directories, 8 files

`themes` are small in size, why not copy them into your repository to keep the build hermetic?
If I want to do a post, I log in, draft the post in a simple rich-text editor with image support and keyboard shortcuts for formatting, and click "publish." I don't have to fool with anything, there is no chance of sync breaking, and it's instantly responsive.
The back-end is stored in Github, but the posts are stored, with revision history, in a Postgres database that I have full access to.
It's hard to envision a scenario where I'd prefer digging through a git repository to see a previous version of a post rather than just clicking into the CMS site and clicking on the historical version of the post that I'd like to look at, where it is instantly displayed including images. And even with daily blogging, the number of times I've actually looked at a prior version of a post is very low -- probably less than once a year.
Keeping Hugo installed and up to date as part of the publish process seems like a headache as well. I like the blog to be totally separate from my local machine, so if I change anything or switch laptops, it doesn't interfere with the publication process.
Manually adding the Hugo front matter to each post also strikes me as annoyingly fiddly, although you could use a text expander app to handle most of it. Another issue is that I'm not sure that Markdown would do well for the full scope of formatting, such as aligning images and adding image captions.
It seems this post talks about blogging from the desktop. But I just installed Obsidian on Android, it allows a filesystem vault. I think pairing it with Syncthing and some automation on my NAS (to do a git push to Github/Gitlab) could make it very streamlined.
I stopped uploading to Instagram the day they started using images for AI training.
If any of the file syncing applications work directly on the filesystem, I think you can use Obsidian on these folders and it'll sync automatically. On iOS, Obsidian defaults to iCloud, for example.
FWIW the CMS is Decap CMS and I have it configured likewise with Cloudflare Pages (since Pages supports functions that are needed for the auth/callback cycle).
Shameless plug for my AI blog run on Hugo -- https://reticulated.net/
Ps: Folks should chill out about wording here and there.
Dropbox and similar services have never been seamless for me with Obsidian. There will inevitably be some conflict when the same file gets edited on two machines, and an ugly conflict file will appear, with no easy way to resolve it without a specialized diff tool. Sometimes this conflict file will go unnoticed and a change will fall through the cracks. Not a deal breaker - but not "seamless" either.
- you haven't set up auto save/auto sync
- multiple people are editing the documents
- you frequently have no Internet and the application is closed when you regain Internet, or similar
If only a single person edits the same document at a time and you always sync changes when they happen, that should be a non-issue.
Relay attaches a CRDT to the Obsidian editor. It makes collaboration really smooth and removes conflicts.
Markdown collaboration is free, but we do charge for attachment/media storage. If you dm me (@dtkav) on our discord [1] or email me (in my profile) I'm happy to give out free storage to HN folks.
One other benefit of using Relay is that you can use the new "App Storage" feature on Android to keep your files isolated and private on your device [2]. Using dropbox/gdrive forces you to store your notes in your documents folder and that means that they can be read by other apps.
[0] https://relay.md
[1] https://discord.system3.md
[2] https://obsidian.md/changelog/2025-04-22-mobile-v1.8.10/
Basically, I saw that Nextcloud built their own realtime text editor based on TipTap, so I created an Obsidian extension to connect to it.
Unfortunately work and uni got in the way but it was a very interesting idea/learning experience.
[submitted title was "How I Blog with Obsidian, Hugo, GitHub, and Cloudflare – Zero Cost, Fully Owned"]
Edit: I guess you mean the content itself is still on your machine if the services go away, and you can choose to host them elsewhere
- Content is just Markdown files in my local Obsidian vault
- Hugo builds the site locally - no dependency on external editors or platforms
- GitHub is just used for version control and deployment
- Cloudflare Pages handles static hosting, but I can move it elsewhere anytime
Would be nice to coin an unambiguous term for this as it's a useful design goal.
But I think "fully-owned" is the wrong term for pushing up to GitHub and then deploying to Cloudflare Pages - that is definitely not "fully-owned".
You're right that hosting on GitHub and Cloudflare isn't infrastructure ownership. I should’ve been more precise with the wording.
The OP's "fully owned" is analogous to someone else doing the printing for the privilege of spying on your readers.
It's not really fully-owned, but it's owned in the ways that matter most
Two of which are services operated by a corporate entity and one of which is a closed source piece of software.
The only thing "owned" here is the fact that the entire blog is simple markdown and the domain name. However, that doesn't mean it's very portable. It's not impossible, but it's a lot more work than I would want to do.
The internet as a whole relies on a huge variety of services all working as they should.
I agree it’s not “self-hosted”. But compared to a closed CMS or paid platform, it feels meaningfully more in my hands.
Also, I like treating my blog as a version-controlled, declarative "codebase" that's just a bunch of plaintext files (no MySQL tables, XML or JSON to crawl through).
What if it's a dedicated machine, colocated? What if it's at home, but I pay an ISP?
edit: Downvoted, care to explain why? I genuinely wrestle with this question. I self-host lots of services at home - and I also self-host services on a cloud VPS, which have better availability and security posture with regards to my home network for things I make public or semi-public. Some have told me this isn't "self-hosting" and I am not sure where the line is drawn.
Anyway, the important thing is being in control of your own data. With proper off-site backups and reproducible setups using containers, migrating between VPS providers should usually take just a few hours.
I fully understand the arguments for (and against) managing your own server. But I've not been convinced by any arguments for that server being in your house/office rather than a climate controlled warehouse somewhere.
Well, unless you enjoy setting up and managing the physical hardware yourself of course. That's fully reason enough.
I think blogs should be built like this to make preservation easier. I'd love to have something that makes content domain-agnostic - more like git, allowing people to clone and distribute content without having to guess when to pull and archive if they want to keep track of things.
Personally I feel if you can quickly pull out of a provider and host elsewhere with maybe just a config change - aye the data is fully owned, close enough.
Those risks still exist. GitHub and Cloudflare can do these things at any moment.
> This stack gives me full control: no subscriptions, no vendor lock-in,
> and no risk of platforms disappearing or changing policies.
I'm not trying to dunk on the author, but this sentiment encapsulates a tremendous paradigm shift in the meaning of "full control", compared to say:

* Writing in Obsidian
* Syncing files via git
* Publishing via nginx hosted on the cheapest possible VPS (with HTTPS via Let's Encrypt)
Running a static blog is one of the easiest scenarios to self-host, because even if you get slashdotted, nginx should be able to handle all the requests even if you host it on a potato.
It's not free, but you can get a VPS for $20-$30 a year.
This isn't the best fit for everyone, but it seems weird to talk about "full control" and "no risk of platforms disappearing" when you're relying on the free tier of two external companies
The overhead of switching from Cloudflare to a VPS if they needed to is really not that much larger than switching from one VPS to another, so they likely figured it wasn't worth paying $30/yr to own the whole stack, as long as they architected it such that they could own the whole stack if they needed to.
The one thing I do differently with Obsidian is that I use a private git repo rather than having it live in iCloud. I sync it across an iphone, ipad, and windows desktop.
Note that Obsidian's markdown editing experience is _different_ from (but not necessarily better or worse than) what you'd get in a typical IDE. So while the choice seems weird to me, it absolutely makes sense if the author prefers the feature set that Obsidian offers. Being supported by so many different editors is one of markdown's strengths, after all, and this kind of editor-portability fits right in with the other parts of "Fully Owned" from the blog post.
But I honestly despise writing raw markdown in an IDE. If I'm writing (not coding), I need it to be somewhat visual, which means I want WYSIWYG -- and Obsidian is an excellent markdown editor, even if you don't use the other features.
My reasons for not liking writing "raw" markdown:
- Long links take up too much space. I put it in text so it'd be hidden
- No syntax highlighting in code blocks
- Text wrapping/font is typically not configured for easy reading of text
- A ton of markdown features are for visual formatting. Highlighting, bold, underline, strike-through, inline code, etc. If you stay in raw IDE no-preview, you never get the visual benefits.
- When I'm using markdown, I'm often mostly reading, and doing some writing, but even when I'm writing, I'm re-reading what I wrote constantly. It's annoying to switch to preview mode. Writing mode in IDEs isn't a pleasant reading experience unless you do a lot of configuration. (depending on the IDE of course)
I mean, writing raw md is fine for tiny little things. But because reading & writing are so linked, I don't like separate modes for each. I want to write in the visual mode I read in.
<cough> You didn't grow up with WordPerfect 5.1 for DOS with reveal codes on, did you? :)
Only, a friend has struggled to get Pelican and git etc. set up on a new laptop. And tbh I am lost - and horrified - at the latest Windows. Not keen on being tech support and working out why the python command hangs etc.
And my custom vps setup doesn’t do tls. And I don’t want to try.
So I’m wondering if there are any alternatives? Is there a web frontend for blogging straight to ghpages, for example?
What I regret though was using Tailwind. Mainly because I later couldn't find a straightforward way to just use normal CSS and ignore that within the scope of some component/page.
For my work[2] I am using SvelteKit and written my own blog using mdsvex and enabled pre-render. That works well too.
[1]: https://vivekshuk.la
[2]: https://daestro.com/blog
How to handle images & video content, when using git to track files? I'll explain my setup...
My vision is a somewhat "lo-fi" and 10+ year durable system - so even my "CI/CD" is more local, in the form of shell scripts, rather than remotely (like GitHub Actions)
I have a folder, that's basically my-website.com, and it has a folder `docs` for content (that's `mkdocs` default content folder). The top level directory is a git repository, which is pushed to Codeberg (code repository, similar to GitHub).
As a content editor, I currently use VSCodium (open-source VSCode) + FOAM (a clone of Roam, similar to Obsidian), which lets you cross-link Markdown files with Obsidian-style links [[MyLink]]. To be specific, on macOS I created a shortcut to the website folder in Finder, and I drag that onto the VSCodium app icon when I want to write. It's a pretty easy workflow on my computer (not practical on mobile).
I use MkDocs to generate the HTML site. I use a simple deploy script to run `mkdocs build` and `aws s3 sync` to copy the files to an AWS S3 bucket.
This all works pretty well, but I'm now trying to figure out how to handle photos & videos.
To give an example, I would have something like `~/_Websites/my-website.com/docs` and inside of that I would have `journal/2025/04/2025-04-23.md` as a journal entry. Related photos and videos, I use `journal/2025/04/media` - so its sort of a catch-all for all the media files for the month.
Recently, I added some large videos (unrelated: but I'm recording video of my CNC router doing cool stuff), and quickly realized:
(1) Git is not the right place for large media (I knew this, but just hadn't hit the problem yet - seeing how long `git push` takes).
(1.1) I actually have a broken repo right now, as I committed video files into it, and can't `git push` without the network connection being cut. I think it may be on the Codeberg side, because they have a limit of 1 GB per repo.... So I'm also trying to figure out how to back out the change, get the video file out of there, and arrive to a better solution.
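For what it's worth, one way to back the video out (the path below is hypothetical, and note that rewriting history affects anyone else who has cloned the repo):

```shell
# If the oversized commit is the most recent one and was never pushed,
# drop the file from it and re-commit:
git rm --cached docs/journal/2025/04/media/big-video.mp4
git commit --amend --no-edit

# If it's buried deeper in history, git-filter-repo (a separate tool)
# can strip it from every commit:
#   git filter-repo --invert-paths --path docs/journal/2025/04/media/big-video.mp4
```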
(2) After reading `git-lfs` (Git Large File Storage) website several times, I can't quite figure out how to integrate it - or IF I should integrate it.
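For reference, the basic git-lfs integration is just a tracking rule. Two caveats: it only affects new commits (existing blobs still need a history rewrite, e.g. `git lfs migrate`), and hosts cap LFS storage, so it's worth checking Codeberg's limits first.

```shell
git lfs install                  # one-time setup per machine
git lfs track "*.mp4" "*.webm"   # writes the patterns to .gitattributes
git add .gitattributes           # commit so the rule applies repo-wide
git commit -m "Track video files with Git LFS"
```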
(3) Now, I'm noodling on having something like a `MEDIA.my-website.com` directory: basically a non-git-tracked folder structure of photos/videos, which I would rsync to a separate S3 bucket, and the website content could then reference it.

However, I'm fearful that over time the markdown content and the media site would be out of "sync" - I frequently re-organize content as needed. For example, I might start with a `python` folder and a `java` folder, but then later create a top-level `programming` folder. I could see doing the same with the media folder too: `padel` and `squash` folders (containing video clips of games, how-to's) might be grouped under a `sports` folder, and so on. Dragging these folders around while the media content is inside them usually doesn't cause problems, because of relative links. However, when the content & media are in different file structures, broken links would happen - and this reduces the "fluidity" / increases the friction of naturally re-organizing the site over time.
To conclude: how do you handle video media alongside git-tracked markdown content?
Anyway! I appreciate your patience in reading this, and hope you get the idea of the setup - curious what folks who have been down this route can recommend.
It's like having infinite "free" domains (even with the small fee for the base domain).
But the most important part is the fun of just having an entire namespace at your disposal to create whatever "domain name" in seconds.
Nothing technically groundbreaking (actually Cloudflare does it automatically for me), but it's a nice quality of life trick.