It's a great first step, but I struggle to see who would use this. So far Bluesky seems to be the only decentralized platform that's broken into the mainstream, and it'll only be harder in the video market.
However, the pricing is still a far cry from non-decentralized solutions (for example, MUX's free plan offers 100K minutes per month [1]) and so the only other selling point is joining the fediverse—which is a good thing, but hard to get people to convert on (in Bluesky's case the turmoil that is now X was required).
I know there are competing and cheaper services, but it still seems like a big burden to get into. I've been trying to use Rumble a bit more, and I appreciate the entry of Pepperbox, Floatplane, and others. It's still a bit of a mess and none of them match the 10-foot experience of YouTube on Android TV, but it's getting better.
If you're not streaming live I believe you can serve video content out of R2 instead, which still somehow only charges for storage but offers completely free outbound bandwidth (egress).
> [...] Finally, we made it clear that customers can serve video and other large files using the CDN so long as that content is hosted by a Cloudflare service like Stream, Images, or R2.
Is self-hosting video still difficult, today in 2025?
My intuition is that there are fewer formats to worry about today, and that serving video from static hosting that supports HTTP range requests may be enough for most devices to work.
What are the remaining hard problems? Maybe mechanisms to negotiate lower resolution for slower connections?
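For the "HTTP range headers" part mentioned above: a player that seeks into a video sends a `Range: bytes=...` header, and the server replies with just that slice of the file. A minimal sketch of how a server might interpret that header (the numbers below are hypothetical; real servers also handle multi-range and malformed cases):

```python
def parse_byte_range(header_value, file_size):
    """Parse a single 'bytes=start-end' Range header value into
    inclusive (start, end) byte offsets, per RFC 7233 semantics.
    Returns None for values this sketch doesn't handle."""
    if not header_value.startswith("bytes="):
        return None
    spec = header_value[len("bytes="):]
    start_s, _, end_s = spec.partition("-")
    if start_s == "":                      # suffix range: the last N bytes
        length = int(end_s)
        return (max(file_size - length, 0), file_size - 1)
    start = int(start_s)
    end = int(end_s) if end_s else file_size - 1
    return (start, min(end, file_size - 1))

# A player seeking partway into a ~56MB video might send:
print(parse_byte_range("bytes=1000000-", 56_250_000))   # -> (1000000, 56249999)
# Probing the end of the file (e.g. for an MP4 index):
print(parse_byte_range("bytes=-500", 56_250_000))       # -> (56249500, 56249999)
```

Any static host that answers these with `206 Partial Content` is already enough for basic seeking on most devices.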
UPDATE: Looks like this offers some answers to my questions: https://help.micro.blog/t/micro-blog-studio/4081
The hardest bit appears to be HLS - HTTP Live Streaming - the thing where a video gets divided up into lots of little .ts segment files and served via an m3u8 playlist.
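The playlist side of HLS is less scary than it sounds: an m3u8 file is plain text, with `#`-prefixed tags interleaved with segment URIs. A sketch, using a hypothetical playlist (not taken from any real service), of what one looks like and how simply the segment list can be pulled out:

```python
# A hypothetical HLS media playlist: tags start with '#',
# every other non-empty line is a segment URI.
SAMPLE_PLAYLIST = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment000.ts
#EXTINF:6.0,
segment001.ts
#EXTINF:4.5,
segment002.ts
#EXT-X-ENDLIST
"""

def segment_uris(playlist_text):
    """Return the segment URIs (non-tag, non-empty lines) in order."""
    return [line for line in playlist_text.splitlines()
            if line and not line.startswith("#")]

print(segment_uris(SAMPLE_PLAYLIST))
# -> ['segment000.ts', 'segment001.ts', 'segment002.ts']
```

Generating the segments themselves is usually delegated to a tool like ffmpeg, whose `hls` muxer can split a video into .ts segments and emit a playlist like the one above.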
For most media, 1080p is fine enough. Add 720p and you have enough for 99% of the world.
Bunny CDN charges $5/TB on their volume network, which should be pretty good for video distribution, with rates dropping after 500TB/month.
At a bitrate of 5Mbps (respectable for 1080p, and significant overkill for more static content, which technical material tends to be), 1TB buys 444 hours. If, like OP, you publish 90-second videos, that's 17,777 complete watches per terabyte. Depending on your situation, that might sound like a little or a lot.
Put the other way round, at 5Mbps and $5/TB, each watch-hour costs $0.01125, a bit over one cent, and it takes 3,555 people watching your 90-second video to cost one dollar.
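These figures are easy to re-derive; a quick sketch of the arithmetic, using the same 5Mbps / $5/TB / 90-second assumptions as above:

```python
# Re-derive the bandwidth cost figures: 5Mbps video, $5/TB, 90-second clips.
BITRATE_BPS = 5_000_000        # 5 Mbps
TB_BITS = 8 * 10**12           # 1 TB = 8e12 bits (decimal terabyte)
PRICE_PER_TB = 5.00            # dollars

hours_per_tb = TB_BITS / BITRATE_BPS / 3600     # hours of video in 1TB
watches_per_tb = TB_BITS / BITRATE_BPS / 90     # complete 90s watches in 1TB
cost_per_watch_hour = PRICE_PER_TB / hours_per_tb
watches_per_dollar = watches_per_tb / PRICE_PER_TB

print(int(hours_per_tb))               # -> 444
print(int(watches_per_tb))             # -> 17777
print(round(cost_per_watch_hour, 5))   # -> 0.01125
print(int(watches_per_dollar))         # -> 3555
```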
For the sort of scale that most people are dealing with, it’s simply not an issue.
I don’t know if bots upset this balance. They may.
If you actually are spending more than a terabyte per month on it, then for technical audiences at least, I suspect that if you invited donations to specifically cover hosting costs (something along the lines of “I host these videos myself because ads and relying on YouTube are both bad for society; if you feel inclined, you can donate to help cover the cost, currently about $X/month”) you’d very quickly get a surplus. Or for longer-form content, charge something for 4K video (which costs 4.5¢ per watch-hour at 20Mbps and $5/TB) and let that subsidise the free 1080p (costing 1.125¢ per watch-hour) stream.
(On the $5/TB figure: my $5/month Vultr VPS includes 1TB per month, and charges $10/TB after that. Some VPS providers include a lot more; a Hetzner €3.49/month VPS in Europe includes 20TB and then charges €1/TB. But remember, if you host video from one point only, it is unlikely to work well for people halfway round the planet. See another of my comments in this thread for a description.)
As for storage: each 90-second 5Mbps video is 56.25MB, and at a rate of $0.01/GB/month, each one will cost you $0.00675 per year to keep. Were you to post one 90-second video every single day and keep them all online, your monthly bill would grow by about $0.20 each year.
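The storage side checks out the same way; a sketch with the same 5Mbps / $0.01/GB/month assumptions:

```python
# Storage arithmetic: 90-second videos at 5Mbps, stored at $0.01/GB/month.
video_mb = 90 * 5_000_000 / 8 / 1_000_000       # bits -> bytes -> MB
monthly_cost_per_video = video_mb / 1000 * 0.01  # dollars/month per video
yearly_cost_per_video = monthly_cost_per_video * 12

# Posting one video daily and keeping them all: after a year you hold
# 365 more videos, so the monthly bill has grown by:
monthly_growth_per_year = 365 * monthly_cost_per_video

print(video_mb)                           # -> 56.25
print(round(yearly_cost_per_video, 5))    # -> 0.00675
print(round(monthly_growth_per_year, 3))  # -> 0.205
```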
I want the fediverse and open web to be viable, and we need a solution to hosting high view count videos.
I’m assuming their CDNs are just specialised low cost hosting providers as opposed to p2p IoT botnets, but you never know.
I wish someone had the time and motivation to do an investigatory technical deep dive into the infrastructure they use.
I grew up in Australia, and more recently moved to India. Both are a long way from the USA, and those of us who visit the USA find its internet bafflingly fast simply because of the low latency (most developers aren't at all careful about avoiding request waterfalls, so even ignoring restricted bandwidth, an extra 200ms of latency can make a page load take several seconds longer).
Australia wasn’t great for playing high-bitrate videos hosted in the USA. Even with a high-quality 4G broadband link (rural western Victoria, clear line of sight to a tower 400m away that might host under 300 subscribers; at least 55/25Mbps via cheap phones when I measured it eight years ago), you somehow couldn’t actually download them at more than a few megabits per second at best. I think this is mostly a latency effect, even though TCP’s congestion control is supposed to ramp up over time.
India is terrible for playing high-bitrate videos hosted in the USA. A 100Mbps fibre connection can be completely undermined by what I imagine to be terrible peering arrangements on all broadband ISPs I’ve interacted with (Hireach, Airtel, Vybe), and files hosted in the US may trickle across at half a megabit per second or even less. Right now I can copy a file from my VPS in Australia at 1.6Mbps.
And so CDNs are, very sadly, rather valuable.
There are plenty of good providers and some of them are practically free.
I mean sure, if you want to roll your own CDN by hosting boxes in colos across multiple continents and applying geographical load balancing via DNS you're taking on a whole lot of extra complexity, but I think outsourcing that to Cloudflare or Fastly or Fly.io or whomever is a reasonable strategy that still counts as "self-hosting", at least in comparison to using YouTube.
I want to self-host things. Currently I use a VPS. I’m planning on trying out hosting from home. Either way, if I get into doing much multi-megabit-per-second video stuff, hosting of that bit will definitely be going behind some CDN.
Speed tests (e.g. https://www.speedtest.net/) may let you choose a location to measure to. Measuring to Planet Networks, Inc. in Newark, NJ, USA, my connection slowly climbs to 87Mbps down / 56Mbps up. Trouble is, at greater distances these numbers are basically always unrealistic: you can’t actually reach them in normal usage.
Various hosts give you big files you can download to test them, which is often a more realistic measurement of what you’ll experience on the web at large. https://nj-us-ping.vultr.com/vultr.com.100MB.bin, for example, is from roughly the same geographical place, and downloading it in Firefox or curl only really reaches 1–1.3Mbps and would take 11–12 minutes. Whereas https://bom-in-ping.vultr.com/vultr.com.100MB.bin (Mumbai; I’m in Hyderabad) happily saturates my 100Mbps link and completes in nine seconds.
I extracted a RubyGem at https://github.com/beautifulruby/hls that I point at a folder full of videos; my script then converts them to HLS and uploads them to a private Tigris S3 bucket. I then rewrite the playlists server-side with pre-signed S3 URLs.
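That playlist-rewriting step is just text substitution: every non-tag line in an m3u8 playlist is a segment URI, so you swap each one for a signed URL. A sketch of the idea, where `presign` is a stand-in for whatever signing call your SDK provides (e.g. boto3's `generate_presigned_url`); here it's stubbed with a hypothetical bucket URL so the sketch is self-contained:

```python
# Stub signer: a real implementation would call the S3 SDK's
# pre-signing API. The bucket URL and signature are hypothetical.
def presign(key):
    return f"https://bucket.example.com/{key}?X-Amz-Signature=stub"

def rewrite_playlist(playlist_text, sign=presign):
    """Replace each bare segment URI in an HLS media playlist with a
    signed URL, leaving '#'-prefixed tag lines untouched."""
    out = []
    for line in playlist_text.splitlines():
        out.append(line if (not line or line.startswith("#")) else sign(line))
    return "\n".join(out) + "\n"

print(rewrite_playlist("#EXTM3U\n#EXTINF:6.0,\nseg0.ts\n"))
```

The signed playlist is what you actually serve to the player; the segments themselves stay private in the bucket.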
It’s not that it’s difficult per se, but it does require meticulous attention to detail to put all the pieces together.
The weird m3u8 trick gets you better streaming and seeking performance, plus different video quality depending on the device and network connection.
Btw, indexing is another big problem on this platform.
I had fun playing with SSGs for years. I’m having more fun just writing posts and letting them get broadcast to Mastodon and wherever else I’ve configured them at the same time. It wasn’t clear to me until I read Manton’s book, but its goal is to be a social media service built completely on open web standards that everyone can participate in.