They bet on open source and they open-source a lot of their own technology.
It's one of the best companies when it comes to open source.
I don't know how much total they donate, but I've seen tons of grants given to projects from them.
Been doing this for 20 years. React/JSX is the easiest (for me)
React and JSX really did help a lot compared to how it used to be, which was pretty unmanageable.
This could not be more wrong. Meta is still using PHP AFAIK, but I'm not sure it's modern. They created the Hack programming language ~10 years ago, but it doesn't look like it's been updated in several years. Most of the improvements they touted were included in PHP 7, years ago.
But when the backend world was either Java or ASP, FB chose PHP and helped us other small companies out.
They eventually went Hack, the rest went Node for the most part.
But during those PHP years they gave us HHVM and many PHP improvements to get us through.
But PHP wouldn't be here today if it weren't for Meta and its support.
The analogy fails because free samples cost Costco (or whatever the vendor is) money. Raking Meta over the coals for using ffmpeg instead of paying for some proprietary alternative makes as much sense as raking every tech company over the coals for using Linux. Or maybe you'd do that too, I can't tell.
If you get mad when a company makes good use of open source and contributes to a project’s betterment, you do not understand the point of open source, you’re just fumbling for a pitchfork.
Some comments seem to gloss over the fact that they did give back, and they are not the only ones benefitting from this. Could they give more? Sure, but this is exactly one of the benefits of open source: everyone benefits from changes that were upstreamed or financially supported by one entity, instead of each re-implementing them internally.
Big tech companies can easily hire manpower to make proprietary versions of software, or just pay licensing fees for other proprietary software. They don’t rely on open source.
Modern startups, by contrast, wouldn't exist without open source.
But personally, I took issue with the tone of the blog post, characterised by this opening framing:
>For many years we had to rely on our own internally developed fork of FFmpeg to provide features that have only recently been added to FFmpeg
Could they not have upstreamed those features in the first place? They didn't integrate with upstream and now they're trying to spin this whole thing as a positive? It doesn't seem to acknowledge that they could've done better (e.g. the mantra of 'upstream early; upstream often').
The attempt to spin it ("bringing benefits to Meta, the wider industry, and people who use our products") just felt tone-deaf. The people reading this post are engineers. I don't like it when marketing fluff gets shoe-horned into a technical blog post, especially when it's trying to put lipstick on a story that is a mix of good and not-so-good things.
So yeah, you're right, they've contributed to OSS, which is good. But the communication of that contribution could have been different.
But corporate blog posts often go this way. I'm not mad at them or anything. Just a mild dislike ;)
Hard to say without being there, but in my experience it's very easy to go from "we'll just patch this thing quickly for this use case" to applying a bunch of hacks in various places and ending up with an out-of-sync fork. As a developer I've been there many times.
It's a big step to go from patching one specific company internal use case to contributing a feature that works for every user of ffmpeg and will be accepted upstream.
However, my interpretation of the article was that they did a lot more than just patching pieces. They, perhaps, could have taken a much earlier opportunity to work with the core maintainers of ffmpeg to help define its direction and integrate improvements, rather than having to assist a significant overhaul now (years later).
This is the gold standard, sure. In practice, you end up maintaining a branch simply because upstream isn't merging your changes on your timescale, or because you don't quite match their design — this is completely reasonable on both sides, because they have different priorities.
[Edit: Why is anyone downvoting me linking to the previous post of this? What possible objection could you have to this particular comment?]
Oof. That is so relatable.
Also, ffmpeg 8 is finally handling HDR-to-SDR tone mapping properly as of my last recompile on Gentoo :)
I've been out of the game for a bit but it's great to hear.
It is good that they worked to get their internal improvements into upstream, and this is certainly better behavior than that of some other unmentioned tech giants. Still, it makes one wonder (since they are presumably running it tens of billions of times per day) whether they were supporting these improvements all along. If not, why not?
I worked at FB, and I'm 100% certain we sponsored VLC and OBS at the time. It would be strange if we didn't sponsor FFmpeg, but regardless (as the article says) we definitely got out of our internal fork and upstreamed a lot of the changes.
I worked on live, and everyone in the entire org worships ffmpeg.
It doesn't matter how much you worship ffmpeg when a company that makes billions by destroying our society gives back only a small handout.
So good for you? Bad for ffmpeg, society and the rest of the world.
And I know the teams love ffmpeg; there are some great folks at Meta, just not a lot in the C-suite.
This makes a lot of sense for the live-streaming use case, and some sense for generally transcoding a video into multiple formats. But I would love to see time-axis parallelization in ffmpeg: quickly split the input video into keyframe-aligned chunks, then encode each chunk in parallel. This would allow excellent parallelization even when only producing a single output (and without lowering video quality the way most intra-frame parallelization does).
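The idea above can be sketched as a small driver script. This is a hypothetical illustration, not ffmpeg's own behavior: the function names, filenames, and the assumption that keyframe timestamps are already known (in practice you'd probe them with ffprobe) are all mine. It only builds the command strings for chunk-wise encoding followed by a lossless concat; nothing is executed.

```python
def chunk_commands(keyframes, duration, src="in.mp4", codec="libx264"):
    """Build one ffmpeg encode command per keyframe-aligned chunk.

    keyframes: sorted list of keyframe timestamps (seconds), starting at 0.
    duration:  total duration of the source (seconds).
    """
    bounds = list(keyframes) + [duration]
    cmds = []
    # Each (start, end) pair is one independently encodable chunk.
    for i, (start, end) in enumerate(zip(bounds, bounds[1:])):
        cmds.append(
            f"ffmpeg -ss {start} -to {end} -i {src} "
            f"-c:v {codec} chunk_{i:03d}.mp4"
        )
    return cmds


def concat_command(out="out.mp4"):
    """Stitch the encoded chunks back together without re-encoding,
    using ffmpeg's concat demuxer and a chunks.txt file list."""
    return f"ffmpeg -f concat -safe 0 -i chunks.txt -c copy {out}"
```

The chunk commands are independent, so they can be dispatched to a thread pool or separate machines; tools like Av1an use essentially this scheme. The catch is that rate control operates per chunk, so hitting a global bitrate target takes extra care.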