vs. spamming OSS maintainers with slop reports, which costs Google nothing
If it were slop, they could complain that it was wasting their time with false or unimportant reports; instead, they seem to be complaining that the program reported a legitimate security issue?
Their choice becomes to:
- maintain a complex fork, constantly integrating from upstream;
- or pin to some old version and maybe go through a Herculean effort to rebase when something they truly must have merges upstream;
- or genuinely fork it and employ an expert in this highly specific domain to write what will often end up being parallel features and security patches to mainline FFmpeg.
Or, of course, pay someone doing OSS to fix it in mainline. Which is the beauty of open source; that's genuinely the least painful option, and it also happens to be the one that benefits the community the most.
If that's what I have to expect, I'd rather not even interact with them at all.
But the way I see it, a bug report is a bug report; no matter how small or big the bug or the team, it should be addressed.
I don’t know, I’m not exactly a pillar of the FOSS community with weight behind my words.
As the article states, these are AI-generated bug reports. So it's a trillion-dollar company throwing AI slop over the wall and demanding a 90-day turnaround from unpaid volunteers.
and the ffmpeg maintainers say it's not wanted
so it's slop
I'm not a Google fan, but if the maintainers are unable to understand that, I welcome a fork.
There is a convergence of very annoying trends happening: more and more reports are garbage, found and written using AI, with an impact that is questionable at best; the way CVEs are published and classified is idiotic; and the platforms funding vulnerability research, like Google, are more and more hostile to projects, leaving very little time to actually work on fixes before publishing.
This is leading to more and more open source developers throwing in the towel.
Some of them are not even bugs in the traditional sense of the word but expected behaviours which can lead to insecure side effects.
This was a bug, which caused an exploitable security vulnerability. The bug was reported to ffmpeg, over their preferred method for being notified about vulnerabilities in the software they maintain. Once ffmpeg fixed the bug, a CVE number was issued for the purpose of tracking (e.g. which versions are vulnerable, which were never vulnerable, which have a fix).
Having a CVE identifier is important because we can't just talk about "the ffmpeg vulnerability" when there have been a dozen this year, each with different attack surfaces. But it really is just an arbitrary number, while the bug is the actual problem.
The 90-day period is a grace period for the devs, not a demand. If they don't want to fix it, then it goes public.
If this keeps up, there won't be anyone willing to maintain the software due to burnout.
In today's situation, free software is keeping many companies honest. Losing that kind of leverage would be a loss to society overall.
And the public disclosure is going to hurt the users, which could include defense, banks, and other critical institutions.
That’s how open source works.
What's the point of just showering these things with bug reports when the same tool (or a similar one) can also apparently fix the problem too?
And maybe it's fine to have AI-generated articles that summarize Twitter threads for HN, but this is not a good summarization of the discussion that unfolded in the wake of this complaint. For one, it doesn't mention a reply from Google security, which you would think should be pretty relevant here.
Another bunch of people who make era-defining software where they extract everything they can. From customers, transactionally. From the first bunch, pure extraction.
They could adopt a more flexible policy for FOSS though.
chill, nobody knows what ffmpeg is
To me it's okay to “demand” that a for-profit company (e.g. Google) fix an issue fast, because they have resources. But to “demand” that an OSS project fix something within a certain (possibly tight) timeframe... well, I'm sure you know better than me that that's not how volunteering works.
Let's say that FFmpeg has a CVSS 10 CVE where a very easily crafted stream can cause RCE. So what?
We are talking about software commonly deployed by end users to encode their own media, something that rarely comes in untrusted forms. For an exploit to happen, you need a situation where an attacker gets a malicious media file out to people who then commonly transcode it via FFmpeg. Not an easy task.
This sure does matter to the likes of google assuming they are using ffmpeg for their backend processing. It doesn't matter at all for just about anyone else.
You might as well tell me that `tar` has a CVE. That's great, but I don't generally go around tarring or untarring files I don't trust.
Looks like firefox does the same.
I would be shocked if any company working with user-generated video, from the likes of Zoom or TikTok or YouTube down to small apps all over, did not have it in their pipeline somewhere.
It would be surprising to find memory corruption in tar in 2025, but not in ffmpeg.
In this world, the user is left vulnerable because attackers can use published vulnerabilities that the maintainers are too overwhelmed to fix.
Google runs this security program even on libraries they do not use at all, where it's not a demand, it's just whitehat security auditing. I don't see the meaningful difference between Google doing it and some guy with a blog doing it here.
Making the vulnerability public makes it easy to find to exploit, but it also makes it easy to find to fix.
Yes? It's in the license
>NO WARRANTY
>15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
If I really care, I can submit a patch or pay someone to. The ffmpeg devs don't owe me anything.
Really, the burden is on those shipping products that depend on ffmpeg: they are the ones who have to fix the security issues for their customers. If Google is one of those companies, they should provide the fix in the given time.
The X days is a concession to the developers: public disclosure will be delayed to give them an opportunity to address the issue.
If the obscure codec is not included by default or cannot be triggered by any means other than being explicitly asked for, then it would be reasonable to tag it Won't Fix. If it can be triggered by other means, such as auto file-type detection on a renamed file, then it doesn't matter how obscure the feature is; the exploit would affect everyone.
What is the alternative to a time-limited embargo? I don't particularly like the idea of groups of people having exploits that they have known about for ages that haven't been publicly disclosed. That is the kind of information that finds itself in the wrong hands.
Of course companies should financially support the developers of the software they depend upon. Many do this for OSS in the form of having a paid employee that works on the project.
Specifically, FFmpeg seems to have a problem in that much of its resource shortage comes from alienating contributors. This isn't isolated to just this bug report.
It's standard practice for commercially-sponsored software, and it doesn't necessarily fit volunteer maintained software. You can't have the same expectations.
Why is Google deliberately running an AI process to find these bugs if they're just going to dump them all on the FFmpeg team to fix?
They have the option to pay someone to fix them.
They also have the option to not spend resources finding the bugs in the first place.
If they think these are so damn important to find that it's worth devoting those resources to, then they can damn well pay for fixing them too.
Or they can shut the hell up and let FFmpeg do its thing in the way that has kept it one of the https://xkcd.com/2347/ pieces of everyone's infrastructure for over 2 decades.
Are the bug reports accurate? If so, then they are contributing just as if I found them and sent a bug report, I'd be contributing. Of course a PR that fixes the bug is much better than just a report, but reports have value, too.
The alternative is to leave it unfound, which is not a better alternative in my opinion. It's still there and potentially exploitable even when unreported.
As to why they bother finding these bugs... it's because that's how Google does things. You don't wait for something to break or be exploited, you load your compiler up with sanitizers and go hunting for bugs.
Yeah this one is kind of trivial, but if the bug-finding infrastructure is already set up it would be even more stupid if Google just sat on it.
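To make the sanitizer point concrete, here's a toy sketch (not FFmpeg code, just an assumed minimal example) of the kind of memory bug that a sanitizer build surfaces immediately, where ordinary testing might pass:

```c
/* Toy example: an off-by-one heap read that normal tests may never notice.
 * Built with something like `clang -g -fsanitize=address,undefined toy.c`,
 * AddressSanitizer aborts with a heap-buffer-overflow report at the marked
 * read instead of silently returning garbage. */
#include <stdlib.h>
#include <string.h>

static int checksum(const unsigned char *buf, size_t len)
{
    int sum = 0;
    for (size_t i = 0; i <= len; i++)   /* bug: <= walks one byte past the end */
        sum += buf[i];
    return sum;
}

int main(void)
{
    unsigned char *payload = malloc(16);
    if (!payload)
        return 1;
    memset(payload, 0x41, 16);
    int s = checksum(payload, 16);      /* ASan flags the read of payload[16] */
    free(payload);
    return s & 0x7f;
}
```

Multiply that by continuous-fuzzing levels of compute and you get a steady stream of reports with very little marginal human effort.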
That is not an accurate description? Project Zero was using a 90-day disclosure policy from the start, so for over a decade.
What changed[0] in 2025 is that they disclose earlier than 90 days that there is an issue, but not what the issue is. And actually, from [1] it does not look like that trial policy was applied to ffmpeg.
> To me it's okay to “demand” that a for-profit company (e.g. Google) fix an issue fast, because they have resources. But to “demand” that an OSS project fix something within a certain (possibly tight) timeframe... well, I'm sure you know better than me that that's not how volunteering works.
You clearly know that no actual demands or even requests for a fix were made, hence the scare quotes. But given you know it, why call it a "demand"?
[0] https://googleprojectzero.blogspot.com/2025/07/reporting-tra..., discussed at https://news.ycombinator.com/item?id=44724287
[1] https://googleprojectzero.blogspot.com/p/reporting-transpare...
codependency is when someone accepts too much responsibility, in particular responsibility for someone else or other things out of their control.
the answer is to have a "healthy neutrality".
I think this is the heart of the issue and it boils off all of the unimportant details.
If it's a real, serious issue, you want to know about it and you want to fix it. Regardless of who reports it.
If it's a real, but unimportant issue, you probably at least want to track it, but aren't worried about disclosure. Regardless of who reports it.
If it's invalid, or AI slop, you probably just want to close/ignore it. Regardless of who reports it.
It seems entirely irrelevant who is reporting these issues. As a software project, ultimately you make the judgment call about what bugs you fix and what ones you don't.
I don't agree the following framing is accurate, but I can mention it because you've already said the important part (about how this issue exists, and merely knowing about it doesn't create required work). But here, by announcing it and registering a CVE, Google is starting the clock. By some metrics, it was already running, but the reputational risk clearly was not. This does change priorities, and requires an urgent context switch. Neither is a free action, especially not within FOSS.
To me, as someone who believes everyone, individuals and groups alike, has a responsibility to contribute fairly: Google's behavior gives the appearance of weaponizing their cost center externally, given this is something Google could easily fix, but instead they shirked that responsibility onto unfunded volunteers.
If you (Amazon, in this case) can put it that way, it seems like throwing them 10 or 20 thousand a year would simply be a good insurance policy! Any benefits you might get in goodwill and influence are a bonus.
But on a more serious note, it is crazy that between Google and Amazon they cannot fund them with 50k each per year, so that they can pay people to work on this.
Especially Google: with YouTube, they could very easily pay them more, 100k to 200k easily.
And they get the tool + community goodwill, all for a rounding error on any part of their budgets...
That is why I said easily 100 to 200k. It will be a rounding error for them.
It is actually crazy that Google is not already hiring the main dev to work on ffmpeg, with all the use they give it on YouTube.
I also wonder if it is maybe used by Netflix.
They do and it is.
https://netflixtechblog.com/the-making-of-ves-the-cosmos-mic...
https://netflixtechblog.com/for-your-eyes-only-improving-net...
A rising tide lifts all yachts. If he had written the check, my instinct tells me, he would have enough for two yachts. Goodwill is an actual line item on 10-Qs and 10-Ks. I don't know why companies think it's worth ignoring.
That's capitalism, they need to quit their whining or move to North Korea. /s The whole point is to maximize value to the shareholders, and the more work they can shove onto unpaid volunteers, the more money they can shove into stock buybacks or dividends.
The system is broken. IMHO, there oughta be a law mandating reasonable payments from multi-billion-dollar companies to open source software maintainers.
It's not "whining" to refuse to do unpaid labor for the benefit of someone else - especially when the someone else is as well-resourced as Google.
enterprise must pay.
Thus, as Mark Atwood, an open source policy expert, pointed out on Twitter, he had to keep telling Amazon to not do things that would mess up FFmpeg because, he had to keep explaining to his bosses that “They are not a vendor, there is no NDA, we have no leverage, your VP has refused to help fund them, and they could kill three major product lines tomorrow with an email. So, stop, and listen to me … ”
I agree with the headline here. If Google can pay someone to find bugs, they can pay someone to fix them. How many times have managers said "Don't come to me with problems, come with solutions"?
It might not make sense morally, but it makes total sense from a business perspective… if they are going to pay for the development, they are going to want to maintain control.
As it stands, they're just abusing someone's gift.
Like jerks.
The "kill it with an email" probably means that whoever said this is afraid that some usecase there wouldn't stand up to an audit by the usual patent troll mothercluckers. The patents surrounding video are so complex, old and plentiful that I'd assume full compliance is outright impossible.
The API manual for it is nearly 4000 pages and it can do insane stuff[1].
I had to use it at last job(TM); it's not terrible API-wise.
[1] https://docs.aws.amazon.com/pdfs/mediaconvert/latest/apirefe... CAUTION: big PDF.
I interpret this as meaning there was an implied "if you screw this up" at the end of "they could kill three major product lines with an email."
Most of them would just pirate in the old days, and most FOSS licences give them a clear conscience to behave as always.
Yes, GPLv3 is a lot ideologically, but it was trying to limit excessive leeching.
Now that I have opened the flood gates of a 20 year old debate, time to walk away.
So I'm not sure what GPLv3 really has to do with it in this case; even if it was under a "no billion-dollar companies allowed" non-free-but-source-available license, this same thing would have happened, provided the project was popular enough for Project Zero to have looked at it for security issues.
1) dedicating compute resources to continuously fuzzing the entire project (roughly the kind of harness sketched after this list)
2) dedicating engineering resources to validating the results and creating accurate and well-informed bug reports (in this case, a seriously underestimated security issue)
3) additionally for codecs that Google likely does not even internally use or compile, purely for the greater good of FFMPEG's user base
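For point 1, the mechanics are conceptually simple: a fuzz target feeds arbitrary byte strings into a decoder built with sanitizer instrumentation. The real OSS-Fuzz targets for FFmpeg are considerably more elaborate; what follows is only a minimal libFuzzer-style sketch using the public libavcodec API, with the codec choice and error handling picked purely for illustration:

```c
/* Minimal libFuzzer-style decoder harness (illustrative sketch only; the
 * real FFmpeg fuzz targets are more elaborate).  The H.264 decoder is an
 * arbitrary choice for the example. */
#include <stdint.h>
#include <stddef.h>
#include <string.h>
#include <libavcodec/avcodec.h>

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();

    if (ctx && pkt && frame && size > 0 && size < (1 << 20) &&
        avcodec_open2(ctx, codec, NULL) >= 0 &&
        av_new_packet(pkt, (int)size) == 0) {
        /* Hand the raw fuzz input to the decoder as a single packet and
         * drain any produced frames; sanitizers flag out-of-bounds
         * accesses and undefined behaviour along the way. */
        memcpy(pkt->data, data, size);
        if (avcodec_send_packet(ctx, pkt) >= 0)
            while (avcodec_receive_frame(ctx, frame) >= 0)
                ;
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    return 0;
}
```

Once something like this is wired into continuous fuzzing, the reports keep coming with essentially no marginal human effort, which is exactly the asymmetry people in this thread are complaining about.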
Needless to say, while I agree Google has a penny to spare to fund FFMPEG, and should (although they already contribute), I do not agree with funding this maintainer.
Providing a real CVE is a contribution, not a burden. The ffmpeg folks can ignore it, since by all indications it's pretty minor.
…. overheard at a meeting of CEO and CTO at generic evil mega tech corp recently.
Burning cash to generate spam bug reports to burden volunteer projects when you have the extra cash to burn to just fix the damn issue leaves a very sour taste in my mouth.
That's the difference between "it may or may not be that there's someone who cares" versus "no one should be running this software anywhere in the general vicinity of untrusted inputs".
Probably the right solution is to disable this codec. You should have to make a choice to compile with it; although if you're running ffmpeg in a context where security matters, you really should be hand picking the enabled codecs anyway.
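If you consume FFmpeg as a library rather than building a stripped-down binary, there is also a runtime knob for this kind of hand-picking. A minimal sketch, assuming the standard libavformat API and the "format_whitelist"/"codec_whitelist" AVOptions; the specific allow-lists here are just illustrative, not a recommendation:

```c
/* Sketch: open media with an explicit allow-list of demuxers and decoders,
 * so obscure codecs you never use are rejected at open time rather than
 * being reachable through a crafted file.  Call with *fmt == NULL;
 * avformat_open_input allocates the context. */
#include <libavformat/avformat.h>
#include <libavutil/dict.h>

int open_restricted(const char *path, AVFormatContext **fmt)
{
    AVDictionary *opts = NULL;
    av_dict_set(&opts, "format_whitelist", "mov,mp4,matroska,webm", 0);
    av_dict_set(&opts, "codec_whitelist", "h264,hevc,vp9,aac,opus", 0);

    int ret = avformat_open_input(fmt, path, NULL, &opts);
    av_dict_free(&opts);
    if (ret < 0)
        return ret;
    return avformat_find_stream_info(*fmt, NULL);
}
```

(For your own builds, configure-time switches like --disable-everything plus --enable-decoder=... achieve the same trimming at compile time, if I remember the configure options right.)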
If you announce that a vulnerability (unspecified) has been found in a project before the patch is released, doesn't that just incentivize bad actors to direct their efforts at finding a vulnerability in that project?
I don't see why actors would suddenly reallocate large amounts of effort especially since a patch is now known to be coming for the issue that was found and thus the usefulness of the bug (even if found) is rather limited.
Like how Apple stopped using up-to-date GNU tools in 2008 because of GPLv3. That move showed me then that Apple did not want you to use your computer as your computer.
And of course, they won't share with each other. So another driver would be fear of a slight competitive disadvantage vs other-big-tech-monstrosity having a better version.
Now, in this scenario, some tech CEO, somewhere has this brilliant bright spark.
"Hey, instead of dumping all these manhours & resources into DIYing it, with no guarantee that we still won't be left behind - why don't we just throw 100k at the original oss project. We'll milk the publicity, and ... we won't have to do the work, and ... my competitors won't be able to use it"
I quite like this scenario.
It'll still cause Google and many others to panic, but weird and custom licenses are even worse for attracting business than open source ones.
Probably could pull in millions per year.
Submit the bug AND the patch and be done with it; don't make it someone else's problem when it's an OSS library/tool. A for-profit vendor? Absolutely. But this? Hell naw.
I'm not being dismissive. I understand the imperative of identifying and fixing vulnerabilities. I also understand the detrimental impact that these problems can potentially have on Google.
What I don't understand is the choice to have a public facing project about this. Can anyone shine a light on this?
Their security team gaining experience on other projects can teach them some more diversity in terms of (malware) approaches and vulnerability classes, which can in turn be used to secure their own software better.
For other projects there's some vanity/reputation to be gained. Having some big names with impressive resumes publicly talk about their work can help attract talent.
Lastly, Google got real upset that the NSA spied on them (without their knowledge, they can't help against warrants of course).
Then again, there's probably also some Silicon Valley bullshit money being thrown around. Makes you wonder why they don't invest a little bit more to pay someone to submit a fix.
The recent iOS zero-day (CVE-2025-43300) targeted the rarely used DNG image format. How long before this FFMPEG vulnerability is exploited to compromise legacy devices in the wild, I wonder?
I’m not a fan of this grandstanding for arguably questionable funding. (I surely would not fund those who believe these issues are slop.) I’d like to think most contributors already understand the severity and genuinely care about keeping FFMPEG secure.
I read this as: nobody wants CVEs open on their product, so you might feel forced to fix them. I find it more understandable if we talk about web frameworks: WordPress doesn't want security CVEs open for months or years, or users would be upset that they introduce new features while neglecting security.
I am a nobody, and whenever I find a bug I work extra to attach a fix in the same issue. Google should do the same.
That does not impact their business or their operations in any way whatsoever.
> If it's a valid bug then it's a valid bug end of story.
This isn't a binary. It's why CVEs have a whole sordid scoring system to go along with them.
> Software owes it to its users to be secure
ffmpeg owes me nothing. I haven't paid them a dime.
I don't know what tools and backends they use exactly, but working purely by statistics, I'm sure some place in Google's massive cloud compute empire is relying on ffmpeg to process data from the internet.
I don't consider a security issue to be a "standard bug." I need to look at it, and [maybe] fix it, regardless of who reported it.
But in my projects, I have gotten requests (sometimes, demands) that I change things like the published API (a general-purpose API), to optimize some niche functionality for one user.
I'll usually politely decline these, and respond with an explanation as to why, along with suggestions for them to add it, after the fact.
Sure, triage it. It shouldn’t be publicly disclosed within a week of the report though, because the fix is still a relatively low priority.
This position is likely to drive away maintainers. Generally the maintainers need these projects less than the big companies that use them. I'm not sure what Google's endgame is.
Google is under no obligation to work on FFmpeg.
Google leveraging AI to spam ffmpeg devs with bugs that range from real to obscure to wrongly reported may be annoying. But even then I still don't think Google is to be held accountable for reporting bugs nor is it required to fix bugs. Note: I do think Google should help pay for costs and what not. If they were a good company they would not only report bugs but also have had developers fix the bugs, but they are selfish and greedy, everyone knows that. Even then they are not responsible for bugs in ffmpeg. And IF the bug report is valid, then I also see no problem.
The article also confuses things. For instance:
"Many in the FFmpeg community argue, with reason, that it is unreasonable for a trillion-dollar corporation like Google, which heavily relies on FFmpeg in its products, to shift the workload of fixing vulnerabilities to unpaid volunteers"
How could Google do that? It is not Google's decision. That is up to the volunteers. If they refuse to fix bugs reported by Google, then this is fine. But it is THEIR decision, not Google's.
"With this policy change, GPZ announces that it has reported an issue on a specific project within a week of discovery, and the security standard 90-day disclosure clock then starts, regardless of whether a patch is available or not."
Well, many opinions here. I think ALL bugs and exploits should be INSTANTLY AND WITHOUT ANY DELAY, be made fully transparent and public. I understand the other side of the medal too, bla bla we need time to fix it bla bla. I totally understand it. Even then I believe the only truthful, honest way to deal with this, is 100% transparency at all times. This includes when there are negative side effects too, such as open holes. I believe in transparency, not in secrecy. There can not be any compromise here IMO.
"Many volunteer open source program maintainers and developers feel this is massively unfair to put them under such pressure when Google has billions to address the problem."
So what? Google reports issues. You can either fix that or not. Either way is a strategy. It is not Google's fault when software can be exploited, unless they wrote the code. Conversely, the bug or flaw would still exist in the code EVEN IF GOOGLE WOULD NOT REPORT IT. So I don't understand this part. I totally understand the issue of Google being greedy, but this here is not solely about Google's greed. This is also how a project deals with (real) issues (if they are not real then you can ask Google why they send out so much spam).
That Google abuses AI to spam down real human beings is evil and shabby. I am all for ending Google on this planet - it does so much evil. But either it is a bug, or not. I don't understand the opinion of ffmpeg devs "because it is Google, we want zero bug reports". That just makes no sense.
"The fundamental problem remains that the FFmpeg team lacks the financial and developer resources to address a flood of AI-created CVEs."
Well, that is more an issue in how to handle Google spamming down people. Sue them in court so that they stop spamming. But if it is a legit bug report, why is that a problem? Are ffmpeg devs concerned about the code quality being bad? If it is about money then even though I think all of Google's assets should be seized and the CEOs that have done so much evil in the last 20 years be put to court, it really is not their responsibility to fix anything written by others. That's just not how software engineering works; it makes no sense. It seems people confuse ethics with responsibilities here. The GPL doesn't mandate code fixing to be done; it mandates that if you publish a derivative etc... of the code, that code has to be published under the same licence and made available to people. That's about it, give or take. It doesn't say corporations or anyone else HAS to fix something.
"On the other hand, security experts are certainly right in thinking that FFmpeg is a critical part of the Internet’s technology framework and that security issues do need to be made public responsibly and addressed."
I am all for that too, but even stricter - all security issues are to be made public instantly, without delay, fully and completely. I went to open source because I got tired of Microsoft. Why would I want to go back to evil? Not being transparent here is no valid excuse IMO.
"The reality is, however, that without more support from the trillion-dollar companies that profit from open source, many woefully underfunded, volunteer-driven critical open-source projects will no longer be maintained at all."
Wait - so it is Google's fault if projects die due to lack of funding? How does that explanation work?
You can choose another licence model. Many choose BSD/MIT. Others choose GPL. And so forth.
"For example, Wellnhofer has said he will no longer maintain libxml2 in December. Libxml2 is a critical library in all web browsers, web servers, LibreOffice and numerous Linux packages. We don’t need any more arguments; we need real support for critical open source programs before we have another major security breach."
Yes, that is a problem - the funding part. I completely agree. I still don't understand the "logic" of trying to force corporations to have to do so when they are not obliged. If you don't want corporations to use your code, specify that in the licence. The GPL does not do that. I am confused about this "debate" because it makes no real sense to me from an objective point of view. The only part that I can understand pisses off real humans is when Google uses AI as a pester-spam attack orgy. Hopefully a court agrees and splits up any company with more than 100 developers into smaller entities the moment they use AI to spam real human beings.
Now, if Google or whoever really feels that fixing it fast is so important, then they could very well contribute by submitting a patch along with their issue report.
Then everybody wins.
This is very far from obvious. If google doesn't feel like prioritising a critical issue, it remains irresponsible not to warn other users of the same library.
I don't want to discourage anyone from submitting patches, but that does not necessarily remove all (or even the bulk of) the work from the maintainers. As someone who has received numerous patches to multimedia libraries from security researchers, they still need review, they often have to be rewritten, and most importantly, the issue must be understood by someone with the appropriate domain knowledge and context to know if the patch merely papers over the symptoms or resolves the underlying issue, whether the solution breaks anything else, and whether or not there might be more, similar issues lurking. It is hard for someone not deeply involved in the project to do all of those things.
Not publicly disclosing it also carries risk. Library users get the wrong impression that the library has no vulnerabilities, while numerous bugs have been reported but don't appear publicly due to the FOSS policy.
You can never be sure that you're the only one in the world that has discovered or will discover a vulnerability, especially if the vulnerability can be found by an LLM. If you keep a vulnerability a secret, then you're leaving open a known opportunity for criminals and spying governments to find a zero day, maybe even a decade from now.
For this one in particular: AFAIK, since the codec is enabled by default, anyone who processes a maliciously crafted .mp4 file with ffmpeg is vulnerable. Being an open-source project, ffmpeg has no obligation to provide me secure software or to patch known vulnerabilities. But publicly disclosing those vulnerabilities means that I can take steps to protect myself (such as disabling this obscure niche codec that I'm literally never going to use), without any pressure on ffmpeg to do any work at all. The fact that ffmpeg commits themselves to fixing known vulnerabilities is commendable, and I appreciate them for that, but they're the ones volunteering to do that -- they don't owe it to anyone. Open-source maintainers always have the right to ignore a bug report; it's not an obligation to do work unless they make it one.
Vulnerability research is itself a form of contribution to open source -- a highly specialized and much more expensive form of contribution than contributing code. FFmpeg has a point that companies should be better about funding and contributing to open-source projects that they rely on, but telling security researchers that their highly valuable contribution is not welcome because it's not enough is absurd, and is itself an example of making ridiculous demands for free work from a volunteer in the open-source community. It sends the message that white-hat security research is not welcome, which is a deterrent to future researchers from ethically finding and disclosing vulnerabilities in the future.
As an FFmpeg user, I am better off in a world where Google disclosed this vulnerability -- regardless of whether they, FFmpeg, or anyone else wrote a patch -- because a vulnerability I know about is less dangerous than one I don't know about.