I jest; the vagueposting led to uninformed speculation, panic, reddit levels of baseless accusation, and harassment of the developers: https://news.ycombinator.com/item?id=43477057
I hope Google's experiment doesn't turn out the same.
To be fair, it seems like the only way of avoiding something like that is never saying anything publicly. The crowds of the internet eagerly jump into any drama, vague or not, and balloon it regardless.
If you google his name, 80% of the results are articles about how he denied doing that on purpose.
Resurrecting a 4-month-old issue that evaporated in a day or two seems like poor form to me.
Also, I believe most of the responsibility for the negative behavior should be assigned to those actually engaging in it, not to the initial post. I understand others reasonably disagree (notably about the accusation and harassment).
Tbh, it sounds like you might have been personally affected? At any rate, I certainly don't condone a mob mentality.
I bring it up because of the unmissable parallels. Google are trialling a policy to see what will happen, but this incident shows already what can happen.
RbtB is a blog trusted by the HN crowd, and her vaguepost unexpectedly whipped up hysteria. It was only quelled by a post with more details the next day. Google Project Zero enjoys enormous levels of trust, intends to vaguepost as policy, and does not intend to post more details the next day to satisfy the mob.
It does not look good for volunteer maintainers to suffer an entire world of talentless clowns rifling through every commit and asking "is this the bug Project Zero found?"
Propagating the fix downstream depends on the release cycles of all downstream vendors. Giving them a heads-up will help with planning, but I doubt it will significantly impact the patching timeline.
It is far more likely that companies will get stressed that the public knows they have a vulnerability while they are still working to fix it. The pressure from these companies will probably shut this policy change down.
Also, will this policy apply to Google's own products as well?
Google's own products account for 3 of the 6 initial vulnerabilities listed under this new reporting policy on the linked reporting page.
At the same time, I think publicly sharing that some vulnerability was discovered can be valuable information to attackers, particularly in the context of disclosure on open source projects: it's been my experience that maintaining a completely hermetic embargo on an OSS component is extremely difficult, both because of the number of people involved and because fixing the vulnerability sometimes requires advance changes to other public components.
I'm not sure there's a great solution to this.
For customers, it also gives them leverage to contact vendors and ask politely for news on the patch.
(And to be clear: I see the benefit here. But I'm talking principally about open source projects, not the vendors you're presumably paying.)
There's always a usability / functionality vs security tradeoff
Why is this? Especially for smaller or more stable open-source projects, the number of commits in a 90-day period that have the possibility to be security-relevant are likely to be quite low, perhaps as low as single digits. So the specific commit that fixes the reported security issue is highly likely to be identified immediately, and now there's a race to develop and use an exploit.
As one example, a stable project that's been the target of significant security hardening and analysis is the libpng decoder. Over the past 3 months (May 1 - Jul 29), its main branch has seen 41 total commits. Of those, at least 25 were non-code changes, involving documentation updates, release engineering activities, and build system / cross-platform support. If Project Zero had announced a vulnerability in this project on May 1 with a disclosure embargo of today, there would be at most 16 commits to inspect over 3 months to find the bug. That's not a lot of work for a dedicated team.
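To illustrate that triage math, here is a minimal Python sketch. The commit data and the heuristics are invented for illustration (not real libpng history): a couple of cheap filters on changed paths and commit-message prefixes are enough to shrink a quarter's worth of commits down to a handful of candidates worth diffing by hand.

```python
# Hypothetical 90-day commit log: (sha, message, changed paths).
# These entries are made up for illustration, not real libpng commits.
commits = [
    ("a1b2c3d", "docs: update INSTALL instructions", ["INSTALL"]),
    ("d4e5f6a", "ci: bump macOS runner image", [".github/workflows/ci.yml"]),
    ("b7c8d9e", "fix bounds check in chunk length validation", ["pngread.c"]),
    ("e0f1a2b", "release: bump version", ["CHANGES", "png.h"]),
    ("c3d4e5f", "simplify palette handling", ["pngtrans.c"]),
]

# Message prefixes that almost never mark a security fix.
HOUSEKEEPING = {"docs", "ci", "release", "build", "chore"}

def security_relevant(message, paths):
    """Keep commits that touch C source and aren't pure housekeeping."""
    touches_code = any(p.endswith((".c", ".h")) for p in paths)
    housekeeping = message.split(":")[0] in HOUSEKEEPING
    return touches_code and not housekeeping

suspects = [sha for sha, msg, paths in commits
            if security_relevant(msg, paths)]
print(suspects)  # the short list an attacker would actually diff-review
```

Even this toy filter cuts 5 commits down to 2, mirroring the 41-to-16 reduction above; a dedicated team would then just read the surviving diffs.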
So now, do we delay publishing security fixes to public repos and try and maintain private infrastructure and testing for all of this? And then try and get a release made, propagated to multiple layers of downstream vendors, have them make releases, etc... all within a day or two? That's pretty hard, just organizationally. No great answers here.
This paragraph is very confusing: what data is meant by "this data"? If they mean the announcement that "there's something", isn't the timeline of disclosure already made public under the current reporting policy once everything has been opened up?
In other words, the date of the initial report is not new data? Sure, the delay is reduced, but it's not new at all, in contrast to what the paragraph suggests.
Closed source software doesn't get to benefit from the goodwill of the open source software community, which includes independent security researchers as well as orgs like P0.
I guess our disagreement can be distilled down to one question:
Why would an emphasis on closed source products help FOSS, and why would an emphasis on FOSS help closed source?
Because this seems backwards to me. Maybe it makes sense in public relations where vibes are more important than substance and nobody thinks for more than 100 milliseconds?
> I just stepped down as libxslt maintainer and it's unlikely that this project will ever be maintained again. It's even more unlikely with Google Project Zero, the best white-hat security researchers money can buy, breathing down the necks of volunteers.
I’ve not really made up my mind about what happened with libxml2, to be clear. Perhaps in this world some projects really are vulnerable enough that they deserve to die. But as we see, this can amount to punishing people who decide to take up, e.g., parsers as a hobby. And not doing that is something I value more highly than even the security of the software ecosystem as a whole.
Project Zero could also open a mailing list for trusted downstreams and publish newly found vulnerability announcements there.
The real goal seems to be to increase pressure on upstream, which in our modern times ranks lowest on the open source ladder: Below distributors, corporations, security pundits (some of whom do not write software themselves and have never been upstream for anything) and demanding users.
- Vulnerability Disclosure FAQ ( https://googleprojectzero.blogspot.com/p/vulnerability-discl... )
- Reporting Transparency ( https://googleprojectzero.blogspot.com/p/reporting-transpare... )