> But right now there are still no signed dependencies and nothing stopping people using AI agents, or just plain old scripts, from creating thousands of junk or namesquatting repositories.
This is as close as we get in this particular piece. So what's the alternative here exactly - do we want uploaders to sign up with Microsoft accounts? Some sort of developer vetting process? A curated lib store? I'm sure everybody will be thrilled if Microsoft does that to the JS ecosystem. (/s) I'm not seeing a great deal of difference between having someone's NPM creds and having someone's signing key. Let's make things better but let's also be precise, please.
Considering these attacks steal API tokens by running code on developers' machines, I don't see how signing helps; attackers will just steal the private keys and sign their malware with those.
https://duckduckgo.com/?q=site%3Atheonion.com+%22no+way+to+p...
Also, smaller package managers tend to learn from these attacks on npm, and by the time the malware authors try to use similar types of attacks on them the registries already have mitigations in place.
Ruby has had signed gems since v2 [2].
These aren't a panacea. But they do mean an effort has been made.
npm has been talking about maybe doing something since 2013 [3], but ended up doing... Nothing. [4]
I don't think it's fair to compare npm to the others.
[0] https://docs.pypi.org/attestations/producing-attestations/
[1] https://docs.pypi.org/trusted-publishers/
[2] https://docs.ruby-lang.org/en/master/Gem/Security.html
https://docs.npmjs.com/trusted-publishers
https://docs.npmjs.com/generating-provenance-statements
Trusted Publishing is relatively new - GA-ed in July https://github.blog/changelog/2025-07-31-npm-trusted-publish...
In a very real sense, it shifts responsibility to someone else. For example, if the uploader was using Google as their identity provider and their Google account was popped, the attacker would be able to impersonate the uploader. So I wouldn’t describe it as establishing a strong trust relationship with the uploader.
It only meaningfully improves the security of the NPM ecosystem if (a) everyone is forced to sign packages and (b) identity providers require more secure authentication methods such as hardware tokens or passkeys.
"go get" doesn't execute downloaded code automatically; there's no "postinstall" script (there can be a manual "go generate" or "go tool" the user may run)
Go doesn't upgrade existing dependencies automatically, even when adding a new dependency: you need an explicit "go get -u"
You don't use the same tool to both fetch and publish ("go get" vs "git push") so it's less likely a module publisher would get pwned while working on something unrelated.
The Go community tends not to "micropublish" so fewer people have or need commit rights to published go modules.
Go has a decent standard library so there are fewer "missing core functionality" third-party packages that world + dog depends on.
Npm is easier to pwn than Go, Maven, RubyGems, PyPI, CPAN, etc. because the design has more footguns and its community likes it that way
I'll add a 6th difference: "go get" downloads source code, not maintainer-provided tarballs. You can't sneak extra things in there that aren't in the code repo.
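To make the contrast concrete, here is a hypothetical demonstration of the npm behavior the list above is criticizing: any package can declare a lifecycle script that npm runs automatically at install time. The package name and script are illustrative only; nothing here is a real package.

```shell
# Write a minimal manifest that declares a postinstall hook.
# npm would run this script automatically on `npm install`;
# `go get` has no equivalent mechanism.
mkdir -p /tmp/postinstall-demo
cat > /tmp/postinstall-demo/package.json <<'EOF'
{
  "name": "innocuous-lib",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "echo 'arbitrary code ran at install time'"
  }
}
EOF
grep -q '"postinstall"' /tmp/postinstall-demo/package.json \
  && echo "manifest declares a postinstall hook"
```

That one manifest line is the whole attack surface: whoever can publish a new version can run code on every machine that installs it.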
That’s why the joke doesn’t really work: America is a huge outlier for gun violence because we lack structural protections. Australia doesn’t have fewer attacks in proportion to a smaller population, they have a lower rate of those attacks per-capita because they have put rules in place to be less of a soft target.
It's literally just a joke. If it tickles your fancy, it works for you. If you get lost in the weeds of comparing the socio-political mechanisms of open source to guns, or note that supply chain attacks happen to other package managers, the joke won't work for you.
I assure you, it works just fine for me even though yes I think it would be ridiculous to claim there's anything more to the comparison than, "This thing keeps happening, nobody thinks doing anything about it is worth the bother, so look at that, it keeps happening."
The issue is the people who use npm and choose to have 48 layers of dependencies without paying for anything. Blaming Microsoft, a company that pays engineers and audits all of its code and dependencies, is a step in the wrong direction on the necessary path of self-reflection about npm vulnerabilities.
I reckon the ecosystem would have been much healthier had npm been given the care it requires, instead of just being kept running.
Microsoft isn’t any better steward than the original teams.
This issue has happened plenty under Microsoft's ownership.
Would love to hear your genius solutions right here that Microsoft is too dumb to come up with and implement.
Future attacks would then require a level of access that's already synonymous with "game over" for all intents and purposes (e.g. physical access, malware, or an inside job). It's not bulletproof, but it would be many orders of magnitude better than the current situation.
That phishing email is just one of the ways attackers infiltrate, and it's not Microsoft's problem to begin with. Next time, the attacker could install malware on your machine that silently runs code and publishes a package on your behalf using your own locally stored credentials while you think everything is fine, and you'd still blame Microsoft for not doing enough.
I already addressed this in the previous comment but I hope you realize the absurdity of this statement. If the attacker can corner you in a dark alley they can steal your yubikey and beat the PIN out of you, too. By that logic is 2FA futile and should we all stop using it?
Security isn't binary, simply raising the bar from falling for a phishing email to gaining access to someone's machine will probably eliminate 99% of all compromises.
> and you'd still blame Microsoft for not doing enough
Gaining access to someone's machine is a definitive "game over" scenario; using that as an excuse not to harden security to the point where that's the only option left is lazy and irresponsible. Even with that kind of access, code signing would slow the viral spread way down, which would make a difference.
Once you make it hard to hijack packages, time will be better spent on investing in sandboxing which also protects people from insider threats.
Hilarious that you think this is some sort of impossible feat for a trillion-dollar company.
The tradeoff for security is usability and the worse the usability gets the more people fight back against it.
Microsoft already owned GitHub. I don't see how acquiring npm would make a meaningful difference with respect to training material, especially since npm was already an open package repository which anyone could download without first buying the company.
Sure, there are other ways for the package maintainer to notice they were pwned, but often they will not notice.
https://docs.renovatebot.com/configuration-options/#minimumr...
You can probably get a list of the repos with a github API or something.
Git clone with org admin user credentials (can be read only) so you have access to all the repos.
Run grep on all package.json files, searching for the affected packages across all of the repos.
No need to write any code to handle versions; just filter the list down and process versions manually if needed. If you have any of these packages, no matter the version, you should already be making efforts to migrate: kill the baby with the bathwater, cut off the arm before the gangrene spreads. At any rate, you can check versions manually after you have filtered things down to something reasonable; part of automating is knowing when to stop.
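A sketch of the grep step described above, assuming the org's repos have already been cloned under one directory (e.g. via the GitHub API plus `git clone` with read-only credentials). The package names shown are illustrative placeholders, not the full compromised list.

```shell
# Scan every checked-out repo's package.json for affected package names.
# REPOS_DIR and AFFECTED are assumptions for this sketch.
REPOS_DIR="${REPOS_DIR:-/tmp/org-repos}"
AFFECTED='@ctrl/tinycolor|angulartics2|ngx-color'
mkdir -p "$REPOS_DIR"
# Print one line per repo directory that references any affected package.
grep -RlE "\"($AFFECTED)\"" "$REPOS_DIR" --include=package.json |
  sed 's|/package.json$||' | sort -u
```

Every path it prints is a repo to triage, regardless of the pinned version.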
Proxy NPM with something like Artifactory which stops the bad package getting back in or ending up in any new builds.
Follow it up with endpoint protection to weed the package out of the local checked out copies and .npm on the individual dev boxes.
but only npm started with a desire to monetize it (well, npm and Docker Hub), and in its desire for control it didn't implement (or allow the community to implement) basic hygiene.
1. Switch to pnpm. It's not only faster and more space-efficient, it also disables post-install scripts by default. Very few packages actually need those to function; most use them for spam and analytics. When you install packages into a project for the first time, it tells you which post-install scripts were skipped and how to whitelist only the ones you need. In most projects I don't enable any, and everything works fine. The "worst" projects required allowing two scripts, out of a couple dozen or so.
They also added this recently, which lets you introduce delays for new versions when updating packages. Combined with `pnpm audit`, I think it can replace the last suggestion of setting up a helper dependency bot with zero reliance on additional services, commercial or not:
https://pnpm.io/settings#minimumreleaseage
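For illustration only, here is roughly what that delay setting looks like; consult the linked pnpm settings page for the exact key name, file location, and units. This sketch assumes the setting lives in `pnpm-workspace.yaml` and is expressed in minutes.

```shell
# Append a minimum-release-age setting to the workspace config
# (assumed file and units; verify against the pnpm docs above).
cat >> pnpm-workspace.yaml <<'EOF'
# refuse versions published less than ~7 days ago
minimumReleaseAge: 10080
EOF
grep -q minimumReleaseAge pnpm-workspace.yaml && echo "delay configured"
```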
2. If you're on Linux, wrap your package managers into bubblewrap, which is a lightweight sandbox that will block access to almost all of your system, including sensitive files like ~/.ssh, and prevent anything running under it from escalating privileges. It's used by flatpak and Steam. A fully working & slightly improved version was posted here:
https://news.ycombinator.com/item?id=45271988
I posted the original here, but it was somewhat broken because some flags were sorted incorrectly (mea culpa). I still prefer using a separate cache directory instead of sharing the "global" ~/.cache because sensitive information might also end up there.
https://news.ycombinator.com/item?id=45041798
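A minimal sketch of the bubblewrap idea (the linked comments have a fully working version): wrap npm in a script so `./npm-sandboxed install` runs with the home directory hidden behind a tmpfs and only the project directory writable. The flags used are standard bwrap options; paths and the wrapper name are illustrative and will need adjusting for your distro.

```shell
# Write a wrapper that runs npm inside a bubblewrap sandbox.
cat > npm-sandboxed <<'EOF'
#!/bin/sh
exec bwrap \
  --unshare-all --share-net \
  --die-with-parent \
  --ro-bind /usr /usr \
  --symlink usr/bin /bin \
  --symlink usr/lib /lib \
  --proc /proc --dev /dev \
  --tmpfs /tmp \
  --tmpfs "$HOME" \
  --bind "$PWD" "$PWD" \
  --chdir "$PWD" \
  npm "$@"
EOF
chmod +x npm-sandboxed
sh -n npm-sandboxed && echo "wrapper script parses"
```

The key lines are `--tmpfs "$HOME"` (so ~/.ssh and other secrets simply don't exist inside the sandbox) followed by `--bind "$PWD" "$PWD"` (so only the current project is visible and writable).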
3. Set up Renovate or any similar bot to introduce artificial delays into your supply chain, but also to fast-track fixes for publicly known vulnerabilities. This suggestion caused some unhappiness in the previous discussion for some reason — I really don't care which service you use; this is not an ad. Just set up something to track your dependencies, because you will forget to. You can fully self-host it; I don't use their commercial offering — never have, don't plan to.
https://docs.renovatebot.com/configuration-options/#minimumr...
https://docs.renovatebot.com/presets-default/#enablevulnerab...
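Together, the two options linked above look roughly like this in a repo-level Renovate config; the values are illustrative.

```shell
# Write a renovate.json that delays ordinary updates by a week
# but keeps vulnerability-driven updates fast-tracked.
cat > renovate.json <<'EOF'
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "minimumReleaseAge": "7 days",
  "vulnerabilityAlerts": { "enabled": true }
}
EOF
echo "renovate config written"
```

`minimumReleaseAge` means a freshly compromised release sits in quarantine for a week before the bot will propose it, which is long enough for most npm incidents to be detected and yanked.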
4. For those truly paranoid or working on very juicy targets, you can always stick your work into a virtual machine, keeping secrets out of there, maybe with one virtual machine per project.
1. Documentation is virtually nonexistent. I think that is inexcusable for a security tool!
2. The man page says that it's deprecated, and has done for around a decade. No news on when they will actually remove it, maybe they never will? Hard to recommend it with that axe hanging over it though.
>Hard to recommend it with that axe hanging over it though.
Given that the alternative is no way to limit untrusted tooling at all today, it seems worthwhile to use it despite these problems?
There's also a (very slim) chance that, if it became central to the security of developers on macOS, Apple would give it slightly more consideration.
0: https://chromium.googlesource.com/chromium/src/+/refs/heads/...
in pnpm docs it says:
""" enablePrePostScripts Default: true Type: Boolean When true, pnpm will run any pre/post scripts automatically. So running pnpm foo will be like running pnpm prefoo && pnpm foo && pnpm postfoo. """
Am I missing something here?
Would love to see some default-secure package management / repo options. Even a 24-hour delayed mirror would be better than what we have today.
    find . -name package.json -not -path "*/node_modules/*" -exec sh -c '
      for pkg; do
        lock="$(dirname "$pkg")/package-lock.json"
        [ -f "$lock" ] || continue
        tmp="$(mktemp)"
        jq --argfile lock "$lock" "
          .dependencies |= with_entries(.value = \$lock.dependencies[.key].version)
          | .devDependencies |= with_entries(.value = \$lock.dependencies[.key].version // \$lock.devDependencies[.key].version)
        " "$pkg" > "$tmp" && mv "$tmp" "$pkg"
      done
    ' sh {} +
What does this actually achieve?
Whether that's so important, I'm not sure.
So, you pin the version and update periodically when security issues arise in your dependencies.
This is indented
By two spaces

My read of your article is that you don't like postinstall scripts and npx.
I'm not convinced that removing those would have a particularly major impact on supply chain attacks. The nature of npm is that it distributes code that is then executed. Even without npx, an attacker could still release an updated package which, when executed as a dependency, steals environment variables or similar.
And in the meantime, discarding both would break the existing workflows of practically every JavaScript-using developer in the world!
You mention 2FA. npm requires that for maintainers of the top 100 packages (since 2022), would you like to see that policy increased to the top 1,000/10,000/everyone? https://github.blog/security/supply-chain-security/top-100-n...
You also mention code signing. I like that too, but do you think that would have a material impact on supply chain attacks given they start with compromised accounts?
The investment I most want to see around this topic is in sandboxing: I think it should be the default that all code runs in a robust sandbox unless there is a very convincing reason not to. That requires investment that goes beyond a single language and package management platform - it's something that needs to be available and trustworthy for multiple operating systems.
The biggest problem with npm is that it is too popular. Nothing else. Even if you "mitigate" some of the risks by removing features like postinstall, it barely does anything at all -- if you actually use the package in any way, the threat is still there. And most of what we see recently could happen to crates.io, pypi etc as well.
It is almost frustrating to see people who don't understand security talk about security. They think they have the best, smartest ideas. No, they don't, otherwise they would have been done a long time ago. Security is hard, really hard.
npm could add this as an automated step during publishing. Sure, manual review is needed for anything flagged, but you could easily fix that too by having something like a trusted-contributor program where, say, five votes are needed to overrule a package being flagged as malware.
2FA isn't a solution to security; it's a measure to hinder and dissuade low-effort hackers from compromising accounts. It's still subject to social engineering (like spearphishing).
I tend to agree with your broader point - sandboxing will be the way to go; I've been having that very discussion today. The problem, of course, is that it requires the OS vendors to agree on what is common. We're also now enforcing CI pipelines with pinned dependencies (as we do with our Helm charts), but npm by default installs with ^ semver ranges, and putting it on the developer to disable that isn't good enough.
This is a riff - not sure how possible this is, but it's not coming from nowhere, it's based on work I did 8 years back (https://github.com/takeoff-env/takeoff) - using a headless OS container image with a volume pointing to the source folder, run the install within the container (so far so good, this is any multi-stage docker build)
The key part would be to then copy the node_modules in the volume _data folder back to the host - this would likely require the OS vendors to provide timely images with each release of their OS to handle binary dependencies, so it's likely a non-starter for macOS.
I know there are some reports about the lockfile not always working as expected. Some of those reports are outdated info from around 2018 that is simply not true anymore; some are due to edge cases like somebody on the team having an outdated version of npm, or installing a package but not committing the lockfile changes right away. Whatever the reason, pinned version ranges wouldn't protect against that. Using npm ci instead of npm install would.
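The two practices mentioned here reduce to a one-line config change and a CI habit. `save-exact` is a real npm config key that makes new installs record exact versions instead of ^ ranges; `npm ci` installs exactly what package-lock.json says and errors out if it disagrees with package.json instead of quietly re-resolving.

```shell
# Pin exact versions for all future `npm install <pkg>` invocations
# by enabling save-exact in the project-level .npmrc.
echo "save-exact=true" >> .npmrc
# In CI pipelines, replace `npm install` with:
#   npm ci
grep -q 'save-exact=true' .npmrc && echo "exact pinning enabled"
```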
That's sort of the thing - all of these measures are just patches on the fundamental problem that npm has just become too unsafe
This helps with localized risk, and some production risk - but not all of it.
NPM packages have become a huge nuisance security-wise.
Another huge pet peeve of mine is how hard it is to have a good pipeline to build containers without running as root - we tried some of the alternatives but they all had drawbacks; easiest is to just use GitHub's Ubuntu images and hope for the best (although I recently saw some improvement in this area we want to investigate).
At the time, I was focusing more on the approach of reducing the number of people you have to trust when you depend on a particular package.
* The endless arms race.
But, never mind. It's been 2 years since Jia Tan, and the number of such 'occurrences' in the npm ecosystem in the past 10 years is bordering on uncountable at this point.
And yet this hack got through? This amateurish and extremely obvious attempt? The injected function was literally named something like 'raidBitcoinEthWallet' or whatnot.
npm has clearly done absolutely nothing in this regard.
We haven't even gotten to the argument of '... but then hackers will simply use the automated tools themselves and only release stuff that doesn't get flagged'. There's nothing to talk about; npm has done nothing.
Which gets us to:
* Web of trust
This seems to me to be a near perfect way for the big companies that have earned so, so much money using the free FOSS they rely on, to contribute.
They spend the cash to hire a team that reviews FOSS stuff. Entire libraries, sure, but also updates. I think most of them _already do this_, and many will even openly publish issues they found. But they do not publish negative results (they do not publish 'our internal team had a quick look at update XYZ of project ABC and didn't see anything immediately suspicious').
They should start doing that. And repos like npm, Maven, CPAN, etcetera should allow either the official maintainer of a library, or anybody, to attach 'assertions of no malicious intent'.
Imagine that npm hosts the following blob of text for NPM hosted projects in addition to the javascript artefacts:
> "I, google dev/security team, hereby vouch for this update in the senses: not-malicious. We have looked at it and did not see anything that we think is suspicious or worse. We make absolutely no promises whatsoever that this library is any good or that this update's changelog accurately represents the changes in it; we merely vouch for the fact that we do not think it was written with malicious intent. We explicitly disavow any legal or financial liability with this statement; we merely stake our good name. We have done this analysis on 2025-09-17 and did it for the artefact with SHA512 hash 1238498123abac. Signed, [public/private key infrastructure based signature, google.com].
And a general rule that google.com/.well-known/vouch-public-key or whatever contains the public key so anybody can check.
Aside from Jia Tan/xz (the exception that stops any attempt at this claim; Jia Tan/xz was so legendary that exactly how the fuck THIS still happens, given that massive wake-up call, boggles my mind!), every supply chain attack was pretty dang easy to spot. The problem was that nobody was looking, and everybody configures their build scripts to pick up point updates immediately.
We should update these to 'pick them up after a certain 'vouch' score is reached'. Where everybody can mess with their scoring tables (don't trust google? reduce the value of their vouch to 0).
I think security-crucial 0day fixing updates will not be significantly hampered by this; generally big 0-days are big news and any update that comes out gets analysed to pieces. The vouch signatures would roll in within the hour after posting them.
Not every project and team can do that. But when feasible, it's a strong mitigation layer.
What worked was splitting dependency diff review among the team so it's less of a burden. We pin exact versions and update judiciously.
You are doing the work. These automatic library installing services seem to have a massive free-rider problem.
ESLint would be another culprit, adding 80 packages.
It quickly gets out of hand.
To me it seems like very few projects could use the approach you described.
https://github.com/lukeed/uvu is a testing library with almost no dependency.
https://github.com/biomejs/biome is a linter written in Rust which in theory has a smaller attack surface.
And as long as you stay some versions behind bleeding edge, you can use time in your favor to catch supply chain attacks before they reach your codebase.
Maybe you can.
Or are you talking about an approach you used in some side projects rather than in moderately sized commercial web applications? I don't imagine there are many out there that have fewer than hundreds of dependencies.
Just because the project is large doesn't mean we should give up on reducing dependencies.
Hundreds is much better than thousands.
But occasionally I'll use Vitest as well, which has the same API as Jest and is much simpler. Especially if Vite is already being used. It has a much smaller dependency tree.
npm as designed really aggressively likes to upgrade things, and the culture is basically to always blindly upgrade all dependencies as high as possible.
It's sold as being safer by patching vulnerabilities, but most "vulnerabilities" are very minor or niche, whereas a lot of risk is inherent in a shifting foundation.
Like it or not it's kind of a cultural problem. Recursively including thousands of dependencies, all largely updating with no review is a problem.
The thing I find particularly frightful, and distinctive from the other package managers I regularly use, is that there is zero guarantee that the code a library presents on GitHub has anything to do with its actual content on npm. You can easily believe you've reviewed a package's code by looking at it on GitHub, but that can have absolutely zero relation to what was actually uploaded to npm. You have to review what was actually uploaded to npm as an entirely disconnected artifact.
Crates.io and several other popular package registries have the exact same problem. Submitted packages are essentially a blob of loose files, with the source repository being mere metadata provided by the uploader (or attacker!)
The logic behind this is that not every package comes from a source repository that is based on Git and there may not be a convenient and trustworthy "web link" back to the matching commit. Some SCM systems don't even have cryptographically hashed commits with the same level of "stability" as a Git commit id!
IMHO all such public package repositories should do their own Git hosting for the package file contents. I.e.: to publish you'd have to push your code to their repo instead of uploading files.
Ideally they should also scan all uploads in various ways, run reproducible builds for platforms where that makes sense, etc...
Shai-Hulud malware attack: Tinycolor and over 40 NPM packages compromised
1. "2FA doesn't work". Incorrect. MFA relying on SMS or TOTP is vulnerable to phishing. Token or device based is not. And indeed GitHub sponsored a program to give such tokens away to critical developers.
In 2021.
2. "There's no signing". Sigstore support shipped in like 2023.
The underlying view is that "Microsoft isn't doing anything". They have been. For years. Since at least 2022, based on my literal direct interactions with the individuals directly tasked to do the things that you say aren't or haven't been done.
I have no association with npm, GitHub or Microsoft. My connection was through Shopify and RubyGems. But it really steams me to see npm getting punched up with easily checked "facts".
Well said.
cube00•4mo ago
The companies? More like the unpaid open source community volunteers whom the Fortune 500 leech off, contributing nothing in return except demands for free support, fixes and more features.
delduca•4mo ago
austin-cheney•4mo ago
pavel_lishin•4mo ago
Can you say more about this?
austin-cheney•4mo ago
tcoff91•4mo ago
With just a bit of retraining those engineers that could not be productive without a ton of npm packages could ship an iPhone app written in Swift.
JS’ standard library is abysmal.
austin-cheney•4mo ago
It's just a software platform. Would you really blame society for being too harsh if doctors, lawyers, police, or teachers cannot do their jobs? It is weird to see so many people blame the web platform for hostility when it's so much less challenging than it used to be.
The most common cause of these frustrations I encountered while working in JavaScript is that developers are educated in something that looks like A, but JavaScript is not A, and there is no training for JavaScript/the Web; therefore JavaScript/the Web is "hostile". As a self-taught developer, that never made sense to me.
marcosdumay•4mo ago
jlaternman•4mo ago
tcoff91•4mo ago
Without libraries it’s incredibly hard to be productive building applications. It’s only with dependencies that the web becomes an acceptable application platform.
Look at how much JS it takes to implement a material-ui textfield that automatically grows and shrinks. Building a date picker is a pain in the ass. Making sure those things follow all the arcane aria standards to be accessible is difficult. There’s no good reason why everyone should have to rebuild their own date picker.
Without libraries the web is the hardest application platform to use by far if you are trying to build actual apps and not just websites with content.
austin-cheney•4mo ago
I would do that with CSS.
> Building a date picker is a pain in the ass. Making sure those things follow all the arcane aria standards to be accessible is difficult.
If you want to display a visual calendar then yes, mostly. However, if instead you make date picking relative to now then it becomes very simple. It’s just adding or subtracting numbers from Date.now(). You can even produce date spans super easily.
I understand where you are going. When everything starts from a visual UI perspective, the code is just an implementation detail, except that it's dense. If instead you start from the implementation perspective of how it really works at the lowest level, then everything just appears step by step. Nobody starts building beautiful skyscrapers from the visual exterior first. No, they lay the foundation, a boring slab of concrete around some grounding poles.
pavel_lishin•4mo ago
austin-cheney•4mo ago
The actual engineers just walk around to validate the work conforms to the written plans. That is why these large engineering companies prefer to hire only from labor unions in a location that is extremely anti-union, because the union has the social resources to validate the candidates in ways the employer does not.
Even in that environment there are more experienced people who are 10x producers.
em-bee•4mo ago
user34283•4mo ago
austin-cheney•4mo ago
But the fact the concerns of superiority come up so repeatedly just serves to illustrate how much insecurity is baked into the workforce. Confident people don’t worry about high confidence in other people.
giantg2•4mo ago
rkagerer•4mo ago
MrGilbert•4mo ago
I'd erase that part entirely, as it is not true from my point of view. My day, like every other person's, has exactly 24 hours. As an employee, part of that time is dedicated to my employer. In return, I receive financial compensation. It's up to them to decide how they want to spend the resources they acquired. So yes, each and every company could, in theory, contribute back to Open Source.
But as there is no price tag attached to Open Source, there is also no incentive. In a highly capitalized world, where shareholder value is worth more than anything else, only a few companies make the right call and act responsibly.
watwut•4mo ago
It is not just that. In a well functioning theoretical free market, no one is going to have time either. The margins are supposed to end up being tight and the competition is supposed to weed out economic inefficiency. Voluntary pro-social behavior is a competitive disadvantage and an economic inefficiency. So, by design, the companies end up not "having time for that".
You need a world that allows for inefficiency and rewards pro-social behavior. That is not the world where we are living in currently.
Ajedi32•4mo ago
watwut•4mo ago
Second, working a job is about earning money, not about helping others.
Ajedi32•4mo ago
My point is if you explicitly choose to work for free you're opting out of that reward structure. It seems odd to do that and then turn around and complain that "the world where we are living in" isn't rewarding you for your work.
FuriouslyAdrift•4mo ago
giantg2•4mo ago
bonoboTP•4mo ago
cap11235•4mo ago
tedggh•4mo ago
watwut•4mo ago
clbrmbr•4mo ago
Not a lot of applications are being maintained by altruists, but look under the hood of Linux/GNU/BSD and you will find a lot of volunteers motivated by something other than money.
izacus•4mo ago
xrisk•4mo ago
clbrmbr•4mo ago
graemep•4mo ago
I think there are very few projects that do not accept support in any form.
izacus•4mo ago
For a lot of single developers that's not a thing they're ready or able to do. Those that can, usually have companies established as a revenue source for their OSS project.
pessimizer•4mo ago
The need for this invoice is because companies cannot justify irrational spending. They have no process for gift-giving. There is almost nothing that will make spending on OSS not irrational, unless you're paying for specific bugfixes or customization work. You can't issue an invoice for nothing. How much would the invoice be for?
edit: that being said, please continue to make up any pretense to get OSS contributors paid if that's working for anyone.
Arch-TK•4mo ago
clbrmbr•4mo ago
cube00•4mo ago
graemep•4mo ago
austin-cheney•4mo ago
Most of the Linux foundation projects, which includes Node are volunteers. Most of the Apache foundation software is from volunteers. Most NPM packages are from volunteers. OpenSSL is volunteers.
There is also a big difference between the developers who are employees on salary versus those that receive enough donations to work in open source full time.
watwut•4mo ago
The survey found that Linux code specifically is dominated by people who are paid for it.
> Most of the Apache foundation software is from volunteers.
Large Apache projects specifically are backed by companies per Apache rules. Each project must have at least three active backing companies. They contribute most of the code.
throw-qqqqq•4mo ago
Yes the kernel code, but the Linux Foundation projects (mentioned in the comment you quote) are MUCH more than the kernel.
See the list on https://www.linuxfoundation.org/projects
izacus•4mo ago
davedx•4mo ago
josephg•4mo ago
I think there are a lot of high profile opensource projects which are either run by corpos (like React) or have a lot of full time employees submitting code (Linux). But there’s an insanely long tail of opensource projects on npm, cargo, homebrew etc which are created by volunteers. Or by people scraping by on the occasional donation.
watwut•4mo ago
josephg•4mo ago
watwut•4mo ago
bonoboTP•4mo ago
"unpaid volunteer working full time" also doesn't sound like something that someone would believe. Full time and unpaid rarely go together.
rs186•4mo ago
ricardobeat•4mo ago
It’s also ok to release paid free software, or closed software, restrictive licenses, commercial licenses, and sell support contracts. It’s a choice.
sarchertech•4mo ago
There’s also lot of pressure for devs not to use licenses that restrict use by large companies. Try adding something to your license that says companies making over $10 million per year in revenue have to pay, and half of the comments on show HN will be open source warriors either asking why you didn’t use a standard license or telling you that this isn’t open source and you have brought dishonor to your family.
ricardobeat•4mo ago
This implies some kind of fairness/moral contract in a license like MIT. There is none. It’s the closest thing to donating code to the public domain, and entirely voluntary.
There are plenty of standard licenses with similar clauses restricting commercial use, no need to create a custom one.
But indeed, the truth is that a restrictive license will massively reduce the project’s audience. And that is a perfectly fine choice to make.
sarchertech•4mo ago
The license tells you what you are legally allowed to do. It doesn’t supersede basic concepts of fairness.
The average person would say that if you directly make millions of someone else’s work, the fair thing to do is to pay that person back in some way.
Calling someone a leech is just saying that they aren’t following the accuser’s model of fairness. That’s all. There’s no legal definition.
We say things like “my company screwed me over when they fired me right before my RSUs vested” despite that being perfectly legal.
ricardobeat•4mo ago
It is not “their” work anymore (IP rights discussions aside) once they published with an unrestricted license. That’s the point. You do it expecting nothing in return, and do it willingly. Expecting “fairness” is a misunderstanding of the whole spirit of it.
brookst•4mo ago
It comes about because “work” is overloaded to mean both the activity of creating and the product/result of that activity.
sarchertech•4mo ago
Let’s ignore that no one contributes to open source expecting nothing in return.
I can help someone out expecting nothing in return. Then if my situation changes and I need help, but they look at me and say “sorry your help was a gift so I’m not going to return the favor even though I can”. That person is a dick.
The problem is you are taking the act of applying a permissive license as some kind of ceremony that severs open source software from all normal human ideas of fairness. You may view it that way. Most people don’t.
It’s perfectly reasonable to put something out in the world for other people to enjoy and use, and yet still think that if someone makes a billion dollars off of it and doesn’t return anything, they are displaying bad manners.
Chris2048•4mo ago
It sounds like you did expect something in return, conditional on your circumstances. Maybe it's good-will or something, but some kind of social insurance in any case.
sarchertech•4mo ago
But in the example above it’s entirely possible that you helped someone out with no expectation of being paid back. Let’s say you’re rich and the person you helped is a chronic drug addict. You have no expectation of ever needing help and no expectation that the person you helped will ever be in a position to help you.
Let’s say I give a homeless person a dollar. He turns around and uses that dollar to buy a lottery ticket and wins 100 million dollars. Years later, I’m homeless and the former homeless guy walks past me and gives me a lecture about how I should have put conditions on my donation.
In that situation there was no reasonable expectation for anything except as you said maybe good will. But of course open source developers also expect good will.
Chris2048•4mo ago
Are you saying you are entitled to the winnings because you gave him the dollar, or because you gave him anything at all? Would you be happy to get $2 back?
If they spent the money on crack and overdosed, are you to blame?
nemomarx•4mo ago
As a bonus maybe you can get some proprietary software open sourced too.
fleebee•4mo ago
[1]: https://opensource.google/documentation/reference/using/agpl...
em-bee•4mo ago
putting aside the argument about how infectious the GPL is in general, the current AGPL is based on the GPL v3. it adds additional requirements. so how can it be less infectious than the GPL v3?
marcosdumay•4mo ago
Are you talking about promoting some software as open source when it's in fact not? Because yes, there's something wrong with that, you shouldn't do it, and people will rightfully react loudly if you try.
People don't complain about proprietary software honestly communicated as that.
sarchertech•4mo ago
If I license my software as MIT but with an exception that you can’t use it for commercial purposes if you make more than $100 million a year in revenue, that’s a lot closer to open source than proprietary.
We should be normalizing licenses that place restrictions on large corporations.
I think the world would be a much better place if we just changed the definition of open source to include such licenses. We don’t even really need to change the definition because normal everyday use of the term would already include them.
marcosdumay•4mo ago
If your software isn't open source, don't claim it is. You are free to try to normalize your licensing preferences. Even better if you have a nice name for them that doesn't try to mislead people into thinking they are something they clearly aren't.
> I think the world would be a much better place if we just changed the definition of open source to include such licenses.
You are free to think that. I'm quite certain it's not correct, but nothing stops you. Anyway, you can make a positive change in the world you actually live in by being honest and clear about what your license does, and communicating why you think it's a good thing.
Again, it's a huge plus if you pick a nice name that actually means what your license is.
> normal everyday use of the term would already include them
Normal and everyday use of "open source" does absolutely not include the licenses you are talking about.
sarchertech•4mo ago
>Open source is open source. There exists no gradient there.
Of course there does. If there wasn't a gradient there wouldn't be so many different licenses, there wouldn't be a huge debate on HN every time someone brings up the definition of open source, and there wouldn't be people constantly arguing about whether licensing requirements constitute restrictions.
OSI is just some group initially funded by Tim O'Reilly (but now by Google, MS, Intel and the rest of big tech) to co-opt the free software movement and turn it into something business friendly. They took an already-in-use phrase, built an ad campaign around it, and added a specific bullet point to the definition that said you can't restrict open source software from commercial use.
>Normal and everyday use of "open source" does absolutely not include the licenses you are talking about.
I guarantee you that the majority of people using the term "open source" are using something closer to the dictionary definition than the OSI definition. The average software developer has never even read the OSI definition.
The dictionary definition is
"denoting software for which the original source code is made freely available and may be redistributed and modified."
Nothing in this precludes banning Nazis, or large businesses, or certain governments from using it.
The OSI definition includes that, but adds some technical specifics--one of which is that there can't be "discrimination between uses, including commercial use".
If you want to use the phrase "OSI approved open source", that's fine, but there's a reason OSI doesn't have a trademark for the term. They don't own it. Tim O'Reilly can kick rocks with this word policing.
grafmax•4mo ago
So companies’ profit motives contribute to this mess not just through the exploitation of open source labor (as you describe) but through externalizing security costs as well.
stingraycharles•4mo ago
It’s my take that over the past decade or so a lot of these companies have been making things a lot better; Windows even requires Secure Boot these days.
snickerdoodle14•4mo ago
Secure Enclave = store the encryption keys to media in a place where you can't get them
Secure Boot = first step towards remote attestation so they can remotely verify you haven't modified your system to bypass the above
Advertising rules the world.
brookst•4mo ago
Is there such a thing as secure hardware that can prevent supply chain attacks (by enabling higher layers to guarantee security) and secure hardware that prevents copying data (by enabling higher layers to enforce copy restrictions)?
acdha•4mo ago
That’s the path out of this mess: not just trying to catch it on NPM, but moving sensitive data into OS-enforced sandboxes (e.g. Mac containers) so every process you start can’t just read a file and get keys; using sandboxing features in package managers themselves to restrict when new installs can run code and what they can do (e.g. changing the granularity from “can read any file accessible to the user” to “can read a configuration file at this location and data files selected by the user”); and tracking capability changes (“the leftpad update says it needs ~/.aws in this update?”).
We need to do that long-term, but it’s a ton of work since it breaks the general model of how programs work that we’ve used for most of the last century.
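To make the capability-tracking idea concrete: nothing like this exists in npm today, but a hypothetical manifest-level declaration (all field names made up) might look like this:

```json
{
  "name": "some-package",
  "capabilities": {
    "filesystem": ["./config.json"],
    "network": [],
    "postinstall": false
  }
}
```

An update whose diff suddenly adds "~/.aws" to "filesystem" would then be a loud, reviewable signal rather than a silent grab.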
acdha•4mo ago
Using OS features to restrict access to sensitive data similarly gives you another chance to detect a compromise because a denied operation to, say, read your wallet by an app which doesn’t need to is both highly visible and unambiguous.
felixgallo•4mo ago
The problem is coming from inside the house.
tanepiper•4mo ago
SaaS products don't enforce good security - I've seen some internally that don't have MFA or EntraID integration because they simply don't have those as features (mostly legacy systems these days, but they still exist).
I'm also an open-source author (I have the most used bit.ly library on npm - and have had demands and requests too), and I'm the only person you can publicly see on our [company github](https://github.com/ikea) - there's reasons for this - but not every company is leeching, rather there is simply no other alternative.
ants_everywhere•4mo ago
A lot of the spread of Shai-Hulud is due to developers having overly broad credentials on NPM, GitHub and elsewhere. It's not that NPM doesn't support scoped credentials, it's that developers don't want to deal with them, so they're not the default. There's no reason why, for example, a developer needs a live credential to publish their package when they're just hacking on code.
This is related to the `curl | bash` pattern. Projects like NPM want to make it easy to get started and hard to reach a failure case so they sacrifice well-known security practices during the growth phase.
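For what it's worth, npm does already let you narrow classic tokens from the CLI (granular, per-package tokens are configured on the npmjs.com site instead), it's just opt-in:

```shell
# Read-only token for CI jobs that only install dependencies;
# it can't publish even if it leaks.
npm token create --read-only

# Additionally restrict which networks can use it.
npm token create --read-only --cidr=192.0.2.0/24

# Review and revoke tokens you no longer recognize.
npm token list
npm token revoke <token-id>
```

Which rather proves the point: none of this is the default, so most people never do it.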
pixl97•4mo ago
Security things will get hacked on later, but again it will cause all kinds of problems because the ecosystem wasn't built for it.
ants_everywhere•4mo ago
Yes they are, and it's hard to design good scopes especially when the project is new.
A better default might just be to have the write permission expire much more quickly than the read permission. E.g. the write token might be valid for an hour and the read token might be valid for 90 days.
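A toy sketch of that asymmetric-lifetime idea (the HMAC scheme and all names here are mine for illustration, not anything npm actually does):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"registry-signing-secret"  # illustrative only

def issue_token(scope: str, ttl_seconds: float) -> str:
    """Issue a signed token carrying a scope and an expiry time."""
    payload = json.dumps({"scope": scope, "exp": time.time() + ttl_seconds})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload.encode()).decode() + "." + sig

def check_token(token: str, required_scope: str) -> bool:
    """Reject tokens with a bad signature, wrong scope, or past expiry."""
    encoded, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(encoded).decode()
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False
    claims = json.loads(payload)
    return claims["scope"] == required_scope and claims["exp"] > time.time()

# Write access expires quickly; read access can live much longer.
write_token = issue_token("write", ttl_seconds=3600)      # 1 hour
read_token = issue_token("read", ttl_seconds=90 * 86400)  # 90 days
```

A stolen write token is then only useful for a narrow window, while day-to-day installs keep working.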
danudey•4mo ago
1. Show me all the permissions that that token has been granted but has never used
2. Show me all the permissions that that token has tried to use but does not have
I would gladly accept the ability to turn on an audit mode for a given token, service account, etc., run the thing I'm trying to run, and then go look at the report to see what permissions I can remove - or, even better, have a giant "Create role from this profile" that lets me create a custom set of permissions based on all of the permissions I've used.
Google Cloud does have a thing where it shows you all the service accounts you have with "overly broad permissions", but it seems to be just "here are all the SAs with 'owner' access" so far. It didn't catch the service account we had that just needed to publish one file to one bucket but had been made a Storage Administrator with full read/write/update/delete access to every form of storage in Google Cloud.
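The two reports boil down to set differences over an audit log; a rough sketch (the permission strings are made-up, GCP-flavored examples):

```python
def audit_token(granted: set, used: set, attempted: set):
    """Compare a token's granted permissions against observed usage.

    Returns three sets:
      never_used      -- granted but never exercised (candidates for removal)
      denied_attempts -- attempted but not granted (missing, or suspicious)
      minimal_role    -- the least-privilege role implied by actual usage
    """
    never_used = granted - used
    denied_attempts = attempted - granted
    minimal_role = granted & used
    return never_used, denied_attempts, minimal_role

granted = {"storage.objects.create", "storage.objects.delete",
           "storage.buckets.update"}
used = {"storage.objects.create"}
attempted = {"storage.objects.create", "storage.objects.get"}

never_used, denied, minimal = audit_token(granted, used, attempted)
```

"Create role from this profile" is then just `minimal_role`, assuming the audit window actually exercised everything the workload needs.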
_heimdall•4mo ago
I expect a company to put their current product in as good a light as they can. They're going to overpromise what it can do and show me the easiest "Getting Started" steps possible. It's up to me to dig deeper and understand what they actually do and what the right solution is for my project.
fergie•4mo ago
Although I see where you are coming from, dismissing unaudited libs as dangerous is slightly missing the point. In fact, the world is a safer place for their existence- the value lost by security exploits is insignificant compared to the value protected by the existence of the libs they exploit. Also, I suspect that you could replace "value" with "lives" in the previous sentence.
ants_everywhere•4mo ago
People who work on permissively licensed software are donating their time to these Fortune 500 companies. It hardly seems fair to call the companies leeches for accepting these freely given donations.
ants_everywhere•4mo ago
If so I think this is a good point. It's easy to see from any one open source project's perspective how a little help would go a long way. But it's really hard to see from the perspective of a company with a massive code base how you could possibly contribute to the ten gajillion dependencies you use, even if you wanted to.
People will say things like "Why doesn't Foo company contribute when they have the resources?" But from what I've seen, the engineers at Foo would often love to contribute, but no team has the headcount to do it. And acquiring the headcount would require making a case to management that contributing to that open source project is worth the cost of devoting a team to it.
em-bee•4mo ago
counterpoint: you don't need to actively contribute to all upstream projects, but you do need to be prepared to maintain, fix, or replace any dependency you have. if you can't do that, you should pay someone to do it. if you can't do that either then you should not be using the dependency in the first place.
yes, it can happen that you underestimate the resources needed for that, or that a project you use looked very stable and supported but suddenly you can't find anyone who has the knowledge to fix the issue you have. then that's simply bad luck. it can happen with company backed projects too. you need to deal with that. i have no sympathy if you can't.
48terry•4mo ago
They can use the software as they like, that's what the license is for. I don't recall a license or contract where I have to care about their problems, however.
If they depend on my software and it makes their product blow up in their faces and they're losing more money per minute than I'll ever make in my lifetime? Sucks to be them. I'll handle support or fixes when I very well feel like it, I'm off to play Silksong.
They can, of course, fix this attitude problem of mine by paying me.
psunavy03•4mo ago
You can't give permission for them to use the stuff for free and then accuse them of "leeching." If the expectation is contribution in kind, that needs to be in the license agreement.
thayne•4mo ago
But also, most OSS software is provided without warranty. Commercial companies should either be held accountable for ensuring the open source components they use are secure, or pay someone (either the maintainer directly, or a third-party distributor) to verify the security of those components.