This can be solved quite easily for open source extensions: https://github.com/EclipseFdn/open-vsx.org/wiki/Auto-Publish...
Vetting however is trickier. I hope Cursor can fund this effort!
As a vscode extension author, I am scared by the power I have. I am not at all surprised by what happened in this story.
https://github.com/microsoft/vsmarketplace/blob/main/Removed...
If I were doing this I would copy the real extension, give it a name that made it sound official but in the README say it is a tweaked version with some improvements or whatever. Also actually add some improvements, but hide the malware in those changes.
Good luck finding that. (brb going to try this)
But Cursor isn't off the hook. It wasn't a malicious copy of the IDE; it was the legitimate Cursor IDE distributing a package they allowed on their extension store. This is on them.
The lesson here is to not make a VS Code fork if you aren't able to maintain it the way Microsoft does. Move fast and break (the user's) things, I guess.
It seems that an attacker was able to easily manipulate download counts, placing their malicious extension high in search results.
And this is far from the first open-vsx vulnerability in the past month. See: https://blog.koi.security/marketplace-takeover-how-we-couldv... which describes how open-vsx was installing arbitrary packages and running their build scripts in a privileged environment.
And the instructions to report malicious extensions, even now, are practically nonexistent: https://github.com/EclipseFdn/open-vsx.org/wiki/Guidelines-o...
With billions of dollars being poured into this ecosystem, it's mind-boggling that security is being treated as such an afterthought. Consider this when choosing tools.
And for sure, Cursor and others should have funded security hardening of their extension marketplace; the lion's share of the blame lies with them. But the Eclipse Foundation is in a position to incentivize that investment by making it clear to end users that open-vsx is still at an experimental level of stability and security, rather than promoting it as an enterprise-ready product, white papers and all.
(And a fun but irrelevant bonus fact: Eclipse was originally made by IBM)
In any case, Cursor didn't pay any money here, so they get to keep all the pieces when the code they used for free breaks.
Now, do we need better solutions? Definitely, and I do hope Cursor will contribute towards them, but I won't hold them to it. They switched to Open VSX less than a month ago; it's too soon to really say much at this point.
Sure sounds like you are moving the goalposts. Of course I hope Cursor contributes back, but it's been 20 days, and since I'm not an insider I have no idea what the plan is.
If you're running code without reading it, that's on you.
The exploiter is evil. Cursor has no culpability here.
If you have to read it, then your system has already failed.
I want people to release cool software without the insane burden you describe. If they want to delegate that burden to users or ask them to pay for someone else to assume the burden, great.
I love Cursor. They haven't failed me. I'm not running arbitrary code and I suffer none of the consequences.
Furthermore, it probably literally says you're running random 3rd party code when you use extensions and Cursor is not liable. This is basic human responsibility 101. You are responsible for your own actions.
I trust Cursor isn't trying to screw me.
I don't trust random 3rd party extensions. They might be trying to screw me. This is the exact reason why I don't touch npm.
I'm not prescribing a formal set of rules by which you should or shouldn't trust things. I'm just a reasonable person.
Cursor is an unrelated 3rd party to this situation, which is probably clearly described in their Terms of Service. Blaming them reeks of denying responsibility for your own actions. If you want Cursor to audit every 3rd party extension, they'd probably want you to pay them for it. Just like every commercially licensed Linux distro.
It was a mistake that he installed the duplicate fraudulent extension. For all we know he could have checked the intended extension code line by line, and then went on to install the trojan horse extension by accident.
It is easier than ever to do a DIY malware analysis on the tools you use.
“Hi Claude - you are a security researcher and malware analyst. Analyze the FooBar Chrome Browser extension / git repository I just downloaded for security threats and provide me a report on whether this is OK to use”
I know browser / IDE extensions are not usually audited and approved by the tool owner unless specifically noted otherwise. Even phone apps can sneak stuff in. So I am careful to only install things I trust or will audit myself or am willing to take the risk on.
Again, it's the system's responsibility to make sure you don't fail, not your responsibility.
But it is true that certain types of developers will just download anything and integrate it into their development process. And it's also true that this would have been avoided by executing in a sandbox.
Until someone runs `cursor ~/.where_i_store_a_bunch_of_secrets` or maybe even `cursor ~/.bashrc`
It's reasonable to assume Cursor isn't trying to screw you over and you don't need to audit their code.
It's also reasonable to assume some of the arbitrary 3rd party extensions are trying to screw you over.
You don't have to be so rigid and extreme in your thinking. You can take the reasonable middle ground and make good guesses yourself.
Holding that much money on a machine that is not ultra secure is borderline insane.
It's similar to how many crypto businesses keep a hot wallet holding some fraction of their more secure cold wallet's funds, an amount they're okay losing.
Even if the compromise wasn't on the developer's machine, it could have enabled a supply chain attack post-deployment.
It wasn't even a Cursor-specific extension, it was a VS Code one. Completely misleading.
I naively assumed the extensions were 'sandboxed' to some degree.
Any extension has full access to execute programs as the user.
Your operating system might have some security measures in place.
Well, it's absolutely not, and extensions can access the full filesystem. That's handy if you're legit, but it's far more permissive, and far more of a security threat, than I imagined.
Be careful what extensions you install, people :)
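To make the point concrete: VS Code-style extensions run in an ordinary, unsandboxed process, so anything the logged-in user can do, the extension can do. A minimal Python sketch (illustrative only, not the actual extension host API) of the access level any extension effectively has:

```python
import subprocess
from pathlib import Path

def what_any_unsandboxed_plugin_can_do():
    """Illustrates the access level of an unsandboxed editor extension."""
    home = Path.home()

    # 1. Read anything the user can read, e.g. shell config or SSH keys.
    readable = [p.name for p in home.iterdir()]

    # 2. Spawn arbitrary processes as the user (here, a harmless echo).
    result = subprocess.run(
        ["echo", "running as the extension host user"],
        capture_output=True, text=True, check=True,
    )

    # 3. Exfiltrate: nothing stops a plugin from opening a network
    #    socket here and shipping step 1's findings off-box.
    return readable, result.stdout.strip()

files, output = what_any_unsandboxed_plugin_can_do()
print(output)
```

Swap the echo for a curl to an attacker's server and you have roughly what the malicious extension in the story did.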
I agree, this seems bad! Sandboxing is still a very weakly implemented craft for most applications, especially those that run extensions or plugins.
(I build a lot of software that runs plugins and has no sandboxing at all, and it really frustrates me. I'm constantly looking out for cross-platform Python-friendly sandboxing tech that might help with this in the future.)
voice of decades past -- sandboxing is very well known and deeply implemented in many aspects of ordinary daily computing; sandboxing is endlessly difficult and can be mis-applied; people who want to break into things and steal and wreak havoc ruin software environments for everyone else.
I actually tried running clickhouse in container2wasm and it crashed because it only had one CPU core, so YMMV—although that shouldn’t be a problem for Python (or any code custom built for your plugin framework).
For me, I want to avoid separate processes. I definitely want to avoid separate VMs.
[1] https://github.com/extism/python-sdk
I've become very paranoid about extensions as of late. It's great that LLMs have gotten so good at banging out personal tools; I am using a few home-grown extensions in my own setup.
Yes.
> I naively assumed the extensions were 'sandboxed' to some degree.
No. This is fairly obvious if you have used more than a few extensions - often they'll ask you to download and install binaries.
I honestly thought that was how the JavaScript and Python ecosystems worked? And surely many others.
EDIT: also students' unions apparently, which kinda makes sense
Uses Cursor. Downloads random extensions.
Brand new laptop, stock Ubuntu on it, nothing else. If you don't want to go the Qubes OS way.
I wouldn't feel particularly comfortable even having five figures of tradfi cash lying around in my house, let alone carrying it on my laptop where someone could steal my bag or machine. And that's before it's even connected to anything.
For small amounts, all these mobile/add-on/desktop wallets are fine (with minimal caution, like avoiding the reckless behavior described in the OP). For larger amounts, cold storage (of which hardware wallets are the easiest to deploy) will protect your funds.
When you put cash in your physical wallet, you accept that it could be lost to a robber in the street, with little to no recourse. You wouldn't put all your belongings in a big bag you carry everywhere you travel, or if you did, you would increase your security proportionally to the increased risk; if you don't, nobody will shed a tear over your losses.
Not sure how this is different with crypto. I guess people assume everything is safe by default because it has no physical form, despite the 20 warnings and security reminders they get when they set up any crypto wallet.
It's like trying to do vehicle maintenance while your car is running.
It might be technically possible... but why would you ever do that?
Ranking manipulation, using recency and inflated download counts, to outrank the legitimate Solidity package is a clever exploit of how developers search. It makes me wonder: should IDEs start validating package authorship or offer signed extensions as a default?
Also, the fact that this happened on a freshly imaged system with no antivirus suggests we need to rethink trust models for extension marketplaces. Not just for crypto devs, but for any industry sensitive to code integrity.
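On validating authorship: as far as I know, neither the VS Code marketplace nor Open VSX exposes anything like publisher pinning to clients today, so this is purely a hypothetical sketch of what a digest check could look like, with the digest obtained out of band rather than from the same listing the package came from:

```python
import hashlib
import hmac

def verify_extension(package_bytes: bytes, expected_sha256: str) -> bool:
    """Compare a downloaded extension package against a pinned digest.

    The pinned digest would come from a trusted channel (the publisher's
    site, a lockfile, or a marketplace-signed manifest), not from the
    same marketplace listing the package was downloaded from.
    """
    actual = hashlib.sha256(package_bytes).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(actual, expected_sha256.lower())

# Usage: the digest is pinned when the extension is first vetted.
pinned = hashlib.sha256(b"fake extension payload").hexdigest()
print(verify_extension(b"fake extension payload", pinned))  # True
print(verify_extension(b"tampered payload", pinned))        # False
```

Real signing would use asymmetric keys so the marketplace itself can't forge packages, but even checksum pinning would have caught a lookalike extension swapped in via search-ranking games.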
It is a social problem not a technical problem.
Most likely that includes your IDE?
Disclaimer: I’m not sure whether Cursor inherits iTerm’s permissions when launched from CLI. The TCC system is pretty mysterious to me in general.
Freeloading on (and blaming) volunteer infrastructure is irresponsible, especially when you have so much funding.
Apple did it 15 years ago, time for the rest to catch up. They can turn it into a business by offering enterprise subscriptions for higher guarantees or a warranty.
why would they start investing now when they can just continue to plunder the commons uninterrupted?
That goes for the AI industry itself, but equally for everyone using it.
Microsoft won when it found a way to extract software fees as a tax from hardware manufacturers.
FANG won when it found a way to extract software writing and hosting fees from advertisers, effectively making it a tax on everything you buy.
Both of these (operating systems and basic cloud services like email hosting) could be done a lot cheaper if they were paid for by end users, but end users just won't pay. In fact, for a while they were paid for by end users (Microsoft did that; gmx.net, infomaniak, ...). Then everyone switched to "free" and here we are.
And we all know there's no way back, so what's the point discussing it? We all know most people would just not have email or web search if they had to pay even $5 per year for it, and I seem to recall an article stating Google effectively earns over $100 per year per account.
Reality is: give it another 2 years and the "art, music, articles, newspapers, books and open source code" industries will reach absolutely nobody except through AI providers. That could be avoided if every creator paid $1 per year to have free infrastructure for their services, but there's no way in hell they will do that ... so here we are. In 2 years instead they'll pay $1000 every time they want someone to actually look at their art.
And yet, the situation with banking services is far worse, imho. So bad, in fact, that even charging $0.01 per year for internet services would be a nonstarter.
Concerning Apple, their review process is so harsh and unjust that I've seen startups abandon apps after months of work just because of it.
Maybe sandboxing and runtime-level permissions are a better compromise?
I agree. If you're going to fork vscode, it's not that much harder to add a sandbox. Even a docker container would be better than nothing.
Nonetheless, I think this is more a vulnerability on the Open VSX registry side than in Cursor. If anything, the forks and VS Code should block/sandbox extensions by default, or have a granular permission system so users can consciously choose whether to allow an extension to use network resources or not.
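Nothing like a granular permission system exists in VS Code's extension host today, so this is a hypothetical sketch of the idea: extensions declare capabilities in a manifest, and the host denies everything else by default:

```python
from dataclasses import dataclass, field

@dataclass
class ExtensionManifest:
    name: str
    # Capabilities the user approved at install time, e.g. {"network", "fs:read"}
    permissions: set = field(default_factory=set)

class ExtensionHost:
    """Deny-by-default broker between an extension and system resources."""

    def __init__(self, manifest: ExtensionManifest):
        self.manifest = manifest

    def request(self, capability: str) -> bool:
        if capability not in self.manifest.permissions:
            raise PermissionError(
                f"{self.manifest.name} has no '{capability}' permission"
            )
        return True

# A linter legitimately needs to read the workspace, but not the network.
host = ExtensionHost(ExtensionManifest("solidity-linter", {"fs:read"}))
host.request("fs:read")      # allowed
try:
    host.request("network")  # denied: exactly what stops exfiltration
except PermissionError as e:
    print(e)
```

Browsers and mobile OSes have shipped this model for years; the hard part for editors is that most existing extensions assume unrestricted access.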
Seems like software development industry in a nutshell: multi-millionaire companies freeloading on volunteer work :)
[0] https://code.visualstudio.com/docs/setup/enterprise#_configu...
Yet, the extension dilemma is also utterly shit. That's why I stay far away from "VSCode and friends"
I have code, passwords and certificates separated in virtual machines, even IDE GUI app is virtualized, and has no rights to access GitHub, internet or filesystem directly.
But I get a lot of flak from coworkers. They say it is unintuitive and uses an x86 CPU, which is uncool. Mac has no reasonable VM software or secure containers!
I do use Cursor at work and I have various extensions installed.
christophilus•4h ago
I do have a fair number of Neovim plugins on my host machine, and a number of Arch packages that I probably could do without.
I’ve considered keeping my host’s Neovim vanilla, but telescope is hard to live without.
aldur•4h ago
I started using throwaway environments, one per project. I try keeping the stuff installed in the host OS to the bare minimum.
For the things I need to run on the host, I try to heavily sandbox them (mostly through the opaque macOS sandbox) so that they cannot access the network and can only access a whitelist of directories. Sandboxing is painful and requires trial and error, so I wish there was a better (UX-wise) way to do it.
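The directory-whitelist part of that can be sketched in Python. This is just an illustration of the check an allowlist enforces, not how the macOS sandbox itself works (that's enforced by the kernel from a sandbox profile):

```python
from pathlib import Path

def open_if_allowed(requested: str, allowed_dirs: list[str]) -> Path:
    """Resolve the path (following symlinks and '..') BEFORE checking it;
    otherwise '../../etc/passwd'-style tricks escape the allowlist."""
    target = Path(requested).resolve()
    for root in allowed_dirs:
        try:
            target.relative_to(Path(root).resolve())
            return target  # inside an allowed directory
        except ValueError:
            continue  # not under this root, try the next one
    raise PermissionError(f"{target} is outside the sandbox allowlist")

# Example: the tool may only touch its own project directory.
allowed = ["/tmp/myproject"]
print(open_if_allowed("/tmp/myproject/src/../notes.txt", allowed))
```

The key design point is resolving before comparing; naive string-prefix checks on the raw path are a classic sandbox-escape bug.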
bravesoul2•4h ago
This, plus MCP, IoT-everything, vibe coding, AI impersonation for social attacks, and cryptocurrency rewards: it's a golden age for criminal hackers!
christophilus•25m ago
From what I saw of devcontainers, they basically grant access to your entire system (.ssh, etc). May be wrong. That’s my recollection, though.