But it’s safe to say that the interaction layer between the two is extremely painful. We have nicely modeled type-safe code in both the Rust and TypeScript worlds and an extremely janky layer in between. You need a lot of inherently slow and unsafe glue code to make anything work. Part of it is WASM related, part of it is wasm-bindgen. What were they thinking?
I’ve read that WASM wasn’t designed with frequent back-and-forth over the boundary in mind; that it’s better suited to running longer compute jobs in the background and bringing over some chunk of data at the end. Why create a generic bytecode execution platform and limit the use case so much? Not everyone is building an in-browser crypto miner.
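To make this concrete, here is a rough sketch of the kind of JS glue a string has to go through to reach a Wasm export. This is not wasm-bindgen's actual generated code; the `alloc` and `process` exports are hypothetical, just to show the shape of the boundary crossing:

    // Strings can't cross the boundary directly: copy UTF-8 bytes into
    // Wasm linear memory and pass a pointer + length instead.
    const encoder = new TextEncoder();

    function callProcess(instance: WebAssembly.Instance, input: string): number {
      const { memory, alloc, process } = instance.exports as {
        memory: WebAssembly.Memory;
        alloc: (len: number) => number;
        process: (ptr: number, len: number) => number;
      };
      const bytes = encoder.encode(input);
      const ptr = alloc(bytes.length);
      new Uint8Array(memory.buffer, ptr, bytes.length).set(bytes);
      return process(ptr, bytes.length);
    }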
The whole WASM story is confusing to me.
So on the one side you have organizations that definitely don't want to easily give network/filesystem/etc. access to code, and on the other side you have people wanting it to be easier to get this access. The browser is the main driving force for WASM, as I see it, because outside of the browser the need for sandboxing is limited to plugins (where Lua often gets used), since otherwise you can run a binary or a docker container. So WASM doesn't really have much impetus to improve beyond compute.
I don't think this is entirely fair or accurate. This isn't how Wasm runtimes work. Making it possible for the sandbox to explicitly request specific resource access is not quite the same thing as what you're implying here.
> The browser is the main driving force for WASM, as I see it
This hasn't been the case for a while. In your first paragraph you yourself say that 'the people furthering WASM are [...] building a whole new VM ecosystem that the browser people aren't interested in' - if that's the case, how can the browser be the main driving force for Wasm? It's true, though, that there's very little revenue in browser-based Wasm. There is revenue in enterprise compute.
> because outside of the browser the need for sandboxing is limited to plugins (where LUA often gets used) since otherwise you can run a binary or a docker container
Not exactly true when you consider that docker containers are orders of magnitude bigger, slower to mirror and start up, require architecture-specific binaries, are not great at actually 'containing' fallout from insecure code, supply chain vulns, etc. The potential benefits to enterprise orgs that ship thousands of multi-gig docker containers a week with microservices architectures that just run simple business logic are very substantial. They just rarely make it to the HN front page, because they really are boring.
However, the Wasm push in enterprise compute is real, and the value is real. But you're right that the ecosystem and its sponsorship are still struggling - in some part due to lack of support for the component model by the browser people. The component model support introduced in Go 1.25 has been huge though, at least for the (imho bigger) enterprise compute use case, and the upcoming update to the component model (WASI p3) should make a ton of this stuff way more usable. So it's a really interesting time for Wasm.
What are you talking about? Alpine container image is <5MB. Debian container image (if you really need glibc) is 30MB. wasmtime is 50MB.
If a service has a multi-gig container, that's due to stuff other than the Docker overhead itself, so it would be a multi-gig app under WASM too.
Also, Docker images get overlaid. So if I have many Go or Rust apps running on Alpine or Debian as simple static binaries, the 5MB/30MB base system only exists once (same as one wasmtime binary running multiple programs).
Trying to shoehorn Rust in as a web scripting language was your second mistake
Your first mistake was to mix Rust, TypeScript and JavaScript just to add logic to your HTML buttons
I swear, things get worse every day on this planet
Dioxus is another: https://dioxuslabs.com/
C# with Avalonia for a different use case: https://avaloniaui.net/
Avalonia solitaire demo: https://solitaire.xaml.live/
Avalonia Visual Basic 6 clone: https://bandysc.github.io/AvaloniaVisualBasic6/
Blazor can run as WebAssembly on the client side if you choose that runtime mode: https://dotnet.microsoft.com/en-us/apps/aspnet/web-apps/blaz...
Beyond the browser, Wasmer does WebAssembly on the serverside: https://wasmer.io/
Fermyon too: https://www.fermyon.com/
Extism is a framework for an application to support WebAssembly plugins: https://extism.org/
If the logic is merely about validation, then an IDL with codegen for TS and some backend language is probably better. There are also some more advanced languages targeting transpilation to both JS and a backend language such as Haxe, but they all have some trade-offs.
> It gets us closer to the web as the universal platform.
As a target
I don't want a pseudo 'universal' platform owned by Big Tech; or by governments as a substitute
Google/Chrome controlled platform, no thanks
How would you make such a thing without limiting it in some such way?
I agree WASM has its drawbacks, but the execution model is mostly fine for these types of tasks, where you offload the task to a worker and are fine waiting a millisecond or two for the response.
The main benefit for complex tasks like the above is that when a product needs to support an isomorphic web and native experience based on complex computation you maintain (quite a few use cases actually, in CAD, graphics & GIS), the implementation and maintenance load is cut in half. I.e. these _could_ be e.g. TypeScript, but then maintaining feature parity becomes _much_ more burdensome.
It's fine and fast enough as long as you don't need to pass complex data types back and forth. For instance WebGL and WebGPU WASM applications may call into JS thousands of times per frame. The actual WASM-to-JS call overhead itself is negligible (in any case, much less than the time spent inside the native WebGL or WebGPU implementation), but you really need to restrict yourself to directly passing integers and floats for 'high frequency calls'.
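A minimal sketch of what such a 'high frequency' shim can look like on the JS side - the handle table and import names are illustrative, not from any particular toolchain:

    // Only numbers cross the boundary: the Wasm side passes handles and
    // floats, and JS resolves the handles to real WebGL objects.
    const uniformLocations: WebGLUniformLocation[] = [];

    function makeGlImports(gl: WebGL2RenderingContext) {
      return {
        gl_uniform1f: (locHandle: number, value: number) =>
          gl.uniform1f(uniformLocations[locHandle], value),
        gl_draw_arrays: (mode: number, first: number, count: number) =>
          gl.drawArrays(mode, first, count),
      };
    }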
Those problems are quite similar to any FFI scenario though (e.g. calling from any high level language into restricted C APIs).
Has been used by most of the Rust web frontend frameworks for years.
It all has to go through JS shims though, limiting the performance potential.
In any case, I would probably define a system which doesn't simply map the DOM API (objects and properties) into a granular set of functions on the WASM side (e.g. granular setters and getters for each DOM object property).
Instead I'd move one level up and build a UI framework where the DOM is abstracted away (quite similar to all those JS frameworks), and where most of the actual DOM work happens in sufficiently "juicy" JS functions (e.g. not just one line of code to set a property).
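For example (an illustrative sketch only, not any framework's API), one import that does a meaningful chunk of DOM work per call, instead of a setter per property:

    // One boundary crossing creates and attaches a whole element,
    // instead of one crossing per property set.
    function createLabeledButton(parentId: string, label: string, className: string): void {
      const parent = document.getElementById(parentId);
      if (!parent) return;
      const button = document.createElement("button");
      button.className = className;
      button.textContent = label;
      parent.appendChild(button);
    }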
Does anybody know why it is such a big problem to add DOM access to wasm?
In the worst case, we should have a second option besides JS (which is not TypeScript - TypeScript is just lipstick on a pig). If wasm is not it, why not something different? Having Dart in would be great.
Well, the article does a pretty good job of answering this specific question ;)
It gives good reasons why we can't have specific parts. Having the JavaScript standard library in WebAssembly would be hard (was anyone actually asking for that?), and some of the modern APIs using promises or iterators wouldn't have a clear mapping. Also, not everything could be zero-copy for every language.
But the article doesn't do a very good job explaining why we can't have some DOM access, at least for the 90% of DOM APIs not using JavaScript-specific features.
Most of the argument boils down to "you shouldn't want direct DOM access, because doing that would be work for these people, and we can instead have those other people do lots of work making the JavaScript bridge less painful. And anyways it's not clear if having these people make proper APIs would actually result in faster code than having those people do a separate sophisticated toolchain for each language"
It reads very much like a people and resource allocation problem rather than a technical challenge
The DOM is a Javascript API, so it uses 100% Javascript-specific features (every DOM manipulation requires accessing JS objects and their properties and lots of those properties and function args are Javascript strings) - none of those map trivially to WASM concepts.
It's a bit like asking why x86 assembly code doesn't allow "C++ stdlib access"; the question doesn't even make much sense ;)
Or is there something in the browser architecture that requires them to be JavaScript objects with the memory layout of the JavaScript engine, rather than just conceptually being objects?
On the one hand, JS DOM objects are IDL-generated wrappers for C++. In theory we can generate more WASM-friendly wrappers.
On the other, the C++ code implementing the API will be tightly coupled to the entire JS type system and runtime. Not just the concept of an object but every single design decision from primitives to generators to dynamic types to prototypical inheritance to error handling...
Also, I believe the C++ DOM implementation itself is pretty tightly integrated with JavaScript and its memory management, e.g. it holds references into the managed heap in order to use JS directly, like EventListeners and JS functions.
Creating a new non-JS DOM API doesn't sound intractable to me... but browsers annihilate my assumptions so it's probably millions of hours of effort and close to a rewrite...
By default maybe, but JS obfuscators exist so not really. Many websites have totally incomprehensible JS even without obfuscators due to extensive use of bundlers and compile-to-JS frameworks.
I expect if WASM gets really popular for the frontend we'll start seeing better tooling - decompilers etc.
I felt something was really lost once css classes became randomised garbage on major sites. I used to be able to fix/tune a website layout to my needs but now it's pretty much a one-time effort before the ids all change.
One of the reasons I’m interested in wasm is to get away from the haphazardly evolved JS ecosystem…
Just a note, but there is burgeoning support for this in "modern" WebAssembly:
https://github.com/bytecodealliance/jco/tree/main/examples/c...
If raw WebIDL binding generation support isn't interesting enough:
https://github.com/bytecodealliance/jco/blob/main/packages/j...
https://github.com/bytecodealliance/jco/blob/main/packages/j...
https://github.com/bytecodealliance/jco/blob/main/packages/j...
Support is far from perfect, but we're moving towards a much more extensible and generic way to support interacting with the DOM from WebAssembly -- and we're doing it via the Component Model and WebAssembly Interface Types (WIT) (the "modern" in "modern" WebAssembly).
What's stopping us the most from being very effective in browsers is the still-experimental browser shim for components in Jco specifically. This honestly shouldn't be blocking us at this point but... It's just that no one has gotten around to improving and refactoring the bindings.
That said, the support for DOM stuff is ready now (you could use those WIT interfaces and build DOM manipulating programs in Rust or TinyGo or C/C++, for example).
P.S. If you're confused about what a "component" is or what "modern" WebAssembly means, start here:
https://component-model.bytecodealliance.org/design/why-comp...
If you want to dive deeper:
[*] Yeah, the toolchains help solve this a bit, but it still makes me ship JS and wasm side-by-side.
You mean like a list of JS functions that are imported into the Wasm binary? This has been there since day one:
    (module
      ...
      (import "env" "foo" (func (;0;) (type 1)))
      (import "env" "bar" (func (;1;) (type 2)))
      ...
    )
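On the JS side those imports get supplied at instantiation time. A minimal sketch, assuming the module is fetched from "module.wasm" and that `foo`/`bar` take numeric arguments (the actual types aren't shown above):

    // in an ES module; the host supplies the "env" functions the binary imports
    const imports = {
      env: {
        foo: (x: number) => console.log("foo:", x),
        bar: (x: number, y: number) => x + y,
      },
    };

    const { instance } = await WebAssembly.instantiateStreaming(
      fetch("module.wasm"),
      imports,
    );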
> Everyone having to write all their own glue[*] is just nuts at this point.

Did you mean for the specific programming language you use? If so then that seems like a problem for the language implementor, not a problem with Wasm. Rust has wasm-bindgen, Emscripten has its thing, and so on.
Would it be nice? Yes. But.
Every added feature is a trade-off between need -vs- outlay, overhead, complexity & other drawbacks. In order to justify the latter things, that "need" must be significant enough. I'd like to have DOM, but I don't feel the need is significant.
Some thoughts on use-cases:
1. "Inactive" or "in-instance" DOM APIs for string parsing, document creation, in-memory node manipulation, serialisation: this is all possible today in WASM with libraries. Having it native might be cool but it's not going to be a significantly different experience. The benefits are marginal here.
2. "Live / active" or "in-main-thread" direct access APIs to manipulate rendered web documents from a WASM instance - this is where the implementation details get extremely complex & the security surface area starts to really widen. While the use-cases here might be a bit more magical than in (1), the trade-offs are much more severe. Even outside of security, the prospect of WASM code "accidently" triggering paints, or slow / blocking main thread code hooked on DOMMutation events is a potential nightmare. Trade-offs definitely not worth it here.
Besides, if you really want to achieve (2), writing an abstraction to link main-thread DOM APIs to WASM postMessage calls isn't a big lift & serves every reasonable use-case I can think of.
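A minimal sketch of that bridge, assuming the Wasm instance runs in a worker; the message shape and the `setText` operation are made up for illustration:

    // main thread: perform DOM work requested by the worker
    const worker = new Worker("wasm-worker.js");
    worker.onmessage = (e: MessageEvent<{ op: "setText"; id: string; value: string }>) => {
      if (e.data.op === "setText") {
        const el = document.getElementById(e.data.id);
        if (el) el.textContent = e.data.value;
      }
    };

    // worker side (wasm-worker.js): exposed to the Wasm instance as an import
    function domSetText(id: string, value: string): void {
      self.postMessage({ op: "setText", id, value });
    }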
Wasm is the perfect example of this - it has the potential to revolutionize web (and desktop GUI) development, but it hasn't progressed beyond niche single-threaded use cases in basically 10 years.
Oh, and breaking changes between versions meaning you needed multiple runtimes and still got weird issues in some cases.
It's not just applets; we also had Flash, which was a huge success until it was suddenly killed.
As far as I can tell, the difference between java applets and Flash is that you, the user, have to install java onto your system to use applets, whereas to use Flash you have to install Flash into your browser. I guess that might explain why one became more popular than the other.
It was eventually killed because Apple decided it wouldn't support it on the iPhone.
Crypto miners weren’t a thing for Java applets
Nothing "vague" or "somehow" about that.
Applets were insecure because A) they were based on the Netscape browser plugin API, which had a huge attack surface, and B) they ran in a normal JVM with a standard API that offers full system access, restricted by a complex sandbox mechanism which again had a huge attack surface.
This IS, in fact, not an issue for wasm, since A) as TFA describes, it has by default no access at all to the JavaScript browser API and has to be granted that access explicitly for each function, and B) the JavaScript browser API has extremely restricted access to OS functionality to begin with. There simply is no API at all to access arbitrary files, for example.
WASM sandboxes the entire VM, a safer model. Java ran trusted and untrusted code in the same VM.
Flash, while using the whole-VM confinement model, simply had too many "boring" exploits, like buffer overflows and so on, and was too much of a risk to keep using. While technically nothing prevented Flash from being safe, it was Adobe's proprietary code, Adobe didn't make it safe, and no one else was allowed to.
Java applets loading on a website started as a gray rectangle, which loaded very slowly, and sometimes failed to initialize with an "uninited" error. Whenever you opened a website with a java applet (like could happen with some math or physics related ones), you'd go "sigh" as your browser's UI thread itself halted for a while
Flash applets loading on a website started as a black rectangle, did not cause the UI thread to halt, loaded fast, and rarely gave an error
(the only reason I mention the gray vs black rectangle is because seeing a gray rectangle on a website made me go "sigh")
JavaScript was not yet optimized but the simple JS things that worked, did work without loading time.
Runescape (a 3d MMORPG from the early 2000s that still exists) used Java though and somehow they managed to use it properly since that one never failed to load and didn't halt the browser's UI either despite being way more complex than any math/physics Java applet demo. So if Java forced their applets to do whatever Runescape was doing so correctly, they'd not have had this perception issue...
Javascript from 20 years ago tends to run just fine in a contemporary browser.
Generalized Assembly? GASM?
I’ve personally felt like it has been progressing, but I’m hoping you can expand my understanding!
Before WASM, the options were:
- require everyone to install an app to see visualizations
- just show canned videos of visualizations
- write and maintain a parallel Javascript version
Demo at https://throbol.com/sheet/examples/humanoid_walking.tb
Low memory usage and low CPU demand may not be a requirement for all websites because most are simple, but there are plenty of cases where JavaScript/TypeScript is objectively the wrong language to be using.
Banking apps, social network sites, chat apps, spreadsheets, word processors, image processors, jira, youtube, etc
Something as simple as multithreading is enough to take an experience from "treading water" to "runs flawlessly on an 8 year old mobile device". Accurate data types are also very valuable for finance applications.
Another use case is sharing types between the front and back end.
But CRUD developers don’t know/care about those, I guess.
This subject is interesting because your typical college-educated developer HATES the DOM with extreme passion, because it’s entirely outside their area of comfort. The typical college-educated developer is trained to program in something like Java, C#, or C++, and believes that is how the world is supposed to work. The DOM doesn’t work like that. It’s a graph of nodes in the form of a tree, and many developers find that to be scary shit. That’s why we have things like jQuery, Angular, and React.
These college-educated developers also hate JavaScript for the same reasons. It doesn’t behave like Java. So for many developers the only value of WASM is as a JavaScript replacement. WASM was never intended or positioned to be a JavaScript replacement, so it doesn’t get used very often.
Think about how bloated and slow the web could become if WASM were a JavaScript replacement. Users would have to wait for the entire runtime and its dependencies to download into the WASM sandbox and then open like a desktop application, but then all that would get wrapped in something like Angular or React because the DOM is still scary.
Maybe we should stop overdesigning things and keep it simple. WASM needs more tooling around primitive types, threading, and possibly a more flexible memory layout than what we have now.
I ended up having to rewrite the entire interfacing layer of my mobile application (which used to be WebAssembly running in WebKit/Safari on iOS) because I was getting horrible performance losses each time I crossed that barrier. For graphics applications where you have to allocate and pass buffers or in general piping commands, you take a horrible hit. Firefox and Chrome on Windows/macOS/Linux did quite well, but Safari...
Everything has to pass the JavaScript barrier before it hits the browser. It's so annoying!
The article also discussed ref types, which do exist and do provide... something. Some ability to at least refer to host objects. It's not clear what that enables or what its limitations are.
Definitely some feeling of being rug-pulled in the shift here. It felt like there was a plan for good integration, but fast forward half a decade+ and there's been so, so much progress and integration, yet it's still unclear how WebAssembly is going to alloy the web; it seems like we have reams of generated glue code doing so much work to bridge the systems.
Very happy that Dan at least checked in here, with a state-of-the-wasm-for-web-people type post. It's been years of waiting and wondering, and I've been keeping my own tabs somewhat through the twists and turns, but having some historical artifact, some point-in-time recap to go look at like this: it's really crucial for the health of a community to have some check-ins with the world, to let people know what to expect. Particularly for the web, wasm has really needed an updated State of WebAssembly on the Web.
I wish I felt a little better though! Jco is amazing, but running a JS engine in wasm to be able to use wasm components is gnarly as hell. Maybe by 2030 wasm & wasm components will be doing well enough that browsers will finally rejoin the party & start implementing the new stuff.
Definitely feeling rug-pulled.
What I think all the people who harp on "Don't worry, going through JS is good enough for you" are missing is the subtext of their message. They might objectively be right, but in the end what they are saying is that they are content with WASM being a second-class citizen in the web world.
This might be fine for everyone needing a quick and dirty solution now, but it is not the kind of narrative that draws in smart people to support an ecosystem in the long run. When you bet, you bet on the rider and not the domestique.
Tbh, most of the ideas so far to enable more direct access of Javascript APIs from WASM have a good chance of ruining WASM with pointless complexity.
Keeping those two worlds separate, but making sure that 'raw' calls between WASM and JS are as fast as they can be (which they are) is really the best longterm solution.
I think what people need to understand is that the idea of having 'pure' WASM browser applications which don't involve a single line of Javascript is a pipe dream. There will always be some sort of JS glue code, it might be generated and you don't need to directly deal with it, but it will still be there, and that's simply because web APIs are first and foremost designed for usage from Javascript.
Some web APIs have started to 'appease' WASM more by adding 'garbage free' function overloads, which IMHO is a good thing because it may help to reduce overhead on the JS side, but this takes time and effort to be implemented in all browsers (and, most importantly, a willingness by mostly "JS-centric" web people to add such helper functions, which mostly only benefit WASM).
I wish it was possible to disable WASM in browsers.
And JSPI has been a standard since April and is available in Chrome >= 137. I think JSPI is the greatest step forward for WebAssembly in the browser ever. Just need Firefox and Safari to implement it...