reena_signalhq•1h ago
This is really impressive! Running PyTorch models in the browser without ONNX export is a game-changer for web-based ML apps.
Quick question: What's the performance like compared to native PyTorch? Are there specific model types that work better than others on WebGPU?
Would love to try this with some smaller models I've been working with!
yu3zhou4•11m ago
Performance is not there yet, honestly. I haven't focused on making things fast, but it's one of the next steps I'll take. The problem with running it in a browser is that we'd need a way to run PyTorch itself in the browser, and to the best of my knowledge that doesn't exist yet. I'm looking into closing that gap, so feel welcome to reach out to collaborate if you're interested!