The goal was over-the-board (OTB) realism. It’s a physics-based sandbox where you manually drag pieces, move the rook for castling, and clear captured pieces yourself. You can even knock them over. The original idea came from my friend, Drew Olbrich (lunarskydiving.com), who built an SGI-based version of this concept at PDI in the '90s.
I spent three decades writing rendering code at PDI/DreamWorks and NVIDIA. I wrote the rendering pipeline from scratch in WebGPU. It defaults to an optimized rasterized "fast path," but if you have a good GPU, you can enable a real-time path tracer in the settings. I implemented this using a 4-layer depth-peeled G-Buffer and Hierarchical Z-Buffer DDA for ray marching, supporting multi-bounce GI and environment map importance sampling.
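To give a flavor of the environment map importance sampling, here's a minimal sketch (not the app's actual code, and on the CPU rather than in a shader): build a CDF over per-texel luminance so bright texels, like the sun in an HDRI, are sampled in proportion to their contribution.

```typescript
// Hypothetical sketch: luminance-proportional sampling of env map texels.

// Build a normalized cumulative distribution over texel luminances.
function buildLuminanceCdf(luminance: number[]): number[] {
  const cdf: number[] = [];
  let sum = 0;
  for (const l of luminance) {
    sum += l;
    cdf.push(sum);
  }
  return cdf.map((c) => c / sum);
}

// Binary-search the CDF with a uniform random u in [0, 1), returning the
// chosen texel index. Sampling probability is proportional to luminance;
// the renderer would then divide the texel's radiance by that probability
// (the standard importance-sampling weight).
function sampleCdf(cdf: number[], u: number): number {
  let lo = 0;
  let hi = cdf.length - 1;
  while (lo < hi) {
    const mid = (lo + hi) >> 1;
    if (cdf[mid] < u) lo = mid + 1;
    else hi = mid;
  }
  return lo;
}
```

In the real renderer this table would live on the GPU (typically as a 2D marginal/conditional pair over the equirectangular map), but the selection logic is the same.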
The scene is lit entirely by a single HDRI environment map, with no local light sources. Since I’m a programmer and not a lighting artist, I’ve exposed all material settings for you to tweak and share.
For multiplayer, I used WebRTC (via PeerJS) to avoid central server lag. The app integrates with Lichess.org so you can challenge your existing friends, or you can play against a local Stockfish web worker. Each client runs its own rigid body simulation to keep the physics responsive while the logical game state stays synced.
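The split between local physics and synced game state can be sketched like this (an illustrative message shape, not the app's actual protocol): only sequence-numbered logical moves cross the PeerJS data channel, and stale or duplicate messages are dropped so both clients converge on the same move list while each simulates its own physics.

```typescript
// Assumed message shape for illustration, e.g. { seq: 3, uci: "e2e4" }.
interface MoveMsg { seq: number; uci: string }

interface GameState { seq: number; moves: string[] }

// Pure reducer: apply a remote move only if it is the next expected one;
// anything else (duplicate, out of order) leaves the state unchanged.
function applyRemoteMove(state: GameState, msg: MoveMsg): GameState {
  if (msg.seq !== state.seq + 1) return state;
  return { seq: msg.seq, moves: [...state.moves, msg.uci] };
}
```

In practice a PeerJS `conn.on('data', ...)` handler would feed this reducer, and each accepted move would snap the corresponding piece in the local rigid body simulation.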
The app requires WebGPU (Chrome 113+, Edge 113+, or Safari 17.4+). Tested on the latest Windows, macOS, iOS, and Android. It might take several seconds to load the graphics resources if you have a slow network connection.
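If you want to check support before the page loads anything heavy, a capability probe looks roughly like this (a sketch; the `gpu` object is injected so the logic is testable outside a browser, where you'd pass `navigator.gpu`):

```typescript
// Minimal shape of navigator.gpu that this check needs.
interface GpuLike { requestAdapter(): Promise<unknown | null> }

async function detectWebGPU(gpu: GpuLike | undefined): Promise<boolean> {
  if (!gpu) return false; // browser exposes no WebGPU at all
  try {
    // requestAdapter can resolve to null (e.g. blocklisted driver).
    return (await gpu.requestAdapter()) !== null;
  } catch {
    return false;
  }
}
```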
I’d love to hear how the performance holds up on your hardware, as well as suggestions for future features, such as refractions, spectator mode, or annotations.