Most XR tracking stacks are fragmented or tied to proprietary hardware, so this project explores a unified, extensible system that gives developers low-level control without relying on closed ecosystems.
Right now, the system has a working eye tracking implementation, with SLAM tracking actively in development. The goal is to make the stack modular enough that it can be adapted to or integrated into different headset setups with minimal overhead for developers.
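Since the stack itself isn't public yet, here's a rough sketch of the kind of module interface we're aiming for. All names here (TrackerModule, TrackingStack, Pose) are illustrative, not the actual API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    position: tuple   # (x, y, z) in meters
    rotation: tuple   # quaternion (x, y, z, w)

class TrackerModule(ABC):
    """One tracking capability (eye, SLAM, ...) behind a shared interface,
    so a headset integration only registers the modules it supports."""

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def poll(self) -> Optional[Pose]:
        """Return the newest sample, or None if nothing new arrived."""

class TrackingStack:
    """Registry that the SDK layer (e.g. Unity/Godot bindings) talks to."""

    def __init__(self) -> None:
        self._modules = {}

    def register(self, name: str, module: TrackerModule) -> None:
        self._modules[name] = module
        module.start()

    def poll(self, name: str) -> Optional[Pose]:
        return self._modules[name].poll()
```

The idea is that supporting a new headset means implementing one module per capability, rather than forking the whole stack.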
This is being built by a small R&D team at Walker Industries alongside EOZVR, working across perception, tracking, and tooling. I’m currently focused on the Unity/Godot SDK and the companion application layer. The eye tracking system is primarily developed by John, with collaboration across the broader tracking stack. It's been a long road getting things to work, but it's been very rewarding!
We also plan to build a dataset from sessions recorded with a small group of paid volunteers across multiple headset models, once the pipeline stabilizes.
Eye tracking demo: https://www.youtube.com/watch?v=QlfCfkzkBB4
jtxt•4h ago
WalkerDev•4h ago
Our goal is to record around 1,200 minutes of training footage from 40 people, which we can then use to train on some serious hardware and make things more robust! We're planning on around 5 headset models, with a demographic spread varied enough to cover groups these solutions often ignore: Chinese and Japanese users (really anyone with monolids), plus both pale and dark skin tones.
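For anyone doing the math, that works out to roughly 30 minutes per participant, about 6 minutes on each headset model. A quick sketch using the numbers above; the metadata fields are hypothetical, not our actual pipeline schema:

```python
from dataclasses import dataclass

@dataclass
class SessionMeta:
    participant_id: str   # anonymized
    headset_model: str    # one of the ~5 models
    eyelid_type: str      # e.g. "monolid" / "double", self-reported
    skin_tone: str        # to track coverage across pale and dark skin
    minutes: float

participants = 40
total_minutes = 1200
headset_models = 5

per_participant = total_minutes / participants   # 30 min each
per_model = per_participant / headset_models     # ~6 min per model
print(f"{per_participant:.0f} min/participant, {per_model:.0f} min/model")
```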