WebXR Hand Tracking

I'm working on the Hand Tracking API for WebXR.
This is currently (26.07.20) available as an experimental feature on Oculus Quest, with the Hand Tracking flag enabled in chrome://flags.

The API gives developers access to tracked hands so they can render them and build interactions on top: physics interactions, UI input, gestures, etc.
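For context, the underlying WebXR Hand Input API exposes each hand as a map-like collection of named joints whose poses can be queried every frame. A minimal sketch (joint names come from the WebXR Hand Input spec; `referenceSpace` is assumed to come from your session setup, and the 2 cm pinch threshold is just a guess to tune):

```javascript
// Sketch: reading hand joint poses with the WebXR Hand Input API.
// Assumes an active immersive XRSession and a `referenceSpace` obtained
// earlier via session.requestReferenceSpace('local').
function onXRFrame(time, frame) {
  for (const inputSource of frame.session.inputSources) {
    if (!inputSource.hand) continue; // a controller, not a tracked hand

    // XRHand is map-like: joint name -> XRJointSpace
    const indexTipSpace = inputSource.hand.get('index-finger-tip');
    const thumbTipSpace = inputSource.hand.get('thumb-tip');

    const indexPose = frame.getJointPose(indexTipSpace, referenceSpace);
    const thumbPose = frame.getJointPose(thumbTipSpace, referenceSpace);
    if (indexPose && thumbPose &&
        isPinching(indexPose.transform.position, thumbPose.transform.position)) {
      console.log(`${inputSource.handedness} hand is pinching`);
    }
  }
  frame.session.requestAnimationFrame(onXRFrame);
}

// Pure helper: treat thumb tip and index tip closer than ~2 cm as a pinch.
function isPinching(a, b, threshold = 0.02) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz) < threshold;
}
```

The joint-pose queries can return `null` while tracking is lost, so handlers should always null-check before using a pose.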

Here is a tweet, retweets appreciated:


Here is a PR: https://github.com/playcanvas/engine/pull/2316
Once merged, it adds a new example for hand rendering.

Any XR app that uses Ray and Select events will just work without extra code; additional code is only needed for hand rendering and advanced hand interactions (gestures, etc.).
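To illustrate why no extra code is needed: in PlayCanvas, a `select` handler on XR input receives the same event whether the ray comes from a controller trigger or a hand pinch. A sketch (the handler body is a placeholder; `getOrigin`/`getDirection` are the PlayCanvas input-source ray accessors, hedged here as my understanding of the API):

```javascript
// Sketch: one select handler that serves both controllers and tracked hands.
// `app` is the pc.Application instance of an existing PlayCanvas project.
function registerSelectHandler(app) {
  app.xr.input.on('select', (inputSource) => {
    // The targeting ray is abstracted: a controller's pointer ray and a
    // hand's pinch ray arrive through the same accessors.
    const origin = inputSource.getOrigin();
    const direction = inputSource.getDirection();
    // ... raycast against your scene or UI with origin + direction here
    console.log('select from', inputSource.handedness, 'input source');
  });
}
```

Because the event and ray shape are identical for both source types, code written for controllers keeps working when the user switches to hands.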


This is so rad! Can you explain "Once merged"? I'm not familiar.


It means “coming soon”.

Yay! It is released: https://github.com/playcanvas/engine/releases/tag/v1.33.0
If you have an Oculus Quest, enable hand tracking in chrome://flags, and in the device settings enable automatic hand detection.
Then, using controllers in the Browser, open this build: https://playcanv.as/p/VmHVW3Wb/ and switch to hands (either automatically, by putting the controllers down and raising your hands, or by going into the Oculus quick settings and switching there).
Source for the hand-tracking project: https://playcanvas.com/project/705931/overview/webxr-hands