WebXR Hand Tracking

I'm working on the Hand Tracking API for WebXR.
This is currently (26.07.20) available as an experimental feature on Oculus Quest, with Hand Tracking enabled in chrome://flags.

The API gives developers access to hand tracking data so they can render hands and build interactions on top of them, such as physics interactions, UI interaction and gestures.
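As a rough sketch of what consuming that data looks like at the WebXR level (not code from the engine PR; joint names follow the current Hand Input spec, and the early 2020 experimental builds used numeric joint constants instead), each tracked hand exposes joint poses per frame, which is enough to detect a simple pinch:

```javascript
// Pure helper: detect a pinch from two joint positions.
// The 15 mm threshold is an assumption, not from the original post.
function isPinching(thumbTip, indexTip, threshold = 0.015) {
    const dx = thumbTip.x - indexTip.x;
    const dy = thumbTip.y - indexTip.y;
    const dz = thumbTip.z - indexTip.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz) < threshold;
}

// Per-frame wiring inside an XRSession requestAnimationFrame callback:
function onXRFrame(time, frame, refSpace) {
    for (const source of frame.session.inputSources) {
        if (!source.hand) continue; // a controller, not a tracked hand
        const thumb = frame.getJointPose(source.hand.get('thumb-tip'), refSpace);
        const index = frame.getJointPose(source.hand.get('index-finger-tip'), refSpace);
        if (thumb && index &&
            isPinching(thumb.transform.position, index.transform.position)) {
            // e.g. treat the pinch as a select / click
        }
    }
}
```

The browser-side wiring only runs inside an active XR session; the `isPinching` helper is plain math and works anywhere.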

Here is a tweet, retweets appreciated:


Here is a PR: https://github.com/playcanvas/engine/pull/2316
Once merged, it adds a new example for hand rendering.

Any XR app that already uses Ray and Select events will just work without any extra code; code is only needed for rendering hands and for advanced hand interactions such as gestures.
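To illustrate why no extra code is needed: a tracked hand shows up as a regular XR input source, so the same select handler fires for a pinch as for a controller trigger. A hedged sketch (the event wiring assumes PlayCanvas's `app.xr.input` 'select' event; the sphere hit test and `button` object are hypothetical):

```javascript
// Pure helper: does a ray hit a sphere? (hypothetical UI hit test)
function rayHitsSphere(origin, dir, center, radius) {
    // Vector from ray origin to sphere center
    const ox = center.x - origin.x, oy = center.y - origin.y, oz = center.z - origin.z;
    // Projection of that vector onto the (unit) ray direction
    const t = ox * dir.x + oy * dir.y + oz * dir.z;
    // Closest point on the ray to the sphere center
    const cx = origin.x + dir.x * t, cy = origin.y + dir.y * t, cz = origin.z + dir.z * t;
    const dx = center.x - cx, dy = center.y - cy, dz = center.z - cz;
    return t >= 0 && dx * dx + dy * dy + dz * dz <= radius * radius;
}

// PlayCanvas-style wiring: identical code path for controllers and hands.
// app.xr.input.on('select', (inputSource) => {
//     const origin = inputSource.getOrigin();
//     const dir = inputSource.getDirection();
//     if (rayHitsSphere(origin, dir, button.getPosition(), 0.1)) {
//         button.fire('click');
//     }
// });
```

The handler never asks whether the input source is a hand or a controller, which is exactly why existing Ray/Select apps keep working.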


This is so rad! Can you explain 'Once merged'? I'm not familiar with it.


It means “coming soon”.