WebXR Hand Tracking

Working on a Hand Tracking API for WebXR.
This is currently (26.07.20) available as an experimental feature on Oculus Quest, with the Hand Tracking flag enabled in chrome://flags.

The API gives developers access to tracked hands, so they can render them and implement interactions: physics, UI input, gestures, etc.
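
To give a rough idea of the shape of the API, here is a minimal sketch that renders a small sphere at every hand joint. It assumes the PlayCanvas API added in the PR linked below (`app.xr.input` events, and `inputSource.hand` exposing a `joints` list with `getPosition()`/`getRotation()`); exact names may differ from the released version.

```javascript
// Minimal sketch: one debug sphere per tracked hand joint.
// Assumes inputSource.hand / hand.joints from the hand tracking PR.
app.xr.input.on('add', (inputSource) => {
    if (!inputSource.hand) return; // a controller, not a hand

    // Create a small sphere entity for each joint.
    const spheres = inputSource.hand.joints.map(() => {
        const entity = new pc.Entity();
        entity.addComponent('model', { type: 'sphere' });
        entity.setLocalScale(0.01, 0.01, 0.01);
        app.root.addChild(entity);
        return entity;
    });

    // Follow the joint poses every frame.
    app.on('update', () => {
        inputSource.hand.joints.forEach((joint, i) => {
            spheres[i].setPosition(joint.getPosition());
            spheres[i].setRotation(joint.getRotation());
        });
    });
});
```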

Here is a PR: https://github.com/playcanvas/engine/pull/2316
Once merged, it adds a new example of hand rendering.

Any XR app that uses Ray and Select events will just work, without any extra code; code is only needed for hand rendering and advanced hand interactions (gestures, etc.).
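
As a sketch of why no extra code is needed: a `select` handler written for controllers fires for hands as well (triggered by a pinch), and the same ray accessors apply. This assumes PlayCanvas's `app.xr.input` events, `getOrigin()`/`getDirection()` on the input source, and the rigidbody system's `raycastFirst`:

```javascript
// The same handler serves controllers (trigger) and hands (pinch gesture).
app.xr.input.on('select', (inputSource) => {
    const origin = inputSource.getOrigin();
    const direction = inputSource.getDirection();

    // Example use of the ray: pick the first physics body it hits.
    const end = direction.clone().mulScalar(100).add(origin);
    const result = app.systems.rigidbody.raycastFirst(origin, end);
    if (result) {
        console.log('selected:', result.entity.name);
    }
});
```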

This is so rad! Can you explain "Once Merged"? I'm not familiar.

It means "coming soon".

Yay! It is released: https://github.com/playcanvas/engine/releases/tag/v1.33.0
If you have an Oculus Quest, enable Hand Tracking in chrome://flags and enable automatic hand detection in the device settings.
Then, using controllers, open this build in the browser: https://playcanv.as/p/VmHVW3Wb/ and switch to hands (either automatically, by putting the controllers down and presenting your hands, or via the Oculus quick settings).
Source for the hand tracking project: https://playcanvas.com/project/705931/overview/webxr-hands
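
If you want to react to the switch in your own app: when the user puts the controllers down, the controller input sources are removed and hand input sources are added, so you can tell them apart by checking for `hand`. A sketch, assuming the `add`/`remove` events on `app.xr.input`:

```javascript
// Input sources are swapped when the user switches between controllers and hands.
app.xr.input.on('add', (inputSource) => {
    const kind = inputSource.hand ? 'hand' : 'controller';
    console.log(`${kind} added: ${inputSource.handedness}`);
});

app.xr.input.on('remove', (inputSource) => {
    console.log(`input source removed: ${inputSource.handedness}`);
});
```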

Oh my goodness, this is amazing. Thank you so much, Moka <3

Hey Moka! I have this working with the Quest 2, but do you know if it will be compatible with the Vive Focus 3?

If it supports the WebXR hand tracking API, then it should work.
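
At the raw WebXR level, the check looks like this sketch: request `hand-tracking` as an optional session feature and see whether any input source exposes a `hand` object. A headset whose browser implements this (as the Quest's does) should behave the same way:

```javascript
// Sketch: detect hand tracking support in plain WebXR.
navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking']
}).then((session) => {
    session.addEventListener('inputsourceschange', (evt) => {
        for (const inputSource of evt.added) {
            if (inputSource.hand) {
                console.log(`${inputSource.handedness} hand is tracked`);
            }
        }
    });
});
```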
