Working on the Hand Tracking API for WebXR.
As of now (26.07.20) it is available as an experimental feature on Oculus Quest, with the Hand Tracking flag enabled in chrome://flags.
The API gives developers access to hand data so they can render hands and build any convenient operations on top of them: physics interactions, UI input, gestures, etc.
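As a rough sketch of what that access looks like, assuming the joint-map shape of the WebXR Hand Input draft spec (`inputSource.hand` iterable as `[jointName, jointSpace]` pairs, poses read via `frame.getJointPose()`). `drawJoint` here is a hypothetical renderer hook, and the frame loop itself only runs inside a browser XR session started with the `hand-tracking` feature:

```javascript
// Sketch only: assumes the maplike XRHand from the WebXR Hand Input draft.
// A session with hand tracking would be requested roughly like this:
//   navigator.xr.requestSession('immersive-vr',
//     { optionalFeatures: ['hand-tracking'] });

// Pure helper: does this input source expose tracked-hand data
// (as opposed to a regular controller)?
function isTrackedHand(inputSource) {
  return Boolean(inputSource && inputSource.hand);
}

function onXRFrame(time, frame, referenceSpace) {
  for (const inputSource of frame.session.inputSources) {
    if (!isTrackedHand(inputSource)) continue; // a controller, skip it
    for (const [jointName, jointSpace] of inputSource.hand) {
      const pose = frame.getJointPose(jointSpace, referenceSpace);
      if (pose) {
        // pose.transform: joint position/orientation;
        // pose.radius: approximate joint radius, handy for both
        // rendering and physics colliders.
        drawJoint(jointName, pose.transform, pose.radius); // hypothetical
      }
    }
  }
}
```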
Any XR app that already uses ray and select events will just work with hands, no extra code required; extra code is only needed for hand rendering and advanced hand interactions such as gestures.
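For the advanced-interactions part, a gesture like pinch boils down to comparing joint positions. A minimal sketch: the distance math is pure, and in a real session the two positions would come from `frame.getJointPose()` on the `'thumb-tip'` and `'index-finger-tip'` joints. The 0.015 m threshold is an assumption to tune, not a spec value:

```javascript
// Euclidean distance between two {x, y, z} points (meters in XR space).
function distance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Pinch heuristic: fingertips closer than ~1.5 cm count as a pinch.
// The threshold is a guess; real apps would tune it per device/user.
function isPinching(thumbTipPos, indexTipPos, threshold = 0.015) {
  return distance(thumbTipPos, indexTipPos) < threshold;
}

// Fingertips 1 cm apart → pinching; 10 cm apart → not:
// isPinching({ x: 0, y: 0, z: 0 }, { x: 0.01, y: 0, z: 0 }) → true
// isPinching({ x: 0, y: 0, z: 0 }, { x: 0.1, y: 0, z: 0 })  → false
```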