Animated WebXR input?


I hope you guys are well!

I’m playing with your hand/input tutorials and I’m not sure how to hook up the button animations on the controller models.

Is it supported at all?

Thanks for the support!


I’ve just tried this with Quest 1.

Hand tracking with animation works fine.
The controller models, from what I can see, are not supposed to animate in that project.

If you are looking to have an animated hand model around the controller when using the controller input, that’s not supported in the project.

Hello Yaustar.

I’m not looking for the animated hand but for the controller itself: the buttons, the trigger, etc. I’m wondering how to connect that to PlayCanvas.

Thanks for the support

Looking at what you have linked, it looks like they find the node that represents the button/stick on the model and animate it directly by tweening/modifying its transform.

So you would get the state of the buttons and sticks and rotate/translate the nodes accordingly.

It looks like you can get that state from the gamepad property and events on XrInputSource:

An XrInputSource object is provided when a controller is detected.

Alternatively, you can get the list of input sources from XrInput, which is available on the application object via app.xr.input.
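Putting those pieces together, here is a minimal sketch of the polling approach. The node name, the angle constant and the `triggerToAngle` helper are all hypothetical; in the standard WebXR gamepad mapping, `buttons[0]` is the trigger, but inspect your own model and gamepad to confirm.

```javascript
// Sketch: map a trigger value (0..1) to a node rotation, assuming the
// glTF controller model exposes a node for the trigger. The max angle
// is an assumption - tune it to match your model.
const TRIGGER_MAX_ANGLE = -20; // degrees at full pull (assumed)

// Pure helper: clamp a button value to 0..1 and convert it to degrees.
function triggerToAngle(value) {
    return TRIGGER_MAX_ANGLE * Math.min(Math.max(value, 0), 1);
}

// In a PlayCanvas script, read the state every frame:
// this.app.on('update', () => {
//     this.app.xr.input.inputSources.forEach((inputSource) => {
//         const gamepad = inputSource.gamepad;
//         if (!gamepad) return;
//         // buttons[0] is the trigger in the 'xr-standard' mapping
//         const angle = triggerToAngle(gamepad.buttons[0].value);
//         triggerNode.setLocalEulerAngles(angle, 0, 0); // triggerNode: hypothetical
//     });
// });
```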



They talk about calling an update every frame:

What would be the equivalent of motionController in PlayCanvas?

Thank you!

@max will know more but AFAIK, that’s not needed here. Your animation code can just read the state and/or listen for events on the XrInputSource.
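As a sketch of the event-driven half: XrInputSource fires squeezestart/squeezeend (and the select equivalents), so a pressed/released state can be tracked without polling every frame. The `createButtonState` helper below is hypothetical, just a tiny wrapper you would drive from those events.

```javascript
// Hypothetical helper: track one button's pressed state, updated from
// XrInputSource events rather than per-frame polling.
function createButtonState() {
    let pressed = false;
    return {
        press()     { pressed = true; },
        release()   { pressed = false; },
        isPressed() { return pressed; }
    };
}

// Wiring it up in PlayCanvas (XrInputSource really does fire these events):
// this.app.xr.input.on('add', (inputSource) => {
//     const grip = createButtonState();
//     inputSource.on('squeezestart', () => grip.press());
//     inputSource.on('squeezeend', () => grip.release());
//     // drive the grip-button node's transform from grip.isPressed()
// });
```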


OK then! I’ll give it a try.

Thank you

Hi @memleak, the models in the example above are provided by the WebXR community and do have individual button nodes. You would need to inspect the model, either by using the container asset import settings in a project or by using the PlayCanvas glTF Viewer, so you can find the right node for each button.
Then you can access each axis and button state through the gamepad property on the input source:
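For example, a hedged sketch of reading a thumbstick through that gamepad property. In the 'xr-standard' mapping the thumbstick is usually axes[2]/axes[3] (axes[0]/axes[1] are the touchpad); check `inputSource.gamepad.mapping` on your device. The `readThumbstick` helper and `stickNode` are assumptions, not engine API.

```javascript
// Hypothetical helper: read thumbstick axes from an XrInputSource's
// standard Gamepad object, with a dead zone so drift doesn't move the model.
function readThumbstick(gamepad, deadZone) {
    const x = gamepad.axes[2] || 0;
    const y = gamepad.axes[3] || 0;
    return {
        x: Math.abs(x) > deadZone ? x : 0,
        y: Math.abs(y) > deadZone ? y : 0
    };
}

// Usage in a PlayCanvas update loop (stickNode and tilt angles assumed):
// const pad = inputSource.gamepad;
// if (pad) {
//     const stick = readThumbstick(pad, 0.1);
//     stickNode.setLocalEulerAngles(stick.y * 15, 0, -stick.x * 15);
// }
```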