There is clearly increasing interest in VR and PlayCanvas, the WebVR spec is improving daily, and I think PlayCanvas is uniquely placed to attract developers provided it's production ready. The HTC Vive controllers go beyond the simplicity of the current Gamepad API (although that is definitely the logical place for this to live), and I find myself having to build my own boilerplate code to access the controllers. The Rift's controllers are only a short time away, and more advanced controllers are around the corner. Not only do we need the usual functionality like buttons, but there are also the touch-sensitive surfaces and the need to show the controllers in 3D space, mapped to models driven by the controller's position and orientation. There is quite a bit to do.
P.S. I also think the tutorial should cover gamepad input like it covers keyboard and mouse, for completeness.
After some study, it looks like the pose-tracking side of the gamepad functionality only needs two things:
gamepad.pose.position
and
gamepad.pose.orientation
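For reference, this is roughly the raw boilerplate I mean for getting at those values today (a quick sketch assuming the browser exposes the experimental pose member from the Gamepad Extensions; findPoseGamepad is just my own helper name):

// find the first connected gamepad that reports a tracked pose (e.g. a Vive wand)
function findPoseGamepad() {
    var pads = navigator.getGamepads();
    for (var i = 0; i < pads.length; i++) {
        var pad = pads[i];
        if (pad && pad.pose && pad.pose.hasPosition && pad.pose.hasOrientation) {
            return pad;
        }
    }
    return null;
}

var gamepad = findPoseGamepad();
if (gamepad) {
    var position = gamepad.pose.position;       // Float32Array [x, y, z]
    var orientation = gamepad.pose.orientation; // Float32Array quaternion [x, y, z, w]
}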
Following the same pattern that you currently use, the PlayCanvas API might look something like:
getPosition(index)
getOrientation(index)
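Purely to illustrate the kind of call site I'm imagining (none of this exists in the engine yet, and I'm assuming a pc.GamePads instance is available on the app):

// hypothetical usage inside a script's update()
var pos = this.app.gamepads.getPosition(0);    // pc.Vec3, or null if the pad has no pose
var rot = this.app.gamepads.getOrientation(0); // pc.Quat, or null if the pad has no pose
if (pos && rot) {
    this.entity.setPosition(pos);
    this.entity.setRotation(rot);
}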
I tend to use the following code to update the positions and orientations of entities in the scene so that they map to the real-world controllers. It's a bit messy, but you get the idea:
// get the gamepad position and move the entity to it
var pos = this.gamepad.pose.position;
// (the initial camera height still needs adding to pos[1] for a seated/standing setup)
this.entity.setPosition(pos[0], pos[1], pos[2]);

// get the gamepad orientation and rotate the entity to match
var orient = this.gamepad.pose.orientation;
var q = new pc.Quat(orient[0], orient[1], orient[2], orient[3]);
this.entity.setRotation(q);
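Wrapped up as a complete script with that camera-height offset actually applied (the script and attribute names are just my own, and the offset value is whatever suits the rig):

var ControllerTracker = pc.createScript('controllerTracker');

// height of the VR origin above the floor in metres - my own tweakable, not part of any API
ControllerTracker.attributes.add('heightOffset', { type: 'number', default: 1.6 });

ControllerTracker.prototype.update = function (dt) {
    // this.gamepad is assumed to be set elsewhere (e.g. via the findPoseGamepad helper above)
    if (!this.gamepad || !this.gamepad.pose) {
        return;
    }

    var pos = this.gamepad.pose.position;
    var orient = this.gamepad.pose.orientation;
    if (!pos || !orient) {
        return;
    }

    // the pose is reported relative to the VR origin, so lift it by the rig height
    this.entity.setPosition(pos[0], pos[1] + this.heightOffset, pos[2]);
    this.entity.setRotation(orient[0], orient[1], orient[2], orient[3]);
};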
It also needs something like gamepad.vibration(n) for haptic feedback on the Vive.
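For what it's worth, the Gamepad Extensions draft already exposes haptics as gamepad.hapticActuators, so an engine call could just wrap something like this (a sketch; actuator support varies by browser):

// pulse the Vive wand's haptic actuator: intensity 0-1, duration in milliseconds
function vibrate(gamepad, intensity, durationMs) {
    if (gamepad.hapticActuators && gamepad.hapticActuators.length > 0) {
        gamepad.hapticActuators[0].pulse(intensity, durationMs);
    }
}

vibrate(gamepad, 0.8, 50); // short, fairly strong buzz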