Advice on VR vs desktop UX

Hi, just getting familiar with PlayCanvas. Before committing to development, I’m trying to tick off as many questions as I can so I don’t get stuck later. I’ve been through all the demos and explored as thoroughly as I can, but I’m wondering about the VR vs desktop user experience.

I would like the following to happen:

  • user hits the site
  • device capability is detected
  • if the device is WebXR compatible (Quest 2 in my case), the VR button is displayed, taking the user into the immersive, teleporting experience
  • if the device doesn’t have XR capability, the user is automatically given WASD/arrow-key controls and a first- or third-person experience (see the sketch after this list)
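Something like this is what I have in mind for the detection step; a minimal sketch using PlayCanvas’s XR manager, where 'VrButton' and 'DesktopControls' are placeholder entity names (a UI button and a WASD camera rig), not anything from the demos:

```javascript
var UxSwitcher = pc.createScript('uxSwitcher');

UxSwitcher.prototype.initialize = function () {
    var app = this.app;
    var vrButton = app.root.findByName('VrButton');          // hypothetical VR entry button
    var desktopRig = app.root.findByName('DesktopControls'); // hypothetical WASD/orbit rig

    var updateUx = function (available) {
        vrButton.enabled = available;
        desktopRig.enabled = !available;
    };

    if (app.xr.supported) {
        // availability can change later (e.g. a headset connects), so listen too
        updateUx(app.xr.isAvailable(pc.XRTYPE_VR));
        app.xr.on('available:' + pc.XRTYPE_VR, updateUx);
    } else {
        updateUx(false);
    }
};
```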

This doesn’t seem to be how any of the demos are set up. For example, the VR Lab demo on desktop has an awkward “click to replicate teleporting” mechanism, which is pretty clunky IMHO. I wonder if there’s a reason for this that, as a newbie, I can’t see. If there isn’t and my logic is sound, has anyone seen a demo of this flow anywhere?

Your plan sounds fine to me; I don’t see a problem with it.

I’m experimenting with XR at the moment and have a test project set up similarly. It’s a product-configurator test: when you hit the site, you see a normal spinning 3D model that you can orbit and zoom, with keyboard controls if needed. If XR is detected, an AR button appears that switches to AR, where different controls are used. On iOS, where WebXR is not available, we convert the loaded models/scene to USDZ at runtime and show them in AR using the built-in QuickLook viewer.
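For the QuickLook part, the hand-off itself is just a programmatic anchor click; roughly like this (the runtime glTF→USDZ conversion is omitted, and `usdzUrl` is whatever URL that conversion produces):

```javascript
// Hand a USDZ file to iOS QuickLook from JavaScript.
function openQuickLook(usdzUrl) {
    var anchor = document.createElement('a');
    anchor.rel = 'ar';
    anchor.href = usdzUrl;
    // Safari requires a rel="ar" anchor to contain an <img> child
    anchor.appendChild(document.createElement('img'));
    document.body.appendChild(anchor);
    anchor.click();
    document.body.removeChild(anchor);
}
```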


Ooooh I like that USDZ conversion sauce! I noticed the other day that the USDZ of something I was downloading was smaller than the glTF/GLB. Would be cool if the de facto standard was also the most compact 🙂. Thanks for your feedback, glad to know I’m not too far off with my thinking — will keep looking for a reference implementation.

When will this project be released? This functionality is badly needed.

The model viewer does the majority of this already; check it out: GitHub - playcanvas/model-viewer: 3D model viewer supporting glTF 2.0 and PLY (3d Gaussian Splats)

@jmd Have you tried the following?

https://playcanvas.com/project/873904/overview/vr-interaction-framework

I have used this framework and adjusted different aspects to fit my Quest 2 needs.

VR Kit + orbit-camera VR vs desktop UX

Target: orbit-camera rotation and translation on the PC desktop, and controller-based teleportation in a VR environment.

I tried to combine the two examples, VR Kit and orbit-camera. My integration attempt is here: PlayCanvas | HTML5 Game Engine

I added the camera-control code from the orbit-camera example to the camera in the VR Kit example.

On the PC desktop you can rotate and pan normally with the mouse, and on a Quest 2 clicking the VR button enters the VR environment. But teleporting doesn’t work: the controller is detected and the teleport icon appears, yet nothing happens after pulling the trigger.
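My guess is that the orbit-camera script keeps handling input during the XR session. This is a minimal sketch of the gating I’m trying, assuming the orbit-camera example’s script instance is named `orbitCamera` (as in that example) and sits on the same entity as this script:

```javascript
var XrInputGate = pc.createScript('xrInputGate');

XrInputGate.prototype.initialize = function () {
    var orbit = this.entity.script && this.entity.script.orbitCamera;
    this.app.xr.on('start', function () {
        if (orbit) orbit.enabled = false; // hand the camera over to the headset
    });
    this.app.xr.on('end', function () {
        if (orbit) orbit.enabled = true;  // restore desktop mouse controls
    });
};
```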

What code needs to be modified to resolve these conflicts so the controller can be used to teleport in the VR environment?