I wanted to know if there are any sample Apple Vision Pro examples that I can use for learning. I am trying to develop some things in PlayCanvas that can be used on the Apple Vision Pro, and I am very new to VR experiences in PlayCanvas. Any help would be appreciated.
Just tried various XR examples on the AVP and it seems there is an issue with the depth buffer for all examples starting with engine version 1.66. It looks like everything is drawn without depth testing. I am investigating…
Yes. With the Anti-Aliasing option enabled, the screen renders pitch black.
With AA disabled, the scene is (supposedly) rendered without depth testing: you only see the skybox. If the skybox is disabled, the hand mesh is shown, but with the typical disabled-depth-buffer artifacts.
Yeah, update to visionOS 2 (beta). It fixes stuff like this.
It also improves the black-screen-when-anti-aliasing-is-enabled issue - it now renders something… but the eye projections are not quite right! So you still need to disable anti-aliasing for now.
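In an Editor project you can toggle this under Settings → Rendering (Anti-Alias). If you're running engine-only, a minimal sketch of disabling anti-aliasing at application creation could look like this; the canvas id and the extra options here are assumptions, not something from the original posts:

```javascript
// Engine-only sketch: create the app with anti-aliasing disabled so the
// AVP immersive session doesn't render a black screen (workaround for now).
const canvas = document.getElementById('application-canvas'); // assumed canvas id

const app = new pc.Application(canvas, {
    graphicsDeviceOptions: {
        antialias: false // AVP workaround: keep AA off until the projections are fixed
    }
});

app.start();
```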
Another observation I made: a pc.XRSPACE_LOCAL session on the AVP behaves the same as pc.XRSPACE_LOCALFLOOR.
I would expect pc.XRSPACE_LOCAL to set the local camera position to 0,0,0 (as it does on Quest and other devices), but it behaves like local-floor and adds the head position on top.
This can be experienced in the VR Kit demo, which uses XRSPACE_LOCAL: when run on the AVP, the initial viewing position is much higher than on a Quest.
This can be a problem when an app should behave the same for seated and standing experiences.
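Until this behaves consistently across devices, one possible workaround is to re-center the camera rig yourself after the session starts. This is only a sketch under the assumption that the XR camera is parented under a dedicated rig entity; the entity names and the "wait for the first XR update" approach are my own assumptions:

```javascript
// Sketch: start an immersive 'local' session, then pull the rig down by the
// reported head height if the runtime actually delivers local-floor style poses.
const cameraEntity = app.root.findByName('Camera');     // assumed entity name
const cameraRoot = app.root.findByName('CameraRoot');   // assumed parent rig entity

app.xr.start(cameraEntity.camera, pc.XRTYPE_VR, pc.XRSPACE_LOCAL, {
    callback: (err) => {
        if (err) console.error('Failed to start XR session:', err);
    }
});

app.xr.once('update', () => {
    // The camera entity's local transform is driven by the headset pose, so
    // after the first pose arrives we can offset the rig to bring the initial
    // viewing position back toward 0,0,0 as on other devices.
    const headPos = cameraEntity.getLocalPosition();
    cameraRoot.translateLocal(0, -headPos.y, 0);
});
```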
I agree, PlayCanvas only passes the xr-space type to the WebXR API, which then reports "misaligned" camera coordinates.
Something else: Gaussian splats work great on the AVP with PlayCanvas, which is super cool, but they seem to be stretched. I guess the projection matrix needs to be adapted for immersive mode. See the screenshot: