Need help getting started with PlayCanvas AR/VR

Is it possible to create a markerless AR experience with a VR camera?

Basically I’m trying to create an experience where the phone uses the camera to track the surroundings and create a world around the player, so that it looks like VR but is actually markerless AR.

The reason I want to do it this way is that in VR, if you move the phone around, the camera in the game doesn’t pan, but if I build it with AR the in-game camera will pan.

These A-Frame examples will help in understanding:

In VR mode you can only look around, but in AR you can look around as well as move around.
Is it possible to create the AR version of the above example with a split-screen camera?

There are two ways that I know of. One is WebXR, which is not widely supported yet.

There are a few WebXR AR projects by @moka.

The other is using something like 8th Wall, which is an external library and service.
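For the WebXR route, a markerless AR session can be started through the engine’s XR manager. This is only a minimal sketch, assuming a global `pc` (the PlayCanvas engine) and a scene entity with a camera component; `app` and `cameraEntity` are placeholders for your own setup.

```javascript
// Pure helper: the pc.XRTYPE_* constants map to these underlying
// WebXR session-type strings.
function xrSessionType(wantAr) {
    return wantAr ? 'immersive-ar' : 'immersive-vr';
}

// Sketch: start a markerless AR session via the PlayCanvas XR manager.
// Assumes `pc` is the PlayCanvas engine global and `cameraEntity` has
// a camera component.
function startAr(app, cameraEntity) {
    // Check that the browser/device actually supports immersive AR first
    if (!app.xr.isAvailable(pc.XRTYPE_AR)) {
        console.log('Immersive AR is not available on this device');
        return;
    }
    // XRSPACE_LOCALFLOOR gives a floor-level reference space, so placed
    // objects stay anchored in the real world as the phone moves
    app.xr.start(cameraEntity.camera, pc.XRTYPE_AR, pc.XRSPACE_LOCALFLOOR, {
        callback: (err) => {
            if (err) console.error('Failed to start AR session:', err);
        }
    });
}
```

Note that immersive AR sessions require a secure context (HTTPS) and, at the time of this thread, worked mainly in Chrome on Android.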


Thanks. I’ll take a look at them.

I tried the AR projects but I’m not really satisfied with them. The AR objects don’t stay in one place; they keep moving with my phone. Whereas in A-Frame and Three.js AR, objects placed in the real world stay in the same place.

The VR works properly, although I was wondering if we could get 6DoF on phones. Right now the VR only tracks the phone’s rotation. Is there a way to also make it track the phone’s position?

Have you tried the WebXR examples from PlayCanvas for AR?

Try this project: https://playcanvas.com/editor/scene/976117

Look around for the floor a bit and tap the screen to place grass.

Bear in mind, this only works on Android Chrome so far.
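The tap-to-place behaviour in that demo can be sketched with the engine’s XR hit-test API. This is a hedged sketch, not the demo’s actual code: `grassTemplate` is a hypothetical entity to clone, and `pc`/`app` are assumed to be the PlayCanvas global and application.

```javascript
// Sketch: on each screen tap, ask WebXR where the viewer ray meets
// detected real-world geometry (e.g. the floor) and place an object there.
function placeOnTap(app, grassTemplate) {
    app.touch.on('touchstart', () => {
        app.xr.hitTest.start({
            spaceType: pc.XRSPACE_VIEWER,    // cast the ray from the camera
            callback: (err, hitTestSource) => {
                if (err) return;
                // Each result carries a real-world position and rotation
                hitTestSource.on('result', (position, rotation) => {
                    const grass = grassTemplate.clone();
                    grass.setPosition(position);
                    grass.setRotation(rotation);
                    app.root.addChild(grass);
                    hitTestSource.remove();  // one placement per tap
                });
            }
        });
    });
}
```

Because the placement comes from a hit test against tracked geometry rather than from the camera transform, the object stays put as the phone moves, which is the behaviour missing from the older projects above.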


The project you linked works perfectly.
However, this is the one that I tried earlier which wasn’t perfect:

Any ideas about this?


Oh, that’s odd :thinking: I’ll have to take a look at that one another time.


It can’t be done yet for VR on phones in browsers. Currently, positional tracking for AR is based on image-analysis algorithms, and it’s just not exposed for VR at the moment.


In PlayCanvas we have APIs that make the WebXR interfaces easier to use, taking care of a lot of low-level details for you.
It does not implement any polyfills or anything beyond the official WebXR functionality.
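As a concrete illustration of what that wrapping looks like, here is a minimal sketch of toggling an immersive VR session through `app.xr` instead of negotiating `navigator.xr` sessions by hand. `app` and `cameraEntity` are assumed names, not part of the thread.

```javascript
// Sketch: enter/leave immersive VR via the PlayCanvas XR manager,
// which wraps the raw WebXR session negotiation.
function toggleVr(app, cameraEntity) {
    if (app.xr.active) {
        app.xr.end();                          // leave the current XR session
    } else if (app.xr.isAvailable(pc.XRTYPE_VR)) {
        // pc.XRSPACE_LOCAL is a seated/standing reference space; on a phone
        // with no positional sensors this effectively gives 3DoF rotation only
        app.xr.start(cameraEntity.camera, pc.XRTYPE_VR, pc.XRSPACE_LOCAL);
    }
}
```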

It is up to developers to come up with creative solutions. An AR session with dual rendering would work pretty badly: AR emphasizes image analysis and has high tracking latency, which makes it unusable for VR.
Inside-out tracking is a thing, and has been used successfully by VR headsets like the Oculus Quest and many others, which is also supported by PlayCanvas.
But those headsets use an array of cameras for this. On a phone we have either one camera or a few placed very close together. If phone manufacturers figure out inside-out tracking for phones, that will be great, but personally I doubt it will happen anytime soon. VR on mobiles is fun for the first few minutes but has very low retention, so manufacturers and the VR industry are no longer focusing on 3DoF/mobile VR.

I believe that some time in the future, low-latency inside-out tracking for mobile phones will be a thing.
