Launching an AR session using the WebXR API


I’ve recently got hold of a device that supports the WebXR API, so I’m trying to launch a basic AR scene in PlayCanvas.

I have been looking at:

and trying to make sense of it.

It seems like my AR session is launching (or at least the promise is returning something), but I think there is a problem with the camera.

This is my code:

// jshint esversion: 6, unused: true, varstmt: true
const CheckXr = pc.createScript('checkXr');
const webXr = navigator.xr;

CheckXr.prototype.initialize = function() {
    this.app.touch.on('touchstart', this.startAr, this);
};

CheckXr.prototype.startAr = function() {
    const camera = this.entity.findOne('name', 'Camera');

    webXr.requestSession('immersive-ar').then((session) => {
        console.log(session);
        this.app.xr.start(camera, pc.XRTYPE_AR, pc.XRSPACE_LOCAL);
    });
};

and here is the project.

What I’m trying to do is launch the AR session when the user taps the device, and just have a simple cube visible.

When I try to do this on my phone, nothing happens, and the camera does not fire. Above is a console log with errors. It looks to me like there is some sort of function that tries to “look at the camera” to set a position, but it cannot find a camera feed, hence it cannot setPosition of undefined. At least this is my understanding.

Where am I going wrong with this?

For the record, my device does support all the examples here, so I know this isn’t a device-related issue but rather my code.

This is how I’m doing it. Can you give it a try?

if (this.app.xr.supported) {
    // start session
    this.cameraEntity.camera.startXr(pc.XRTYPE_AR, pc.XRSPACE_LOCAL);
} else {
    console.log("XR not supported");
}

I pass the camera entity as an attribute.
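For anyone unfamiliar with that pattern, here is a minimal sketch of how an entity is typically exposed as a script attribute in PlayCanvas so it can be assigned in the editor. The helper function and the attribute name `cameraEntity` are illustrative, not taken from the original project.

```javascript
// A sketch of declaring an editor-assignable entity attribute on a
// PlayCanvas script type. `addCameraAttribute` is a made-up helper.
function addCameraAttribute(scriptType) {
    // registers an entity slot the editor can fill in
    scriptType.attributes.add('cameraEntity', { type: 'entity' });
    return scriptType;
}

// Usage (inside a PlayCanvas script file):
// const CheckXr = pc.createScript('checkXr');
// addCameraAttribute(CheckXr);
// ...then read it as this.cameraEntity in initialize().
```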


Thanks @Gamer_Wael :slight_smile:

So looking at your code, I discovered that my issue was that I was passing the camera entity, whereas I should have been passing the camera component.

So changing:

const camera = this.entity.findOne('name', 'Camera');


to:

const camera = this.entity.findOne('name', 'Camera').camera;

seems to fix the error.

Now I can do either:

this.app.xr.start(camera, pc.XRTYPE_AR, pc.XRSPACE_LOCAL);

or:

camera.startXr(pc.XRTYPE_AR, pc.XRSPACE_LOCAL);

Having said that, this still doesn’t work like I thought it would. There is no camera feed and the camera behaves a bit wonky when you move your phone.

If anybody could take a look at it on their phone (must be WebXR compatible) and let me know if you have any suggestions about what is going wrong here.

Also, it seems like you need to swipe the screen rather than tap to start the session, and it also seems like this wrapping function in my code prevents the session from launching on a phone (though it does launch in the emulator):

  webXr.requestSession('immersive-ar').then((session) => {

This is what it looks like in the emulator. Notice the weird placement of the cube.


The reason you can’t see the camera feed is the camera’s clear color. You need to make it transparent by setting its alpha to 0. You can do it programmatically with:

camera.clearColor = new pc.Color(0, 0, 0, 0);

There should also be a similar option in the editor under the camera’s properties.

You don’t need to swipe; it just feels that way because it takes time for the AR session to start. So start the app, wait a few seconds and then tap the screen. But for some reason your app starts on the second tap… I’m not sure why. Maybe try to find the cameraEntity in the initialize function.
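One way to see what is actually happening is to log the XR session lifecycle and ignore taps while a session is already running. This is a sketch only: `wireUpXrDebug` and its argument names are assumptions, though the `'start'`/`'end'` events on `app.xr` are part of PlayCanvas's XrManager.

```javascript
// Plain guard logic: only start a new session when XR is supported
// and no session is currently active.
function shouldStartXr(xrSupported, xrActive) {
    return xrSupported && !xrActive;
}

// Call this from a script's initialize(); `script` is the script instance.
function wireUpXrDebug(script) {
    const xr = script.app.xr;
    xr.on('start', () => console.log('XR session started'));
    xr.on('end', () => console.log('XR session ended'));
    script.app.touch.on('touchstart', () => {
        if (shouldStartXr(xr.supported, xr.active)) {
            script.startAr();
        }
    });
}
```

The logs should make it obvious whether the first tap is starting the session slowly or failing outright.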


Oh man, you were absolutely right. Now it works and it does show the camera feed. Now, I just need to figure out how to change the camera starting position.

What happens now is that the AR scene starts exactly in the middle, inside the cube. So I need to walk quite a few steps back to see the cube :sweat_smile:. And it seems that the position of the camera entity in the scene does not affect this at all (I have been experimenting).

I will probably need to look into pc.XRSPACE_LOCAL; maybe this is what’s causing it.
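This is consistent with how `pc.XRSPACE_LOCAL` works: the session origin is placed wherever the device was when the session started, so the camera entity's authored position is ignored. A sketch of the usual workaround, which is to move the content rather than the camera; `spawnOffset` and the `cube` entity name are made up for illustration:

```javascript
// With XRSPACE_LOCAL, the XR origin is the device's starting pose.
// Instead of moving the camera, push content away from the origin.
// -Z is "forward" in PlayCanvas/WebXR conventions.
function spawnOffset(distanceMetres) {
    return { x: 0, y: 0, z: -distanceMetres };
}

// In the scene script you might then do (assuming a `cube` entity):
// const o = spawnOffset(2);
// cube.setPosition(o.x, o.y, o.z);
```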

Anyway, I am so grateful to this community. If I were on my own, I would probably have been stuck for many days.


I got this to a pretty OK state: you can walk around the room, and you can walk around virtual objects. It still just projects a plane and cubes onto your camera feed, but at least it is something to get started with!


I think this might help.


When you are presenting in XR, the position and orientation of the camera are overwritten by data from the XR session. If you want to implement additional movement and rotation of the camera, you should add a parent entity to the camera and apply your manipulations to that entity.
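The parent-entity idea above can be sketched as follows. The math helper just shows why it works (the rig's offset composes with the XR-driven local pose); `combinePositions`, the rig name, and the 2 m offset are all assumptions for illustration.

```javascript
// World position of the camera = rig offset + XR-driven local camera position
// (ignoring rotation for simplicity).
function combinePositions(rigPos, xrLocalPos) {
    return {
        x: rigPos.x + xrLocalPos.x,
        y: rigPos.y + xrLocalPos.y,
        z: rigPos.z + xrLocalPos.z
    };
}

// In PlayCanvas terms (illustrative):
// const rig = new pc.Entity('CameraRig');
// this.app.root.addChild(rig);
// rig.addChild(cameraEntity);   // XR keeps updating cameraEntity's local pose
// rig.setPosition(0, 0, 2);     // steps the whole view back 2 m
```

Because XR only writes the camera's local transform, anything you set on the rig survives every frame.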