WebXR screen mirroring

When entering WebXR, rendering to the canvas gets paused.
Is there a simple way to mirror one eye to the canvas, so other people can see what the user is doing in VR?
I’m also OK with adding a second camera, parenting it to the HMD entity and letting it render to the canvas. That will introduce some performance cost, but I can live with that.
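(Roughly what I have in mind with the second camera; untested, and hmdEntity / the priority value are just placeholders:)

var mirrorCamera = new pc.Entity('MirrorCamera');
mirrorCamera.addComponent('camera', {
    priority: 1 // render after the main camera
});
// hmdEntity stands for whatever entity gets the HMD pose applied
hmdEntity.addChild(mirrorCamera);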

My setup is SteamVR + HTC Vive, using WebXR in Chrome.

I’ve found this online, which works pretty well.

In theory this is possible, but it’s not something the engine supports at the moment, so you’d need to implement it yourself. Currently the engine renders to a single backbuffer, which comes either from the canvas or from XR.

Would it be feasible to access that backbuffer and mirror it to a new canvas?
Although I guess that backbuffer would contain both the left and right eye views.

Just had a look at the docs, and I guess using the colorBuffer of the camera’s renderTarget and drawing that to the canvas should work.
Gonna try that next week.
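(Something along these lines is what I’m thinking of; untested, and the sizes are just placeholders:)

// give the camera an explicit render target whose colorBuffer could then be read back
var colorBuffer = new pc.Texture(this.app.graphicsDevice, {
    width: 1024,
    height: 1024,
    format: pc.PIXELFORMAT_R8_G8_B8_A8
});
var renderTarget = new pc.RenderTarget({
    colorBuffer: colorBuffer,
    depth: true
});
this.entity.camera.renderTarget = renderTarget;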

Yeah maybe you can get that happening using a blit.

OK, so the camera’s renderTarget always seems to be null.
How do I access the internal render target, or do I have to create my own?

This code decides, per frame, which main framebuffer we render to:

For XR we render to a framebuffer supplied by the XR session; otherwise we render to the default canvas framebuffer (null).
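(In essence it does something like this; a simplified sketch, not the actual engine source:)

// pick the framebuffer supplied by the XR session while a session is active,
// otherwise fall back to the default canvas backbuffer (null)
var framebuffer = app.xr.active
    ? app.xr.session.renderState.baseLayer.framebuffer
    : null;
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);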

So I suspect that somewhere in the code you could blit from the XR framebuffer to the canvas framebuffer, but I’m not sure if that will work.

Likely something like this will need to execute:
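(A rough, untested sketch of that blit, reading from the XR baseLayer framebuffer and drawing into the default canvas framebuffer:)

var gd = app.graphicsDevice;
var gl = gd.gl;

// read from the framebuffer the XR session renders into,
// draw into the canvas backbuffer (null)
gl.bindFramebuffer(gl.READ_FRAMEBUFFER, app.xr.session.renderState.baseLayer.framebuffer);
gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, null);
gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gl.COLOR_BUFFER_BIT, gl.NEAREST);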

But I’m not sure where a good place/time to execute this would be. Perhaps some post-render callback.

this.app.graphicsDevice.renderTarget also returns null.
I’ve also tried creating my own renderTarget, but calling getSource() on the colorBuffer returns null as well.
The same is true when I use the ‘uSceneColorMap’ of one of the graphicsDevice targets: also null.

I currently don’t have a VR headset connected to my PC, so for now I’m just trying to mirror the camera to a secondary canvas in the postrender callback.
This is what I have so far:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.attributes.add('camera', { type: 'entity' });

ScreenMirroring.prototype.initialize = function () {
    this.cam = this.camera.camera;

    this.app.on('postrender', this.postRender, this);

    let style = document.createElement('style');
    document.head.appendChild(style);
    style.innerHTML = '#mirroring-canvas{position:absolute;top:0;left:0;width:100%;height:100%;}';

    this.canvas = document.createElement('canvas');
    this.canvas.id = 'mirroring-canvas';
    document.body.appendChild(this.canvas);
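    // note: this creates a second, separate WebGL context just for the mirror canvas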
    this.gl = this.canvas.getContext("webgl");
    this.buffer = this.gl.createFramebuffer();
};

ScreenMirroring.prototype.postRender = function () {
    let gd = this.app.graphicsDevice;

    gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, null);
    gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, this.buffer);
    gd.gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
    gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, null);
};

I’m getting the error “bindFramebuffer: object does not belong to this context”.
So it seems like it’s not possible to blit from one context to another.

Also, I’m not quite sure how to reset the framebuffers. In your example it’s resetting to the framebuffer of the graphicsDevice’s renderTarget, but that renderTarget is always null for me, at least in the postrender callback.
Edit: Binding null to reset the framebuffers should work. I’ve added it to the script above.

Yep, blitting between resources of different canvases won’t work, but that is not needed for what you originally wanted to do anyway.

That’s correct. Alright, so let’s make this a bit simpler:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.attributes.add('camera', { type: 'entity' });

ScreenMirroring.prototype.initialize = function () {
    this.cam = this.camera.camera;

    this.buffer = this.app.graphicsDevice.gl.createFramebuffer();

    this.app.on('postrender', this.postRender, this);
};

ScreenMirroring.prototype.postRender = function () {
    let gd = this.app.graphicsDevice;

    gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, null);
    gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, this.buffer);
    gd.gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
    gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, null);
};

This gives the following error:
Framebuffer is incomplete: No attachments and default size is zero.
How can I specify the size, and what exactly do I have to do with the attachments?
Sorry, I’m new to all these low-level GL calls :grimacing:

To be honest, I don’t have an easy answer here … framebuffers are not straightforward to work with in WebGL, and I didn’t expect this to be easy. Either way, I would not expect you to need to call gl.createFramebuffer, as you shouldn’t need to create a new framebuffer, I think. But I’m not sure what happens under the hood in WebGL with the default framebuffer when XR is active; I haven’t touched that area.
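(For reference, a user-created framebuffer only becomes “complete” once it has at least one attachment with actual storage, e.g. a color texture; the sizes below are placeholders:)

var gl = this.app.graphicsDevice.gl;
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1024, 1024, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.bindFramebuffer(gl.FRAMEBUFFER, this.buffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);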

I assumed that I have to draw to some extra buffer, which I can then show on the canvas, since the backbuffer isn’t shown.
I’ll experiment a bit more and post a solution if I find one.

Update:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.prototype.initialize = function () {
    this.app.on('postrender', this.postRender, this);
};

ScreenMirroring.prototype.postRender = function () {
    if (this.app.xr.active) {
        let gd = this.app.graphicsDevice;

        gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, this.app.xr.session.renderState.baseLayer.framebuffer);
        gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, null);
        gd.gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
        gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, null);
    }
};

I’m now trying to use this.app.xr.session.renderState.baseLayer.framebuffer as the read buffer. However, when using this framebuffer, the XR session seems to silently crash or not start.

I’d say that this XR context is on a different device (the HTC Vive), so blitting it to the desktop won’t work.
It might work with an XR emulator in the browser, as that would run on the same device.

I’ve gotten pretty close now:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.prototype.initialize = function () {
    this.app.on('postrender', this.postRender, this);

    this.app.xr.on('start', function () {
        this.isActive = true;
    }, this);
    this.app.xr.on('end', function () {
        this.isActive = false;
    }, this);

    this.isActive = false;
};

ScreenMirroring.prototype.postRender = function () {
    if (this.isActive) {
        let gd = this.app.graphicsDevice;

        gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, this.app.xr.session.renderState.baseLayer.framebuffer);
        gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, null);
        gd.gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
        gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, null);
    }
};

When using the start and end events of the XrManager, it does correctly blit to the canvas! My guess is that when using XrManager.active, the framebuffer isn’t properly set up yet.

However, this introduces another problem: I can now see the rendered image on the canvas, but not in the headset anymore.
The following warnings are also printed:
[.WebGL-0000625000A3FF00] GL_INVALID_OPERATION: Invalid operation on multisampled framebuffer
Note: The XRSession has completed multiple animation frames without drawing anything to the baseLayer's framebuffer, resulting in no visible output.

Maybe try to disable antialiasing?
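(If you create the app yourself with the engine, something like the following should do it; in the Editor I believe it’s the Anti-Alias checkbox in the project’s rendering settings:)

var app = new pc.Application(canvas, {
    graphicsDeviceOptions: {
        antialias: false // request a non-multisampled backbuffer
    }
});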

Disabling anti-aliasing gets rid of the first warning.
Binding the framebuffer back to the XR session’s baseLayer framebuffer restores rendering to the headset!

Here is a working example which displays both eyes:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.prototype.initialize = function () {
    this.app.on('postrender', this.postRender, this);

    this.app.xr.on('start', function () {
        this.isActive = true;
    }, this);
    this.app.xr.on('end', function () {
        this.isActive = false;
    }, this);

    this.isActive = false;
};

ScreenMirroring.prototype.postRender = function () {
    if (this.isActive) {
        let gd = this.app.graphicsDevice;
        let webglLayer = this.app.xr.session.renderState.baseLayer;

        gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, webglLayer.framebuffer);
        gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, null);
        gd.gl.blitFramebuffer(0, 0, gd.width, gd.height, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
        gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, webglLayer.framebuffer);
    }
};

Here is my final script, which mirrors the left eye to the HTML canvas:

var ScreenMirroring = pc.createScript('screenMirroring');

ScreenMirroring.prototype.initialize = function () {
    this.app.on('postrender', this.postRender, this);

    this.app.xr.on('start', function () {
        this.isActive = true;
    }, this);
    this.app.xr.on('end', function () {
        this.isActive = false;
    }, this);

    this.isActive = false;
};

ScreenMirroring.prototype.postRender = function () {
    if (this.isActive) {
        let gd = this.app.graphicsDevice;
        let webglLayer = this.app.xr.session.renderState.baseLayer;
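        // crop the left half (left eye) of the XR framebuffer to the aspect
        // ratio of the visible canvas, centered vertically, so the mirrored image isn't stretched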
        let cs = getComputedStyle(gd.canvas);
        let canvasWidth = parseInt(cs.width);
        let canvasHeight = parseInt(cs.height);
        let aspect = canvasWidth / canvasHeight;
        let croppedHeight = parseInt(gd.width / 2 / aspect);
        let heightOffset = (gd.height - croppedHeight) / 2;

        gd.gl.bindFramebuffer(gd.gl.READ_FRAMEBUFFER, webglLayer.framebuffer);
        gd.gl.bindFramebuffer(gd.gl.DRAW_FRAMEBUFFER, null);
        gd.gl.blitFramebuffer(0, heightOffset, gd.width / 2, gd.height - heightOffset, 0, 0, gd.width, gd.height, gd.gl.COLOR_BUFFER_BIT, gd.gl.NEAREST);
        gd.gl.bindFramebuffer(gd.gl.FRAMEBUFFER, webglLayer.framebuffer);
    }
};

Important: anti-aliasing needs to be disabled!

Is there any way to get this working with anti-aliasing enabled?
In this Stack Overflow link the problem seems to be the different resolutions of the HMD and the canvas,
which somehow prevents a successful blit when anti-aliasing is enabled.

If AA really isn’t doable, I’ll probably try to get some kind of AA with a post-processing shader.