WebXR Raw Camera Access

Hi, I’m trying to access the camera texture from within a WebXR session. There’s little information online about it other than these two resources I found:


The camera access seems to be happening on this line:

// Update camera image texture.
const texture = glBinding.getCameraImage(view.camera);

(From within a session’s frame’s view’s camera).
I can access the view.camera from within PlayCanvas but I’m not sure where glBinding is coming from. Is it part of the XRManager?

Any help would be appreciated.

I don’t think the camera feed is rendered by PlayCanvas but by the browser in the body of the page (hence we need to set our camera clear color’s alpha to 0).
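For reference, a minimal sketch of making the camera transparent so the browser-composited feed shows through. This assumes you have a reference to your camera entity (here a hypothetical `cameraEntity`):

```javascript
// Sketch: set the AR camera's clear color to fully transparent so the
// browser's camera feed behind the canvas stays visible.
// `cameraEntity` is an assumed reference to your camera entity.
function makeCameraTransparent(cameraEntity) {
    cameraEntity.camera.clearColor = new pc.Color(0, 0, 0, 0);
}
```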

When opening the HTML page in devtools: on line 190, it creates an XRWebGLBinding and assigns it to glBinding.

More info here: XRWebGLBinding - Web APIs | MDN
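In PlayCanvas terms, a rough sketch of creating that binding might look like the following. This assumes `app` is the `pc.Application`, that the XR session was started on the same WebGL context PlayCanvas renders with, and that `app.graphicsDevice.gl` exposes that context (not something I’ve verified across engine versions):

```javascript
// Sketch: create an XRWebGLBinding for the running XR session.
// The binding must be created against the same (XR-compatible) WebGL
// context the session is using, i.e. PlayCanvas' own context, not a
// fresh context from a second canvas.
function createCameraBinding(app) {
    const gl = app.graphicsDevice.gl; // PlayCanvas' rendering context (assumed property)
    return new XRWebGLBinding(app.xr.session, gl);
}
```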

I’ve tried to adapt the example and also asked GPT-4 for help, though I’m not getting any tangible results. I’m not versed in lower-level WebGL, so I’m a bit lost. I’d really appreciate anyone who could point me in the right direction. :pray:

Here’s my current code:

const canvas = this.app.graphicsDevice.canvas;
const bindingContext = canvas.getContext("webgl2");

this.glBinding = new XRWebGLBinding(this.app.xr.session, bindingContext);

const gl = document.getElementById('debugCanvas').getContext('webgl');

this.app.xr.on('update', (frame) => {
    if (!this.app.takeScreenshot) return;
    this.app.takeScreenshot = false;

    if (!xrRefSpace) {
        console.log('no ref space');
        return;
    }

    let pose = frame.getViewerPose(xrRefSpace);

    for (const view of pose.views) {
        if (view.camera) {
            let camera = view.camera,
                webXRTexture = this.glBinding.getCameraImage(camera);

            // 2. Create a framebuffer
            const framebuffer = gl.createFramebuffer();
            gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);

            // 3. Create a new destination texture
            const destinationTexture = gl.createTexture();
            gl.bindTexture(gl.TEXTURE_2D, destinationTexture);
            gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, canvas.width, canvas.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
            gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

            // 4. Attach the destination texture to the framebuffer
            gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, destinationTexture, 0);

            // 5. Set up the viewport
            gl.viewport(0, 0, canvas.width, canvas.height);

            // 6. Create a vertex buffer and an index buffer
            const quadVertices = [
                -1, -1, 0, 0,
                 1, -1, 1, 0,
                 1,  1, 1, 1,
                -1,  1, 0, 1
            ];

            const quadIndices = [
                0, 1, 2,
                0, 2, 3
            ];

            const vertexBuffer = gl.createBuffer();
            gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
            gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(quadVertices), gl.STATIC_DRAW);

            const indexBuffer = gl.createBuffer();
            gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
            gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(quadIndices), gl.STATIC_DRAW);

            // 7. Create a shader program
            const vertexShaderSource = `
                attribute vec2 aPosition;
                attribute vec2 aTexCoord;
                varying vec2 vTexCoord;
                void main() {
                    gl_Position = vec4(aPosition, 0.0, 1.0);
                    vTexCoord = aTexCoord;
                }
            `;

            const fragmentShaderSource = `
                precision mediump float;
                uniform sampler2D uTexture;
                varying vec2 vTexCoord;
                void main() {
                    gl_FragColor = texture2D(uTexture, vTexCoord);
                }
            `;

            function createShader(gl, type, source) {
                const shader = gl.createShader(type);
                gl.shaderSource(shader, source);
                gl.compileShader(shader);

                if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
                    throw new Error(`An error occurred compiling the shader: ${gl.getShaderInfoLog(shader)}`);
                }

                return shader;
            }

            const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
            const fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);

            const shaderProgram = gl.createProgram();
            gl.attachShader(shaderProgram, vertexShader);
            gl.attachShader(shaderProgram, fragmentShader);
            gl.linkProgram(shaderProgram);

            if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
                throw new Error(`Unable to initialize the shader program: ${gl.getProgramInfoLog(shaderProgram)}`);
            }

            gl.useProgram(shaderProgram);

            // 8. Bind the buffers, texture, and shader program
            gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
            gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);

            const positionAttribute = gl.getAttribLocation(shaderProgram, 'aPosition');
            gl.enableVertexAttribArray(positionAttribute);
            gl.vertexAttribPointer(positionAttribute, 2, gl.FLOAT, false, 16, 0);

            const texCoordAttribute = gl.getAttribLocation(shaderProgram, 'aTexCoord');
            gl.enableVertexAttribArray(texCoordAttribute);
            gl.vertexAttribPointer(texCoordAttribute, 2, gl.FLOAT, false, 16, 8);

            const textureUniformLocation = gl.getUniformLocation(shaderProgram, 'uTexture');
            gl.activeTexture(gl.TEXTURE0);
            gl.bindTexture(gl.TEXTURE_2D, webXRTexture);  // Use webXRTexture as the source texture
            gl.uniform1i(textureUniformLocation, 0);

            // 9. Draw the textured quad
            gl.clearColor(0, 0, 0, 0);
            gl.clear(gl.COLOR_BUFFER_BIT);
            gl.drawElements(gl.TRIANGLES, quadIndices.length, gl.UNSIGNED_SHORT, 0);

            // 10. Unbind the framebuffer and display the result
            gl.bindFramebuffer(gl.FRAMEBUFFER, null);

            // 11. Draw the destination texture on the canvas
            gl.bindTexture(gl.TEXTURE_2D, destinationTexture);
            gl.clearColor(0, 0, 0, 0);
            gl.clear(gl.COLOR_BUFFER_BIT);
            gl.drawElements(gl.TRIANGLES, quadIndices.length, gl.UNSIGNED_SHORT, 0);
        }
    }
});

I had a look at this and managed to get this far: https://playcanvas.com/project/1071463

I can get the raw WebGLTexture from the camera feed (I think) but can’t yet work out how to create a PlayCanvas texture from a raw WebGLTexture to use. Looks like we would have to hack the pc.Texture class/creation process for it to work :frowning:

But maybe this can get you moving in the right direction?

Managed to get it working https://playcanvas.com/project/1071463. Key points:

The optional features when starting XR need to include camera access:

AugmentedRealityManager.prototype._startAugmentedRealitySession = function () {
    // switch cameras
    this.defaultCameraEntity.enabled = false;
    this.arCameraEntity.enabled = true;

    // start XR session on the AR Camera
    this.arCameraEntity.camera.startXr(pc.XRTYPE_AR, this.spaceType, { optionalFeatures: ['camera-access'] });
};

And then the rest is in grabXrCameraFeed, but the camera texture will be flipped on the Y axis, so I scaled the plane by -1 on the Z.
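The Y flip could also be handled in UV space instead of scaling the plane. A small sketch (a standalone helper, not taken from the project) that flips the V coordinate of interleaved position/UV quad vertices like the ones in the earlier snippet:

```javascript
// Sketch: flip the V texture coordinate of interleaved [x, y, u, v]
// quad vertices, as an alternative to scaling the plane by -1.
function flipQuadV(vertices) {
    const out = vertices.slice();
    for (let i = 3; i < out.length; i += 4) {
        out[i] = 1 - out[i]; // v -> 1 - v
    }
    return out;
}
```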


Amazing! Thank you so much! Working nicely in your example.

Weirdly, I forked your project and didn’t change anything – and it just shows me a normal map texture?

Here’s the forked project link: PlayCanvas 3D HTML5 Game Engine

Worked fine on your link on an S10 running Chrome 113. :thinking:
Is there anything different I need to setup in my project to get it working? Does my project work on your device?

I’m very confused. If I try your project directly, I only get the camera feed if I look directly at the plane.

However, I just forked your project and that works fine :person_shrugging:


Can I add myself to your project to take a look?

Yep of course!


Hmm, can’t work out why this is happening only on your project :thinking:

It’s almost as though it hasn’t got camera access. Might need to park this for a bit and see if I have any bright ideas

Out of ideas at the moment. The engine doesn’t support an official way of using a raw WebGL texture.

@mvaligursky or @slimbuck may have some ideas, but just setting the texture.impl._glTexture doesn’t seem to work in your project specifically despite no changes :thinking:
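For anyone trying to reproduce the hack, a rough sketch of what it might look like. Note that `impl._glTexture` is engine-internal and not a public API, so this is an assumption that may break between engine versions, and the pixel-format constant is a guess:

```javascript
// Sketch: wrap a raw WebGLTexture in a pc.Texture by overwriting the
// engine-internal GL handle. `impl._glTexture` is a private field, so
// treat this as a hack that may break with engine updates.
function wrapRawGlTexture(device, rawGlTexture, width, height) {
    const texture = new pc.Texture(device, {
        width: width,
        height: height,
        format: pc.PIXELFORMAT_RGBA8, // assumed constant; varies by engine version
        mipmaps: false
    });
    texture.impl._glTexture = rawGlTexture;
    return texture;
}
```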

Hi @yaustar, just revisiting this – I’ve noticed that in my project, the texture is only visible when the center of the plane is on the bottom half of my screen. Also, when I disable Frustum Culling on the AR Camera, it disappears and won’t show at all. So I don’t think this is specifically AR-related.

I’m not sure where to look next, does this spark any new ideas from you? Also @mvaligursky or @slimbuck?

I can’t disable Frustum Culling in your project to see if there are similar results, but it would be interesting to see.

Might be due to the negative scale of the plane model? I’ve disabled frustum culling with no issues.
