Change material's render target

Hi,

I’m using PlayCanvas to render webcam data, and I want to use a shader to do some image processing and get the pixel data back. I know this can be done with render-to-texture in WebGL. In PlayCanvas, a material with the processing shader should be able to handle this. The problem is: how do I get the rendered pixels back? Maybe I should change the material’s render target to a texture and read the pixels from it?

Hi @Kir_Frank and welcome,

Check this webcam-based video texture example. It creates a pc.Texture from the video stream, which is then used on a material. That texture can easily be accessed from any shader chunk, since it’s a regular texture sampler at that point.

https://playcanvas.com/project/698371/overview/webcam-video-texture
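Roughly, the pattern in that example looks like this (just a sketch, not the project’s actual code; app, material and the video element are placeholder names):

    // Feed the webcam stream into a <video> element
    const video = document.createElement('video');
    video.autoplay = true;
    navigator.mediaDevices.getUserMedia({ video: true }).then(function (stream) {
        video.srcObject = stream;
    });

    // Create a regular pc.Texture that uses the video element as its source
    const videoTexture = new pc.Texture(app.graphicsDevice, {
        format: pc.PIXELFORMAT_R8_G8_B8,
        minFilter: pc.FILTER_LINEAR,
        magFilter: pc.FILTER_LINEAR,
        addressU: pc.ADDRESS_CLAMP_TO_EDGE,
        addressV: pc.ADDRESS_CLAMP_TO_EDGE
    });
    videoTexture.setSource(video);

    // Assign it like any other texture, and refresh it every frame
    material.diffuseMap = videoTexture;
    material.update();

    // in an update() handler:
    // videoTexture.upload();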

Edit: in case I misunderstood and you want to extract the pixel data after you do all of your processing, then yes, use a render target and extract the data from the canvas surface. You can use the following example as a guide:

https://developer.playcanvas.com/en/tutorials/capturing-a-screenshot/
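For context, the simplest way to grab what was rendered is to read the canvas itself right after a frame has been drawn. This is only a sketch (the tutorial above goes through a dedicated camera and render target instead), and it captures everything that was rendered to the canvas:

    // Read back the whole WebGL canvas as an image, right after rendering
    const canvas = app.graphicsDevice.canvas;
    app.on('postrender', function () {
        const dataUrl = canvas.toDataURL('image/png'); // includes every rendered entity
        console.log(dataUrl.length);
    });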


Thanks for the reply! I’ll take a look!


Hi @Leonidas, I’ve checked the second example, but sadly I can’t capture the canvas directly because there are other entities displayed. By the way, I found https://playcanvas.com/editor/code/560797?tabs=12937848 – maybe I can change the material.diffuseMap to my targetTexture, then upload the video texture each frame and read the pixels?

Yes, definitely, render to texture is meant for exactly this kind of use. Just note that reading pixels back is generally a slow WebGL operation, since it stalls the GPU while the pixels are transferred to the CPU.

I’m following the render-to-texture example, but getLayerByName always returns null. It seems I need to create the layer first?

this.targetTexture = new pc.Texture(app.graphicsDevice, {
    //....
});
this.targetMaterial = new pc.StandardMaterial();
this.targetMaterial.shader = new pc.Shader(app.graphicsDevice, {
    //....
});

let renderTarget = new pc.RenderTarget({ colorBuffer: this.targetTexture });
let layer = app.scene.layers.getLayerByName('render'); // always returns null
layer.renderTarget = renderTarget;
this.targetMaterial.diffuseMap = layer.renderTarget.colorBuffer;
this.targetMaterial.update();

Yeah, readPixels is slow. It seems that the texture’s getSource will internally invoke this function?

Thanks!

Have you added the layer to your project in the Editor? Don’t forget to add it to your active camera and to any model that should be rendered on it.


Yes, that’s correct, there isn’t any other way to read pixels right now. I was just pointing out that if you need to call it every frame, you should account for it in your rendering budget.


Oh, that’s right. But how can I add a layer in code? I’m not using the Editor.

I tried rendering to one of the default layers, but getSource still fails in the loop.

getResult() {
    this.rawTexture.upload(); // get new frame into the raw texture

    console.log(this.targetTexture.getSource()); // null
}

If you are using the engine only, then take a look at the following example. It showcases both creating a new layer in code and using a render target to render certain models into it:

https://playcanvas.github.io/#graphics/render-to-texture.html
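The pattern in that example is roughly this (a sketch, not the example’s exact code; cameraEntity, modelEntity and the texture size are placeholders):

    // Texture that the layer will render into
    const colorBuffer = new pc.Texture(app.graphicsDevice, {
        width: 512,
        height: 512,
        format: pc.PIXELFORMAT_R8_G8_B8_A8
    });
    const renderTarget = new pc.RenderTarget({ colorBuffer: colorBuffer });

    // Create the layer in code and give it the render target
    const renderLayer = new pc.Layer({ name: 'Render' });
    renderLayer.renderTarget = renderTarget;

    // Add the new layer to the scene's layer composition
    app.scene.layers.push(renderLayer);

    // The camera and any model that should end up in the texture
    // must include this layer's id in their layer lists
    cameraEntity.camera.layers = cameraEntity.camera.layers.concat([renderLayer.id]);
    modelEntity.model.layers = [renderLayer.id];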


That helps! Thanks!

let renderTarget = new pc.RenderTarget({ colorBuffer: this.targetTexture });
let renderLayer = new pc.Layer({ name: "Render" });
app.scene.layers.push(renderLayer);

renderLayer.renderTarget = renderTarget;

this.targetMaterial.diffuseMap = renderLayer.renderTarget.colorBuffer;
this.targetMaterial.update();

The last problem is how to read the pixels from the target texture after I upload the video to the rawTexture. Calling getSource right after the upload doesn’t work. It seems I should wait for the rendering to finish?

Right, this may be of help: the pc.Picker class does exactly the same thing to find the mesh instances picked from a render target.

That is directly from the engine source code:
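Roughly, the pattern is the following (a paraphrased sketch, not the verbatim engine code; renderTarget, width and height are placeholder names):

    // Switch the graphics device to the render target, read the pixels back, then restore it
    const device = app.graphicsDevice;
    const prevTarget = device.getRenderTarget();

    device.setRenderTarget(renderTarget);
    device.updateBegin();

    const pixels = new Uint8Array(width * height * 4);
    device.gl.readPixels(0, 0, width, height, device.gl.RGBA, device.gl.UNSIGNED_BYTE, pixels);

    device.updateEnd();
    device.setRenderTarget(prevTarget);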


Cool! I’ve tried the following code, but the pixel array comes back all zeros.

getResults() {
    this._rawTexture.upload();

    let device = this._app.graphicsDevice;
    let preTarget = device.getRenderTarget();
    device.setRenderTarget(this._renderTarget);
    device.updateBegin();

    let px = new Uint8Array(180 * 320 * 4);
    device.gl.readPixels(0, 0, 180, 320, device.gl.RGBA, device.gl.UNSIGNED_BYTE, px);

    device.updateEnd();
    device.setRenderTarget(preTarget);

    console.log(px); // [0, 0, ....]
}

Luckily, the other entities are rendered normally.

Should I force the material rendering and then read pixels?

So, if the layer the render target is attached to is assigned to the active camera, that should work, I think.

For a more complex pipeline, you can use a separate layer composition for your layer and control when rendering occurs. That’s how the Picker class works:
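Something along these lines (a sketch, assuming an engine version where app.renderer.renderComposition is available, which is what the Picker uses internally; renderLayer is the layer created earlier):

    // Keep the render-to-texture layer in its own composition...
    const layerComp = new pc.LayerComposition();
    layerComp.push(renderLayer);

    // ...and render just that composition whenever fresh pixels are needed,
    // right before reading them back with readPixels
    app.renderer.renderComposition(layerComp);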

I imagine you won’t have to do that, though. If you are able to share a full code example, we may be able to see where the problem is.
