I’m using PlayCanvas to render webcam data, and I want to use a shader to do some image processing and get the pixel data back. I know this can be done with render-to-texture in WebGL. In PlayCanvas, a material with the processing shader could handle this work. The problem is: how do I get the rendered pixels back? Maybe I should point the material’s output at a render target texture and read the pixels from that?
Check this webcam-based video texture example. It creates a pc.Texture from the video stream, which is then used on a material. That texture can easily be accessed from any shader chunk, since at that point it’s a regular texture sampler.
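For reference, the core of that example looks roughly like this. This is a sketch, not the exact project code; `app` is assumed to be your pc.Application instance, and constant names may differ slightly between engine versions:

```javascript
// Sketch of a webcam video texture, loosely following the example above.
// Assumes `app` is your pc.Application; not the exact project code.
function createWebcamTexture(app) {
    const video = document.createElement('video');
    video.autoplay = true;
    video.muted = true;

    navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
        video.srcObject = stream;
        video.play();
    });

    const texture = new pc.Texture(app.graphicsDevice, {
        format: pc.PIXELFORMAT_R8_G8_B8,
        mipmaps: false,
        minFilter: pc.FILTER_LINEAR,
        magFilter: pc.FILTER_LINEAR,
        addressU: pc.ADDRESS_CLAMP_TO_EDGE,
        addressV: pc.ADDRESS_CLAMP_TO_EDGE
    });
    texture.setSource(video);

    // re-upload the current video frame to the GPU every frame
    app.on('update', () => texture.upload());

    return texture;
}
```

Any material that samples this texture (e.g. via diffuseMap) will then see the live webcam feed.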
Edit: in case I misunderstood and you want to extract the pixel data after all of your processing is done, then yes, use a render target and extract the data from the canvas surface. You can use the following example as a guide:
Hi @Leonidas, I’ve checked the second example, but sadly I can’t capture the canvas directly because there are other entities displayed on it. By the way, I found https://playcanvas.com/editor/code/560797?tabs=12937848 and maybe I can set material.diffuseMap to my targetTexture, then upload the video texture each frame and read the pixels?
Yes, definitely; render-to-texture is intended for exactly that kind of use. Just note that readPixels is generally a slow WebGL call, since it stalls the GPU while the pixels are copied back to the CPU.
Have you added the layer to your project in the editor? Don’t forget to add it to your active camera, and make sure it isn’t added to any model that shouldn’t be rendered there.
Yes, that’s correct; there isn’t any other way to read pixels right now. I was just pointing out that if you need to call it every frame, make sure to account for it in your rendering budget.
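One simple way to keep the readback inside a frame budget is to only do it every N frames and reuse the last result in between. A hypothetical helper (not PlayCanvas API, just plain JavaScript):

```javascript
// Hypothetical throttling pattern: run the expensive readPixels-style
// call only every N frames, and return the cached result otherwise.
function makeThrottledReader(readFn, everyNFrames) {
    let frame = 0;
    let last = null;
    return function () {
        frame++;
        if (frame % everyNFrames === 0) {
            last = readFn(); // the expensive readback happens here
        }
        return last; // stale but cheap on the other frames
    };
}
```

You would call the returned function once per frame from an update handler.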
If you are using the engine only, take a look at the following example. It shows both creating a new layer and using a render target to render certain models to it:
// create a render target backed by the target texture
const renderTarget = new pc.RenderTarget({ colorBuffer: this.targetTexture });

// create a new layer and add it to the scene's layer composition
const renderLayer = new pc.Layer({ name: "Render" });
app.scene.layers.push(renderLayer);

// anything drawn on this layer now renders into the render target
renderLayer.renderTarget = renderTarget;

// display the result: use the render target's color buffer as a diffuse map
this.targetMaterial.diffuseMap = renderLayer.renderTarget.colorBuffer;
this.targetMaterial.update();
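To actually get the pixels off a render target like this, you currently have to drop down to the raw WebGL context. A hedged sketch: the frame buffer handle below is engine internals (`_glFrameBuffer` on the render target in older versions, under `impl` in newer ones), not official API, so verify it against your engine version:

```javascript
// Sketch: read RGBA pixels back from a render target on a WebGL device.
// The frame buffer handle is an engine internal and may be named
// differently depending on your PlayCanvas version.
function readRenderTargetPixels(device, renderTarget) {
    const gl = device.gl;
    const w = renderTarget.width;
    const h = renderTarget.height;
    const pixels = new Uint8Array(rgbaByteLength(w, h));

    const fb = renderTarget.impl
        ? renderTarget.impl._glFrameBuffer   // newer engine versions
        : renderTarget._glFrameBuffer;       // older engine versions
    gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
    gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
    return pixels;
}

// 4 bytes (RGBA) per pixel
function rgbaByteLength(width, height) {
    return width * height * 4;
}
```

Remember this is the slow, synchronous call mentioned earlier, so budget for it if you run it per frame.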
The last problem is how to read the pixels from the target texture after I upload the video to the rawTexture. Calling getSource immediately afterwards doesn’t work; it seems I need to wait until rendering has finished?
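That timing issue can be worked around by deferring the readback until after the frame has been rendered. A sketch, assuming an engine version where the application fires a 'postrender' event (older versions may need a different hook):

```javascript
// Sketch: run a readback only after the current frame has rendered.
// Assumes `app` fires a 'postrender' event; check your engine version.
function scheduleReadback(app, readFn, callback) {
    app.once('postrender', () => {
        // by now the render target has been drawn for this frame
        callback(readFn());
    });
}
```

Call this from inside your update handler after uploading the video frame, so the read happens once the layer has actually rendered.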