How to loop back the image data in a fragment shader in PlayCanvas?

Hi,

I would like the output of a fragment shader to feed back into its input on the next draw, so I can apply the same code to the same data over and over again.

For example, the Conway's Game of Life shader on ShaderToy.

It outputs "Buf A" to "Render Buffer A", which the "Buf A" tab then uses again as its input "iChannel0". This way it is able to re-process its own output data.
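The rule that ShaderToy example evaluates per pixel can be modelled on the CPU to make the feedback idea concrete. A plain-JS sketch (my own illustration, not the ShaderToy code): each generation is computed purely from the previous one, which is exactly the loop-back being asked about.

```javascript
// One Game of Life generation: the new grid depends only on the old grid,
// just as the shader's output depends only on the previously rendered buffer.
function lifeStep(grid) {
    const h = grid.length, w = grid[0].length;
    const next = grid.map(function (row) { return row.slice(); });
    for (let y = 0; y < h; y++) {
        for (let x = 0; x < w; x++) {
            let n = 0;
            for (let dy = -1; dy <= 1; dy++) {
                for (let dx = -1; dx <= 1; dx++) {
                    if (dx === 0 && dy === 0) continue;
                    // Wrap at the edges, like repeat texture addressing
                    n += grid[(y + dy + h) % h][(x + dx + w) % w];
                }
            }
            next[y][x] = (n === 3 || (n === 2 && grid[y][x] === 1)) ? 1 : 0;
        }
    }
    return next;
}
```

Calling `lifeStep` repeatedly on its own result is the CPU analogue of re-drawing the shader with its previous output bound as input.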

More generally, I would like to chain buffers the way ShaderToy does: send the output of one fragment shader to the input of another, or rather, let the buffer one shader wrote to be used as the input buffer of the next. In the general case, buffers can be routed freely between different fragment shaders.

I see that the method setShader exists on GraphicsDevice.

Should I first create a canvas, construct a new GraphicsDevice with it, and then somehow feed the canvas image data into the fragment shader? Or are there other ways? Hmm, on second thought a canvas seems like a bad idea, since I want the whole flow to stay in graphics memory.

Can PostEffect be used? Or multiple pc.ShaderInput? I'm looking through the code.
Maybe a Material can be passed in…

I made a public example here, which is exactly the code from your docs.
https://playcanvas.com/project/449634/overview/shader

https://playcanvas.com/editor/scene/488246/launch

Regards, Philip

The general idea is to render into a RenderTarget and use the texture from that render target as an input for the next shader.
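One detail worth noting: the read and write targets have to swap roles every frame, because WebGL does not allow sampling from the texture currently being rendered into. The bookkeeping is engine-agnostic and can be sketched in plain JS (the `createPingPong` helper is my own name, not a PlayCanvas API; in practice `a` and `b` would be two `pc.RenderTarget`s):

```javascript
// Hypothetical ping-pong helper: holds two buffers and swaps the roles of
// "read" and "write" each frame, so the shader always samples last frame's
// output while rendering into the other target.
function createPingPong(a, b) {
    let read = a, write = b;
    return {
        get read()  { return read; },   // bind as the shader's input texture
        get write() { return write; },  // bind as the render target
        swap: function () { const t = read; read = write; write = t; }
    };
}
```

Each frame you would render into `pp.write` with `pp.read` bound as the input texture, then call `pp.swap()`.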

A good example is the Bloom post effect: https://github.com/playcanvas/engine/blob/master/extras/posteffects/posteffect-bloom.js


Hi Dave,

Thanks. I see that the post effects all have render functions, and in there the shaders are chained. But for materials showing shader-based images I don't find any render functions, so I'm not sure how to chain from one shader to the next. In my case, I want to run the shaders on the texture of the cube on screen, not as a shader on the camera.

Here is a cut-down version of the Bloom script which no longer does Bloom, but keeps the simplified structure:
http://pastebin.com/Cr1jQgJ0

In my case, I want to run shaders on the texture of the cube on screen, and I don't see how to use a render function for the cube's material; I only see render functions used in the case of a post effect. If there were a render function for the cube's material, I could use it with the pastebin above to chain fragment shaders. Is there a way to use a render function with the cube's material?

Without the ability to use a render function, I made the following progress and discoveries:

  1. A texture can be created in JS, e.g. `var colorBuffer = new pc.Texture(gd, {...});`
  2. A render target can be created on top of a texture so that it becomes a target for renders, e.g. `var target = new pc.RenderTarget(gd, colorBuffer, { depth: false });`
     This would then likely be the target of a shader, but I also need this texture to be the input texture for the shader, and I only know how to do that through a material, as per the doc example: http://developer.playcanvas.com/en/tutorials/custom-shaders/
  3. A texture can be set as an input to a shader, but I only know how to do this while using a material, e.g. per the docs, in the update function: `this.material.setParameter('uTime', t);`
  4. A material and a shader are normally associated with `this.material.setShader(this.shader);`. I don't see how to get the render target (pc.RenderTarget) onto the material. Should I be setting the texture on the material, since the texture is the subject of the render target?

So I can think of two ways to solve this problem. The first is to place a render function on the material or texture, but I don't know if that is allowed, or a good idea, or whether there is any example of it.
The second way would be to a) create a texture, b) create a render target, c) create a shader, d) make the shader output to the render target, and e) make the texture an input to the shader. I don't know how to do (d), because when working with a material I can only associate a shader with the material, e.g. `this.material.setShader(this.shader);`; I don't know how to associate a shader with a texture and then have that texture show on a material. I also don't know how to do (e) without a material, so now I'm lost.

I just thought of a doable third way: place another camera in the scene, attach a render to that camera, and use the render-function method to do the fragment shader processing, as in the render-to-texture example which has two cameras: https://playcanv.as/b/oBOsHF1W/
You could imagine placing a Bloom filter on the camera that is looking at the running man and adjusting it to do what I want. But having a second camera is a hacky way to do it, and it's also not immediately clear how I would loop the image data back. Would I render to the canvas that showed the running man? It seems a bit hacky.

Just read http://answers.playcanvas.com/questions/2797/shaders-render-to-texture which gives me a clue that I can associate a model's diffuse map with a texture:

```
app.root.findByName('Terrain').model.material.diffuseMap = renderTarget.colorBuffer;
```
Edit update: Or maybe, on each update(..) call, I should render a shader into a texture and then copy the texture across to a material? I can try this.

Regards, Philip

Hi David,

I've made some progress; at the moment it does use the previous buffer to cycle around. I had to use two textures, since WebGL would not let me render into the texture I was reading from. I also assumed I can only do the rendering code in the post-effect area, and not, say, in an update. The only problem left is how to get the texture to display on the material of a cube (I don't want it full screen; I want the effect texture on a cube's material).

At the moment the demo renders a rectangle on the screen and cycles the red portion of the image's RGBA around (r mod 255), as you can see if you view this project. That's not what I wanted or expected to happen; I expected the material on the cube to render on the cube.
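The red-channel cycling described above can be checked on the CPU with a plain-JS model of one feedback pass (my own sketch, not the demo's shader; 8-bit texture storage wraps at 256, which is what makes the channel cycle):

```javascript
// CPU model of one feedback pass: read the previous frame's RGBA bytes,
// increment the red byte with wrap-around, leave the other channels alone.
function incrementRed(rgba) {
    const out = Uint8Array.from(rgba);
    for (let i = 0; i < out.length; i += 4) {
        out[i] = (out[i] + 1) % 256; // red channel wraps around
    }
    return out;
}
```

Feeding the result back in as the next input, frame after frame, is the same loop the ping-pong render targets implement on the GPU.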

Edit here:
https://playcanvas.com/project/449634/overview/cycle_shader

Run view here:
https://playcanvas.com/editor/scene/488246/launch

How can the texture be drawn on the cube? At the moment I'm trying to do it using a shader, but then the cube no longer behaves like a cube that allows rotation. In customShader.js I use these lines, with the intent of adding the texture to the cube:

```
this.material = new pc.Material();
this.material.setParameter("uColorBuffer", effect.textures[0]);
this.material.setShader(effect.passThroughShader2);
model.meshInstances[0].material = this.material;
```

Any hints appreciated.


UPDATE! I solved it, yay. New project here.

https://playcanvas.com/project/453599/overview/cycle_shader2

Play here https://playcanvas.com/editor/scene/492572/launch

I noticed in another project that a diffuseMap can be set to a texture, so I added these lines at the same place in customShader.js as the code I mentioned before. I do the shader work in the post-effect part and set the texture as the material's diffuseMap, and it works:

```
// Pass the texture that the effect rendered into to this material, to display it
this.material = model.meshInstances[0].material;
this.material.diffuseMap = effect.textures[0];
this.material.update();
```

Update: I changed it to show Conway's Game of Life, to demonstrate that it can use the previous frame's image data.

Philip
