Hi Dave,
Thanks. I see that the post effects all have render functions, and in those the shaders are chained, but for materials showing shader-based images I can't find any render functions, so I'm not sure how to chain shaders from one to the next. In my case, I want to apply shaders to the texture on the cube on screen, not apply a shader to the camera.
I've pasted a cut-down version of the Bloom script here; it no longer does bloom, just the simplified chaining:
http://pastebin.com/Cr1jQgJ0
I don't see how to use a render function for the cube's material; I only see render functions used for post effects. If there were a render function for the cube's material, I could combine it with the pastebin above to chain fragment shaders. Is there a way to use a render function with a cube's material?
Without the ability to use a render function, I made the following progress and discoveries:
- A texture can be created in JS, e.g. var colorBuffer = new pc.Texture(gd, { ... });
- A render target can be created on a texture so that it becomes a target for renders, e.g. var target = new pc.RenderTarget(gd, colorBuffer, { depth: false });
So this would presumably be the target of a shader, but then I also need this texture to be the input texture for the shader, and I only know how to do that with a material, as per the doc example: http://developer.playcanvas.com/en/tutorials/custom-shaders/
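In case it helps, here is a more complete sketch of that texture/render-target creation. The function wrapper and the width/height/format/filter values are my assumptions; only the two constructor calls above are what I actually have:

```javascript
// Sketch: create a texture and wrap it in a render target so shaders
// can render into it. Assumes the PlayCanvas `pc` global and a graphics
// device `gd` (e.g. this.app.graphicsDevice).
function createColorTarget(gd) {
    var colorBuffer = new pc.Texture(gd, {
        width: 512,                          // assumed size
        height: 512,
        format: pc.PIXELFORMAT_R8_G8_B8_A8   // assumed format
    });
    colorBuffer.minFilter = pc.FILTER_LINEAR;
    colorBuffer.magFilter = pc.FILTER_LINEAR;
    var target = new pc.RenderTarget(gd, colorBuffer, { depth: false });
    return { texture: colorBuffer, target: target };
}
```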
- A texture can be set as the input to a shader, but I only know how to do this while using a material, e.g. per the doc, in the update function: this.material.setParameter('uTime', t);
- Material and shader are normally associated via this.material.setShader(this.shader);. I don't see how to get the render target (pc.RenderTarget) onto the material. Should I be setting the texture on the material, since the texture is the subject of the render target?
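To partly answer my own question: I suspect the texture is what gets set on the material, via setParameter, the same way uTime is passed in the tutorial. A sketch of my guess; uDiffuseMap is a uniform name I made up, which would have to match a sampler2D in the fragment shader:

```javascript
// Sketch: feed a render target's color buffer into a material's custom
// shader as a sampler uniform, analogous to setParameter('uTime', t).
function bindTargetToMaterial(material, target) {
    // The render target's texture is its color buffer.
    material.setParameter('uDiffuseMap', target.colorBuffer);
}
```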
So I can think of two ways to solve this problem. The first is to place a render function on the material or texture, but I don't know if that is allowed or advisable, or whether there is any example of it.
The second way would be to:
a) create a texture
b) create a render target
c) create a shader
d) make the shader output to the render target, which I don't know how to do, because when I work with a material I can only associate a shader with the material (e.g. this.material.setShader(this.shader);); I don't know how to associate a shader with a texture and then have that texture show on a material
e) make the texture an input to the shader, which again I don't know how to do without a material.
So now I'm lost.
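For what it's worth, here is my current guess at how steps a) to e) might fit together, using pc.drawFullscreenQuad (which I believe is what the post effects use internally) and the device's scope to bind the input texture. All of this is unverified; the uniform name uInputMap and the helper split are my own, and shader creation is elided as in the custom-shaders tutorial:

```javascript
// Sketch: run a fragment shader over an input texture into a render
// target (steps c, d, e), then show the result on the cube (the goal).
// Assumes the PlayCanvas `pc` global.
function runShaderPass(device, shader, inputTexture, target, quadVertexBuffer) {
    // e) bind the input texture to the shader via the device scope
    device.scope.resolve('uInputMap').setValue(inputTexture);
    // d) draw a fullscreen quad through the shader into the target
    pc.drawFullscreenQuad(device, target, quadVertexBuffer, shader);
}

function showResultOnCube(cubeEntity, target) {
    // Display the processed texture by assigning the target's color
    // buffer as the cube material's diffuse map.
    var material = cubeEntity.model.material;
    material.diffuseMap = target.colorBuffer;
    material.update();
}
```

Chaining would then presumably be a matter of using one pass's target.colorBuffer as the next pass's inputTexture.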
I just thought of a doable third way, which would be to place another camera in the scene, attach a render target to that camera, and use the render function method to do the fragment shader processing, as per the render-to-texture example, which has two cameras: https://playcanv.as/b/oBOsHF1W/
You could imagine in this case that a Bloom filter could be placed on the camera looking at the running man, and the filter adjusted to do what I want. But needing a second camera makes this hacky, and it's not immediately clear how I would loop the image data back; would I render to the canvas that showed the running man? It seems a bit hacky.
I just read http://answers.playcanvas.com/questions/2797/shaders-render-to-texture which gives me a clue that I can associate a model's diffuse map with a texture:
```
app.root.findByName('Terrain').model.material.diffuseMap = renderTarget.colorBuffer;
```
Edit/update: or maybe, on each update(..) call, I should render a shader into a texture and then copy the texture across to a material? I can try this.
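If that per-frame idea works, I imagine the pass would look something like this (all names are assumed, and chaining a pass back into itself would presumably need two targets ping-ponged):

```javascript
// Sketch: each frame, animate a uniform, render the shader pass into an
// offscreen target, then point the material at the result.
// Assumes the PlayCanvas `pc` global.
function updatePass(device, shader, quadVertexBuffer, target, material, time) {
    device.scope.resolve('uTime').setValue(time);    // animate the shader
    pc.drawFullscreenQuad(device, target, quadVertexBuffer, shader);
    material.diffuseMap = target.colorBuffer;        // "copy across" to material
    material.update();
}
```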
Regards, Philip