Hello! I’m using the Playcanvas engine standalone, and I’m trying to:
Create a standard Playcanvas material (in WebGL1 or WebGL2)
Scrape the raw GLSL from the generated material (shader?)
Manipulate the raw GLSL
Put the manipulated GLSL back into the Playcanvas standard material
Is this possible with Playcanvas? I don’t see anything in chunks:
const app = new pc.Application(playCanvasDom);
const box = new pc.Entity('cube');
box.addComponent('model', {
type: 'box',
});
const material = box.model.meshInstances[0].material;
console.log(material.chunks); // <--- This is an empty object, can I get the GLSL here?
Also when I try to modify the material’s GLSL with
To get the raw GLSL you can inspect material.variants at runtime and look at the compiled shader(s). In there you will find the vshader and fshader (vertex/fragment shader) definitions.
To put it back on the material and compile a new shader with your changes … I haven’t done that before. I imagine using customFragmentShader would do the trick; make sure to call material.update() afterwards to compile a new shader.
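For the chunk-based route, the general flow looks like the sketch below. Note this is a stand-in, not engine code: `diffusePS` is one of the standard chunk names in the 1.x engine, but the material object here is mocked so the snippet is self-contained; on a real pc.StandardMaterial you would set the chunk and call update() the same way.

```javascript
// Stand-in sketch of the chunk-override flow (the material is mocked;
// in the engine this would be a real pc.StandardMaterial).
function makeMockMaterial() {
    return {
        chunks: {},
        shaderDirty: false,
        // The real material.update() marks the shader for regeneration.
        update() { this.shaderDirty = true; }
    };
}

const material = makeMockMaterial();

// Override the diffuse fragment chunk with custom GLSL.
material.chunks.diffusePS = `
void getAlbedo() {
    dAlbedo = vec3(1.0, 0.0, 0.0); // force a red albedo
}
`;

// Without this call the engine keeps using the previously compiled shader.
material.update();
```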
I’m not sure there’s a good way to do this, as our shader generation is more complex. This is what happens under the hood:
1. The user creates the material, sets up its properties, and optionally specifies some override shader chunks. This is attached to a mesh instance.
2. At some point later, when we render a camera / layer with this material, we know which pass gets rendered (forward, shadow, picker, some user custom pass). We also know the global settings at that point (IBL lighting, clustered lighting settings …), and we generate the shader at that point based on all this info, for a specific pass.
3. For WebGPU we do further processing to make the shader compatible; the main part is to convert all uniforms to uniform buffers and give them fixed locations.
4. We compile the shader on the platform and use it. (Note that we do this in a more parallel way, but for simplicity that is not relevant.)
So I think you’d want a callback between steps 2 and 3, to allow you to process the shader?
We’re also working on a different way that could perhaps be useful for you, in a way similar to Unity’s surface shaders if you’re familiar with that. The shader conceptually contains two parts. The first one generates surface properties (albedo, opacity, emission, worldNormal and many others), and the second part evaluates the lighting using those values. This is what the first part can write to: https://github.com/playcanvas/engine/blob/main/src/scene/shader-lib/chunks/standard/frag/litShaderArgs.js
I also see there is no customVertexShader, which is surprising!
I’m hoping there’s a way to monkeypatch the methods needed to intercept the Shader creation somewhere, so I can at least fake hack test a full fshader/vshader override from the outside.
FWIW both Three and Babylon have callbacks to fully intercept shader creation and do whatever you want with the code. I understand Playcanvas has specific chunks to make that a more fine-grained, controlled process, but it would also be cool if end users could manipulate the full shader code at will.
I build a material internally, and force it to update (generate new GLSL) with
shaderMaterial.chunks.hackSource = Math.random();
And then to inject GLSL into a built material, I override pc.ProgramLibrary.prototype.generateShaderDefinition which lets me set fshader/vshader right when playcanvas builds the shader for a render pass.
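The wrap-and-intercept pattern I’m using looks roughly like this. The real target is pc.ProgramLibrary.prototype.generateShaderDefinition (its exact signature varies by engine version), so the ProgramLibrary object below is mocked to keep the snippet self-contained:

```javascript
// Mocked stand-in for pc.ProgramLibrary; in the engine you would patch
// pc.ProgramLibrary.prototype.generateShaderDefinition instead.
const ProgramLibrary = {
    generateShaderDefinition(options) {
        // Stands in for the engine's generated GLSL for a render pass.
        return {
            vshader: 'void main() { gl_Position = vec4(0.0); }',
            fshader: 'void main() { gl_FragColor = vec4(1.0); }'
        };
    }
};

// Keep a reference to the original, then wrap it.
const original = ProgramLibrary.generateShaderDefinition;
ProgramLibrary.generateShaderDefinition = function (options) {
    const def = original.call(this, options);
    // The generated GLSL is available here, before compilation:
    def.fshader = '// injected by interceptor\n' + def.fshader;
    return def;
};

const def = ProgramLibrary.generateShaderDefinition({});
console.log(def.fshader.split('\n')[0]); // → '// injected by interceptor'
```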
I will report back after trying out the first class way to intercept GLSL generation.
Heh you know your way around hacking your way in pretty well
Yes, the PR I mentioned is merged in, and we’re looking at releasing 1.65.0 this week (next couple of days most likely). Till then you can build the engine yourself.
You let the Playcanvas engine create a throwaway pc.StandardMaterial(), render the scene to force shader creation, then scrape the GLSL from the shader. The throwaway/“internal” shader also needs all the same attributes/uniforms as the one in the graph; for example, it needs a blank pc.Texture() assigned to diffuseMap to force the “internal” shader to generate the diffuseMap uniforms in the source code.
The Shaderfrog engine works with GLSL, kind of like a module system, and you “plug in” the output from the pc.StandardMaterial into another shader. You can see what’s happening under the hood by opening the Shader tab > Fragment subtab on the link above. Basically it turns the pc.StandardMaterial’s main() function into one that returns a vec4, and then you can inject standardmaterial_main() into any place in any other shader.
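A naive string-based version of that main()-rewriting step might look like this (the real engine parses the GLSL properly; this sketch assumes a single `void main()` that assigns `gl_FragColor` once, and the function name is just an example):

```javascript
// Rewrite a fragment shader's main() into a named function returning a vec4,
// so its output can be "plugged in" to another shader.
function wrapMain(fragmentSource, newName) {
    return fragmentSource
        .replace('void main()', `vec4 ${newName}()`)
        .replace(/gl_FragColor\s*=\s*([^;]+);/, 'return $1;');
}

const src = 'void main() { gl_FragColor = vec4(dAlbedo, dAlpha); }';
console.log(wrapMain(src, 'standardmaterial_main'));
// → 'vec4 standardmaterial_main() { return vec4(dAlbedo, dAlpha); }'
```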
You take the manipulated GLSL and put it into a brand new pc.StandardMaterial, so Playcanvas passes all the engine/lighting/etc. data into it, and the Shaderfrog runtime adds additional uniforms that come from the custom shaders.
By the way, you can use the Shader → Fragment subtab to live edit the compiled Playcanvas GLSL. You can’t save (since it’s compiled output), but this alone is super helpful for learning about and debugging GLSL shader issues. I’ve personally used this editor a bunch to figure out how to work with Babylon/Three/Playcanvas materials since it lets you see the final preprocessed code.
I still need to bump the PC version I’m using to try out the new hook. Also, is there a way to force shader compilation outside of calling .render()? Threejs has renderer.compile() and Babylon has material.forceCompilation(). My hack right now is to call .render() to get the Shader to be generated.
Not currently, as the shader we generate depends on global settings we only know when we’re about to render - like ambient / envmap and other scene settings. It also depends on what attributes are provided by the mesh the shader gets attached to.