Trouble with Waves Project

Background to start: [SOLVED] Is there workaround to createFullscreenQuad for vertexBuffer? - #6 by eproasim

After debugging that, I set to work on implementing this shader/script in a project of my own, but I’m seeing behavior that I’m entirely unsure how to address or investigate. The objects I place in the WaveSources layer appear in the very center of the texture at a very small size, and strangely I have waves coming from uniform spots across the plane…

As I typed that out, I realized that the camera must have its clear color set to #00000000 to fix that problem. Still, removing the terrain and relying on only a few rocks for ripples results in a very small representation of the ripples in the center of the plane:

I’ve tried adjusting the orthographic height of the Water Camera, the land width and wave width, the texture size, the placement of objects, and the UV tiling, and I think I’ve run out of ideas. Any insight into this project, or how this might be overcome, would be incredibly appreciated!

Hi @eproasim,

That does indeed seem like a camera setup issue. Try previewing, either in the editor or at runtime, exactly what that camera sees.

Then try playing with the ortho height and with the near/far planes to get it to render the objects that need to intersect with the water surface.
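Not PlayCanvas API, just plain math: a quick sanity check, with made-up numbers and an illustrative function name, of whether a wave source would even land inside a top-down orthographic camera’s view volume (in PlayCanvas, orthoHeight is the half-extent of that volume):

```javascript
// An object is captured by a straight-down ortho camera only if its
// depth below the camera falls inside [near, far] and its horizontal
// offset fits inside the ortho half-extent. Pure math, no engine.
function isCaptured(camY, near, far, orthoHeight, objX, objY) {
    const depth = camY - objY;            // camera looks straight down
    const inDepth = depth >= near && depth <= far;
    const inFrame = Math.abs(objX) <= orthoHeight;
    return inDepth && inFrame;
}

console.log(isCaptured(16, 0.1, 32, 8, 4, 0));  // true
console.log(isCaptured(16, 0.1, 32, 8, 12, 0)); // false: outside the ortho extent
```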

Thank you @Leonidas !

Fresh morning eyes revealed that the updateWater() function was setting the camera’s orthoHeight and position, meaning that basically any changes I made to the camera were being overwritten.
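In case anyone else hits this, here’s a stand-in sketch (plain JavaScript with made-up names, not engine API) of why the hand tweaks had no effect:

```javascript
// A per-frame/per-event update that recomputes the camera's orthoHeight
// silently stomps anything set by hand in the editor or inspector.
const camera = { orthoHeight: 8 };
const water = { scale: 32 };

function updateWater() {
    // the script recomputes the value on every call...
    camera.orthoHeight = water.scale / 2;
}

camera.orthoHeight = 50;  // hand tweak
updateWater();            // ...and immediately overwrites it
console.log(camera.orthoHeight); // 16
```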

Now I’m getting a pretty good result that I’ll tweak in the future. I then decided to see whether the water could be updated at runtime, especially since I saw that the script registers an app-level event that dirties the layer to update the water. I noticed that the event isn’t fired anywhere in the script, so I’m guessing the water was meant to be updatable. Firing the event produces a “Trying to bind current color buffer as a texture” error:

Unfortunately, I’m not familiar enough with the way shaders work to know where to start. Do you have any ideas? The event fires this function:

Water.prototype.updateWater = function() {
    // NOTE: several identifiers were lost when this snippet was pasted in.
    // "this.cameraEntity", the "" references, and the first
    // arguments to setEulerAngles below are reconstructed placeholders
    // for the water-camera entity and the PlayCanvas application, not
    // verbatim from the project.
    // this.cameraEntity.camera.orthoHeight = this.entity.getLocalScale().x / 2;
    // var pos = this.entity.getPosition();
    // this.cameraEntity.setPosition(pos.x, 16, pos.z);
    var rot = this.entity.getEulerAngles();
    if (rot.x > 90 || rot.x < -90) {
        this.cameraEntity.setEulerAngles(90, 180 - rot.y, 0);
    } else {
        this.cameraEntity.setEulerAngles(-90, rot.y, 0);
    }
    this.layer.enabled = true;
    this.rendering = true;;
    this.rendering = false;
    this.layer.enabled = false;
    var scope =;
    for (var i = 0; i < this.blurPasses; i++) {
        pc.drawFullscreenQuad(, this.renderTargetB, this.vertexBuffer, this.quadShaderBlurHorizontal);
        pc.drawFullscreenQuad(, this.renderTargetA, this.vertexBuffer, this.quadShaderBlurVertical);
    }
    this.material.diffuseMap = this.textureA;
};

And I can see in the stack trace that the offending line is the renderComposition call (the object before it was lost in the paste; presumably the application instance):;

Do you have any ideas?

Not sure what the issue is, that seems like a complex setup (using a second layer composition).

Normally, to update your render target, all you need to do is enable the camera that renders to it for at least a single frame. That should be enough.
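For reference, the enable-for-one-frame pattern described above, sketched with plain objects instead of real PlayCanvas entities (names are illustrative):

```javascript
// A camera only renders into its render target while it is enabled,
// so toggling it on for a single frame refreshes the target once.
function makeWaterCamera() {
    return { enabled: false, renders: 0 };
}

// Stand-in for the engine's per-frame update.
function frame(camera) {
    if (camera.enabled) camera.renders++;
}

const camera = makeWaterCamera();

frame(camera);            // disabled: render target not refreshed

camera.enabled = true;    // enable just long enough for one frame
frame(camera);
camera.enabled = false;   // disable again to save fill rate

frame(camera);

console.log(camera.renders); // 1
```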

If you keep having trouble try sharing a sample project to take a look.

The issue is the order of rendering to textures A/B with drawFullscreenQuad. The renderer is trying to bind, as an input texture, the texture it is currently rendering to, which is not allowed, hence the error.
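A plain-JavaScript sketch of the ping-pong discipline that avoids this (stand-in objects, not the engine’s API): each blur pass must read one texture and write the other, and the material samples whichever texture received the final pass:

```javascript
// Reading and writing the same texture in one pass is exactly the
// "Trying to bind current color buffer as a texture" condition.
function blurPass(source, target) {
    if (source === target) {
        throw new Error("Trying to bind current color buffer as a texture");
    }
    target.contents = "blur(" + source.contents + ")";
}

const textureA = { contents: "waves" };
const textureB = { contents: "" };

const blurPasses = 2;
for (let i = 0; i < blurPasses; i++) {
    blurPass(textureA, textureB); // horizontal: read A, write B
    blurPass(textureB, textureA); // vertical:   read B, write A
}

// After each full pass the result lands back in textureA, so that is
// the texture the water material should sample.
console.log(textureA.contents); // "blur(blur(blur(blur(waves))))"
```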