Normal maps flipped when layer is rendered into a texture

Hey hey

A few weeks ago you introduced a change that flips the Y coordinate on textures. This had implications when rendering a layer into a texture, which then appeared upside down. You fixed this by adding a “flipY” flag to the RenderTarget class, which let us correct the orientation.

Now we noticed something else, and we are not sure whether we are missing something in our implementation or whether this is another bug. Here is the code we use to render into a texture:

RenderToTexture.prototype.postInitialize = function() {
    var element = this.targetEntity.element;
    var colorBuffer = new pc.Texture(this.app.graphicsDevice, {
        width: element.width,
        height: element.height,
        format: pc.PIXELFORMAT_R8_G8_B8_A8,
        autoMipmap: true,
        minFilter: pc.FILTER_LINEAR,
        magFilter: pc.FILTER_LINEAR
    });
    var renderTarget = new pc.RenderTarget({
        colorBuffer: colorBuffer,
        depth: true,
        flipY: true
    });

    var layer = this.layer; // (layer lookup elided in the original post)
    layer.renderTarget = renderTarget;

    element.texture = layer.renderTarget.colorBuffer;
    element.color = pc.Color.WHITE;
    element.opacity = 1;
};

This results in the following: the left side is the element component into which we render our layer, the right side is what the main camera sees. As you can see, the normal map on our texture is interpreted incorrectly when rendering a layer to a texture.

If we set “flipY” to false though, the normal map is applied correctly, but the image is of course flipped.

Can you help us? Are we missing something? Are we supposed to create a depth buffer as well and pass it to the render target, or do you think the issue lies somewhere else?
Thanks again :slight_smile:

@slimbuck One for you!

@slimbuck Do you have an idea, can you help us?

Hi @AliMoe,

To give some context, we generate render-to-texture images ‘upside-down’ (see here) so that the resulting image is usable as a texture (otherwise it would appear upside down). Unfortunately, this means the lighting basis, which is constructed from the underlying texture coordinates, becomes inverted.
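To illustrate the inversion, here is a minimal sketch in plain JavaScript (not engine code, and the names are illustrative only) of how a tangent-space normal map sample is combined with the tangent T, bitangent B, and normal N to produce a shading normal, n = x·T + y·B + z·N. When the render target is flipped in Y, the V axis (and with it B) effectively points the other way, so the sample’s y component is negated and the lighting appears vertically inverted:

```javascript
// Sketch: reconstructing a shading normal from a tangent-space normal map
// sample (already remapped from [0,1] to [-1,1]). A Y-flipped render target
// effectively negates the bitangent B's contribution.
function shadingNormal(T, B, N, sample, flipY) {
    var y = flipY ? -sample.y : sample.y; // the flip negates the B term
    return [
        sample.x * T[0] + y * B[0] + sample.z * N[0],
        sample.x * T[1] + y * B[1] + sample.z * N[1],
        sample.x * T[2] + y * B[2] + sample.z * N[2]
    ];
}

// With T = +X, B = +Y, N = +Z, a sample leaning "up" in the map...
var n1 = shadingNormal([1, 0, 0], [0, 1, 0], [0, 0, 1], { x: 0, y: 0.5, z: 0.87 }, false);
var n2 = shadingNormal([1, 0, 0], [0, 1, 0], [0, 0, 1], { x: 0, y: 0.5, z: 0.87 }, true);
// ...leans "down" once the target is flipped: n1[1] === 0.5, n2[1] === -0.5
```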

The simplest fix would be to invert the B vector here when performing render-to-texture, but that seems nasty.

I’m not sure what the cleanest solution is right now, but I will chat with the team and try to figure something out.

Just to mention, lighting should work fine if you leave the render target’s flipY set to false and instead flip the resulting texture when you render it.
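A hedged sketch of that workaround, assuming the same setup as the snippet above (names like `setupUnflippedRTT` and `displayEntity` are placeholders, not engine API):

```javascript
// Sketch: keep the render target un-flipped so the lighting basis stays
// correct, and flip the element that displays the texture instead.
function setupUnflippedRTT(layer, colorBuffer, displayEntity) {
    var renderTarget = new pc.RenderTarget({
        colorBuffer: colorBuffer,
        depth: true,
        flipY: false // lighting basis stays correct
    });
    layer.renderTarget = renderTarget;

    // Show the texture on a UI element and flip the displayed image by
    // negating the entity's local Y scale rather than flipping the target.
    displayEntity.element.texture = renderTarget.colorBuffer;
    displayEntity.setLocalScale(1, -1, 1);
}
```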

Sorry for the trouble!

Unfortunately, flipping the texture introduces other problems. The texture we are rendering into is used in our UI, so any child elements are affected as well. If we rotate it by 180 degrees, the hierarchy is mirrored. If we set scale.y to -1, child elements can no longer be clicked because of the negative scale. We could try to work around this, but we would like to avoid changing our scene hierarchy.

Thanks for looking into this. I will have a look at the TBNderivative chunk, but as you said, that doesn’t look like the right way to fix this.

Ahh damn. And it’s not possible to arrange the UI hierarchy slightly differently? So instead of:

  • RTT-element
    • child
    • child

you would have:

  • Parent element
    • RTT-element (with flip)
    • child
    • child

FYI: Fix lighting basis during RTT by slimbuck · Pull Request #3424 · playcanvas/engine · GitHub

Awesome. So with the next engine release this will be fixed and we won’t even have to do anything? Thanks for looking into this :slight_smile:

Hi @AliMoe,

Yup that’s right, it should just work.
