Rendering transparent materials to an antialiased texture creates a weird effect

The Setup

I have a 3D garage door with transparent windows.

I then enable a plane behind the door with a brick wall texture on it (tinted red here for visibility) and take a screenshot of the door through an orthographic camera.

The render texture has the following config:

this.renderTarget = new pc.RenderTarget({
    colorBuffer: colorBuffer,
    depth: true,
    flipY: true,
    samples: pc.app.graphicsDevice.maxSamples
});

and the color buffer has the following config:

{
    width: 0, // dynamic
    height: 0, // dynamic
    format: pc.PIXELFORMAT_R8_G8_B8,
    autoMipmap: true,
    mipmaps: true,
    minFilter: pc.FILTER_LINEAR_MIPMAP_NEAREST,
    anisotropy: pc.app.graphicsDevice.maxAnisotropy
}
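Putting the two configs together, the setup is roughly the following sketch. The wiring of the options into a pc.Texture, the device shorthand, and the concrete width/height are my assumptions; the option values are the ones quoted above.

```javascript
// Sketch: color buffer + MSAA render target, combining the two configs above.
// width/height are placeholders for whatever the project computes at runtime
// (the original lists them as 0 / "dynamic").
const device = pc.app.graphicsDevice;

const colorBuffer = new pc.Texture(device, {
    width: device.width,
    height: device.height,
    format: pc.PIXELFORMAT_R8_G8_B8,      // note: no alpha channel
    autoMipmap: true,
    mipmaps: true,
    minFilter: pc.FILTER_LINEAR_MIPMAP_NEAREST,
    anisotropy: device.maxAnisotropy
});

const renderTarget = new pc.RenderTarget({
    colorBuffer: colorBuffer,
    depth: true,
    flipY: true,
    samples: device.maxSamples            // hardware MSAA
});
```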

I then place that texture on a material, place that material on a quad, and manipulate the corners of the quad to match a house.

Note that in order to get the perspective effect, I’m using a custom shader. That shader looks like this:

varying vec2 vUv0;

uniform vec4 _Color;
uniform mat4 _Matrix0;
uniform sampler2D _MainTex;
uniform int textureExists;

void main(void)
{
    // Reproject the quad's UVs through the precalculated perspective matrix
    vec4 persp = _Matrix0 * vec4(vUv0, 1, 1);
    vec2 uv_persp = vec2(persp.x / persp.z, persp.y / persp.z);

    // Fall back to the flat color when no texture is bound
    vec4 c = _Color;
    if (textureExists == 1) {
        c = texture2D(_MainTex, uv_persp.xy);
    }

    gl_FragColor = c;
}

Essentially, all it does is calculate new UV coordinates from a precalculated matrix. As you can see, it does nothing to the pixel color whatsoever.
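For context, the script side of that shader would look something like the sketch below. The parameter names come from the shader above; the material setup, the `perspMatrix` variable, and the values passed in are my assumptions.

```javascript
// Sketch: binding the uniforms the custom fragment shader expects.
// `material` is assumed to be a pc.Material using the shader above, and
// `perspMatrix` the precalculated homography as a flat 16-element array.
material.setParameter('_Color', [1, 1, 1, 1]);
material.setParameter('_Matrix0', perspMatrix);
material.setParameter('_MainTex', colorBuffer);   // the render target's texture
material.setParameter('textureExists', 1);
material.update();
```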

The Problem

In the house image above, the windows are red, even though the actual texture clearly shows them as black. When I stretch the quad out to fill the screen, they transition back to black.

When the polygon is stretched, we can see it fade smoothly from black in big parts, to red in smaller parts.

The Clues

When I turn antialiasing off on the render texture (samples: 1), the problem goes away completely.

When I reenable antialiasing on the render texture and change the pixel format on the color buffer to include an alpha channel (PIXELFORMAT_R8_G8_B8_A8), we can now see straight through the black windows. The red windows are opaque and show the red brick wall, as intended.

The Question

As far as I can tell, this has to do with the way transparent pixels are handled during hardware antialiasing. Is this due to something I’m doing wrong? Or is this an engine bug?

Thanks for the detailed explanation.

@mvaligursky any idea?

Not sure what the problem could be here … but perhaps turning off mipmaps could solve this? Perhaps when the antialiased buffer is resolved, the mipmaps are not generated correctly.


Disabling mipmaps does fix the problem.
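For anyone else hitting this, the workaround amounts to recreating the color buffer without mip levels. This sketch assumes the same texture options quoted earlier in the thread, with only the mipmap-related fields changed.

```javascript
// Sketch: color buffer with mipmapping disabled. This avoids the bad
// MSAA-resolve + mipmap interaction, at the cost of minification quality.
const colorBuffer = new pc.Texture(pc.app.graphicsDevice, {
    width: 0,   // dynamic, as in the original config
    height: 0,  // dynamic
    format: pc.PIXELFORMAT_R8_G8_B8,
    mipmaps: false,
    minFilter: pc.FILTER_LINEAR   // no mipmap filtering without mip levels
});
```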

It also looks terrible, so this is not an adequate solution for me.

Just to cross check, how should it look?

Like this?

No, it should be mostly black. Like the actual texture I’m applying.

The glass material here has an alpha of 0.96, which should be more or less opaque. Its opacity seems to approach 0 as the on-screen size becomes smaller.

For reference, this is the result I get with alpha = 1: it's almost completely transparent, and I'm only seeing what's behind it.

And the brick texture is part of the scene that the rendertarget is rendering? Not in the scene with the garage photo?

Yes. The brick texture is simply a plane placed behind the garage door.

To be perfectly clear, this is what I’m taking a picture of.

Quincy and I were talking last night on Discord, trying to reproduce this issue. So far it only happens in the above project, with both antialiasing and mipmaps enabled.

However, I was unable to reproduce the issue in a new project with a similar setup: https://playcanvas.com/project/863594/overview/aa--mips-bug

Quincy was able to reproduce the issue in a new project which helped a lot: https://playcanvas.com/project/863834/overview/aa--mips-bug

Looking at that project, the capture is done on the layer itself by setting the renderTarget onto the Screenshot layer.

It looks like the fix/workaround is to render both the Screenshot and Immediate layers. Either a https://developer.playcanvas.com/api/pc.LayerComposition.html needs to be used, or (the easiest route, and what the current screenshot tutorial does) a camera is used instead, with the render target assigned to it.

Here is a fork of the above project with removed unused code and using the camera method: https://playcanvas.com/editor/scene/1300135
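In code, the camera method boils down to something like the sketch below. The entity name and camera settings are placeholders; the key part is assigning the render target to the camera component rather than to the layer.

```javascript
// Sketch: render the screenshot through a dedicated camera instead of
// setting the renderTarget on the Screenshot layer directly.
const screenshotCamera = new pc.Entity('ScreenshotCamera');
screenshotCamera.addComponent('camera', {
    projection: pc.PROJECTION_ORTHOGRAPHIC,
    clearColor: new pc.Color(0, 0, 0, 0)
});
pc.app.root.addChild(screenshotCamera);

// Redirect this camera's output into the MSAA render target.
screenshotCamera.camera.renderTarget = renderTarget;
```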

I don’t know why the Immediate layer is needed, perhaps one of the graphics gurus can help when they are back in the office in the new year.


@Quincy Did this resolve the issue for you?