Black Border on Rendertexture element

Hi,

I have a project where I need to render a Spine element into a render texture. It works fine, but since the SuperPNG workaround for the alpha channel is obsolete (it was used to avoid black borders caused by premultiplied alpha), I was wondering whether there is a setting for how the image is rendered so that the black border doesn't show up in the render texture.

It's a little confusing, but maybe @yaustar knows something.

Thank you!


Oh I see, you are using a render texture as the source texture and getting alpha fringing. Bilinear filtering averages the RGB of transparent (black) texels with their opaque neighbours: sampling halfway between an opaque white texel (1, 1, 1, 1) and a transparent black one (0, 0, 0, 0) gives (0.5, 0.5, 0.5, 0.5), which shows up as a grey/dark fringe. And since it's a render texture, you can't use the workaround of extruding the edge colour into the transparent pixels offline.

I don't have a good answer off the top of my head beyond rendering onto the source texture multiple times with small offsets to create a bigger 'edge', along the lines of the sketch below.
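Something like this, completely untested — the entity name, the texel-to-world conversion, and the per-offset app.render() call are all assumptions, not verified code:

// Untested sketch: re-render the Spine entity several times with tiny
// positional offsets so its edge colour bleeds outward by roughly a pixel
// on the render target.
const base = this.spineEntity.getLocalPosition().clone();
// Roughly one texel of offset; how this maps to world units depends on
// your camera projection (assumption)
const texel = 1 / this.camera.renderTarget.width;
const offsets = [[0, 0], [texel, 0], [-texel, 0], [0, texel], [0, -texel]];

offsets.forEach(([dx, dy], i) => {
    // Clear the target only on the first pass so later renders accumulate
    this.camera.clearColorBuffer = (i === 0);
    this.spineEntity.setLocalPosition(base.x + dx, base.y + dy, base.z);
    this.app.render();
});
this.spineEntity.setLocalPosition(base);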

Maybe the PlayCanvas team has better ideas (@mvaligursky?)

This type of issue, right? PNG white outline and weird edges


Yes, that's my problem. Since the texture is created in the engine via a render texture, the outlines appear again.

We use this shader internally to expand the non-transparent edges during lightmap baking, so that could be a solution:

Use it to “copy” your render target to another render target; it expands the edges by a pixel.
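For reference, the idea behind a one-pixel dilate pass is roughly the following. This is a simplified sketch, not the exact engine chunk: wherever a texel is fully transparent, it copies colour from the first non-transparent neighbour, so bilinear filtering no longer mixes in black.

const dilateFS = /* glsl */ `
    varying vec2 vUv0;
    uniform sampler2D source;
    uniform vec2 pixelOffset;   // (1/width, 1/height)
    void main(void) {
        vec4 c = texture2D(source, vUv0);
        // Fall through the 8 neighbours until a non-transparent one is found
        c = c.a > 0.0 ? c : texture2D(source, vUv0 - pixelOffset);
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(0.0, -pixelOffset.y));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(pixelOffset.x, -pixelOffset.y));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(-pixelOffset.x, 0.0));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(pixelOffset.x, 0.0));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(-pixelOffset.x, pixelOffset.y));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + vec2(0.0, pixelOffset.y));
        c = c.a > 0.0 ? c : texture2D(source, vUv0 + pixelOffset);
        gl_FragColor = c;
    }
`;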


Hi, thanks for your help. I'm working with Slapstick on this problem and am trying to implement it at the moment.

I'm having trouble grasping how to apply this shader chunk to our current setup. Right now we create a Texture, create a RenderTarget with it, and set that on our render camera. We then set the same texture on our ImageElement to display it in our UI.

// Texture that backs both the render target and the UI image
const colorBuffer = new pc.Texture(this.app.graphicsDevice, {
    width: Math.round(width * this.pixelDensity),
    height: Math.round(height * this.pixelDensity),
    format: pc.PIXELFORMAT_RGBA8,
    minFilter: pc.FILTER_LINEAR,
    magFilter: pc.FILTER_LINEAR,
    addressU: pc.ADDRESS_CLAMP_TO_EDGE,
    addressV: pc.ADDRESS_CLAMP_TO_EDGE,
});

// Render target wrapping the texture; the camera renders into it
const renderTarget = new pc.RenderTarget({
    colorBuffer,
    depth: true,
    flipY: true,
});

this.camera.renderTarget = renderTarget;

// The same texture is displayed by the UI element
this.imageElement.texture = colorBuffer;

How would I apply the shader in this process?

The engine creates the shader:

const name = `lmDilate-${bakeHDR ? 'hdr' : 'rgbm'}`;
const define = bakeHDR ? '#define HDR\n' : '';
this.shaderDilate[index] = createShaderFromCode(device, shaderChunks.fullscreenQuadVS, define + shaderChunksLightmapper.dilatePS, name);

It also allocates another texture (same dimensions) and render target, and renders to it in this function:

It's a bit more complicated, as it runs multiple passes to apply this and other filters several times, but this is the core part: it assigns the source texture to be processed and renders into the other RT:

// assign the texture the dilate pass reads from
this.lightmapFilters.setSourceTexture(tempTex);
// draw a fullscreen quad with the dilate shader into the other render target
drawQuadWithShader(device, nodeRT, dilateShader);
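Putting that together with the setup above, a rough adaptation might look like this. It is untested: createShaderFromCode and drawQuadWithShader are engine internals whose exports can vary between engine versions, dilateFS is the sketch shader from earlier in the thread, and the uniform names match that sketch.

// Untested sketch: dilate colorBuffer into a second texture and show that
// one in the UI instead of the raw render target.
const dilatedTex = new pc.Texture(this.app.graphicsDevice, {
    width: colorBuffer.width,
    height: colorBuffer.height,
    format: pc.PIXELFORMAT_RGBA8,
    minFilter: pc.FILTER_LINEAR,
    magFilter: pc.FILTER_LINEAR,
    addressU: pc.ADDRESS_CLAMP_TO_EDGE,
    addressV: pc.ADDRESS_CLAMP_TO_EDGE,
});
const dilatedRT = new pc.RenderTarget({ colorBuffer: dilatedTex, depth: false });

const device = this.app.graphicsDevice;
const dilateShader = pc.createShaderFromCode(
    device, pc.shaderChunks.fullscreenQuadVS, dilateFS, 'uiDilate'
);

// After the camera has rendered into colorBuffer, run the dilate pass once
device.scope.resolve('source').setValue(colorBuffer);
device.scope.resolve('pixelOffset').setValue([1 / colorBuffer.width, 1 / colorBuffer.height]);
pc.drawQuadWithShader(device, dilatedRT, dilateShader);

// Display the dilated copy in the UI instead of the raw render target
this.imageElement.texture = dilatedTex;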