Confusing RenderTarget Behavior

I have a bit of a weird one here.

In my project, I am using an off-screen camera to render a 3D model to a texture, which is then applied to a UI element. This is so that I can rely on the ButtonComponent ‘click’ event when using a touch screen where multiple touches are being processed simultaneously.

This project contains two separate games in two scenes that can be chosen from a menu scene. When launching this particular scene directly from the editor, the texture renders as expected:

However, when launching the menu scene first and then switching to this game, the texture becomes distorted/stretched:

The 3D objects already exist in the scene in front of the required camera and are enabled when required. A screen-space ParticleSystemComponent with a child ButtonComponent is instantiated and assigned the texture provided by the correct camera like this:

PowerUpManager3D.prototype.createPowerUp = function() {

    let rand = (Math.random() > 0.5);
    const choice = rand ? 'ball' : 'paddle';
    const other = rand ? 'paddle' : 'ball';

    if(!this.activePowerUps[choice]) {
        const templates = this.templates[choice];
        const particles = this.particles[choice];
        const randomTemplate = Math.floor(pc.math.random(0, templates.length));

        
        particles[randomTemplate].enabled = true;
        let leftEntity = templates[randomTemplate].resource.instantiate();
        this.screen.addChild(leftEntity);
        leftEntity.script.powerUp3d.renderCamera = this.cams[choice];
        leftEntity.script.powerUp3d.dir = true;
        leftEntity.script.powerUp3d.partner = this.paddles[0];
        

        let rightEntity = templates[randomTemplate].resource.instantiate();
        this.screen.addChild(rightEntity);
        rightEntity.script.powerUp3d.renderCamera = this.cams[choice];
        rightEntity.script.powerUp3d.partner = this.paddles[1];
        
        rightEntity.enabled = true;
        leftEntity.enabled = true; 

        this.activePowerUps[choice] = {
            entities: [leftEntity, rightEntity],
            particles: particles[randomTemplate]
        };
    }

    else if(!this.activePowerUps[other]) {
        const templates = this.templates[other];
        const particles = this.particles[other];
        const randomTemplate = Math.floor(pc.math.random(0, templates.length));

        
        particles[randomTemplate].enabled = true;
        let leftEntity = templates[randomTemplate].resource.instantiate();
        this.screen.addChild(leftEntity);
        leftEntity.script.powerUp3d.renderCamera = this.cams[other];
        leftEntity.script.powerUp3d.dir = true;
        leftEntity.script.powerUp3d.partner = this.paddles[0];
        

        let rightEntity = templates[randomTemplate].resource.instantiate();
        this.screen.addChild(rightEntity);
        rightEntity.script.powerUp3d.renderCamera = this.cams[other];
        rightEntity.script.powerUp3d.partner = this.paddles[1];
        
        rightEntity.enabled = true;
        leftEntity.enabled = true;

        this.activePowerUps[other] = {
            entities: [leftEntity, rightEntity],
            particles: particles[randomTemplate]
        };
    }

    this.startPowerUps(pc.math.random(5, 15));

};

As you can see, the properties are assigned to each object after instantiation but before it is enabled. When an instantiated object is enabled, here is the code it runs:

PowerUp3d.prototype.initialize = function() {

    this.powerUpMgr = this.app.root.children[0];
    this.app.on('newCameraRender', this.onNewTexture, this);
    this.on('destroy', this.onDestroy, this);

};

PowerUp3d.prototype.postInitialize = function() {

    this.onNewTexture();
    this.button.button.on('click', this.onClick, this);

};

PowerUp3d.prototype.onNewTexture = function() {

    if(!this.renderCamera) return;
    this.renderTexture = this.renderCamera.script.renderCameraTotexture.renderedTexture;
    this.button = this.entity.findByName('PowerupButton');
    this.button.element.texture = this.renderTexture;

};

PowerUp3d.prototype.onDestroy = function() {

    this.floatTween.stop();
    this.app.off('newCameraRender', this.onNewTexture, this);

};

At first, when testing the switching of scenes, I was met with an error indicating that this.renderCamera.script was undefined. This was strange because the referenced entity had already existed in the scene for at least 10 seconds before the template was instantiated. Despite the error, the distorted texture would still appear a second or two later. I assumed that this was potentially a hierarchy order problem and moved things around. The project is no longer producing this error, but the distortion remains.

The most confusing part of this for me is that none of these issues (including the errors) occur when launching the scene directly from the editor.

This is a somewhat mission-critical problem for me to solve, and I would really appreciate any guidance on why this is occurring and any workarounds that may address it.

Probably a render target size issue?
As far as I know, the render target size doesn’t change dynamically.
So, I assume that the render target size of the menu scene is smaller than that of the main scene.
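
A quick way to check that guess would be to log both sizes from the main scene and compare them; something along these lines from any script (the entity name is only a placeholder):

// Placeholder check: compare the camera's render target size to the canvas size.
// 'BallCam' is an example entity name, not the actual one in the project.
var camEntity = this.app.root.findByName('BallCam');
var device = this.app.graphicsDevice;

if (camEntity && camEntity.camera.renderTarget) {
    var rt = camEntity.camera.renderTarget;
    console.log('render target:', rt.width, 'x', rt.height);
    console.log('canvas:', device.width, 'x', device.height);
}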


Hi @sooyong_Kim,

Thanks for replying! The render camera and subsequent target do not exist in the menu scene. They only get created once the main scene is loaded.

If no camera exists, then as I understand it, the render target size depends on the app canvas size. So, even though there is no camera in the menu scene, the render target size may already be fixed to the menu scene’s size…
or at least that is what I am assuming…lol

I’ll go ahead and post the script used to render the camera to a texture:

var RenderCameraTotexture = pc.createScript('renderCameraTotexture');
RenderCameraTotexture.attributes.add('scale', {type: 'number', default: 1.0});
RenderCameraTotexture.attributes.add('elementEntity', {type: 'entity', description: 'The element image that will display the render target',});

RenderCameraTotexture.prototype.initialize = function () {
    this.createNewRenderTexture();
    
    var onResizeCanvas = function() {
        this.secsSinceSameSize = 0; 
    };
    
    this.app.graphicsDevice.on('resizecanvas', onResizeCanvas, this);
    
    var device = this.app.graphicsDevice;
    this.lastWidth = device.width;
    this.lastHeight = device.height;
    
    this.secsSinceSameSize = 0;
    
    var onRenderScaleChange = function (scale) {
        this.scale = scale;
        this.createNewRenderTexture();
        this.secsSinceSameSize = 0;  
    };
    
    this.app.on('renderscale:change', onRenderScaleChange, this);
    
    this.on('destroy', function() {
        this.app.graphicsDevice.off('resizecanvas', onResizeCanvas, this);
        this.app.off('renderscale:change', onRenderScaleChange, this);

        this.elementEntity.element.texture = null;
        this.renderTarget.destroy();
        this.texture.destroy();
    }, this);
};


// update code called every frame
RenderCameraTotexture.prototype.update = function(dt) {
    // We don't want to be constantly creating a new texture if the window is constantly
    // changing size (e.g. a user dragging the corner of the browser over a period of time).
    
    // We wait for the canvas width and height to stay the same for a short period of time
    // before creating a new texture to render against.
    
    var device = this.app.graphicsDevice;
    
    if (device.width == this.lastWidth && device.height == this.lastHeight) {
        this.secsSinceSameSize += dt;
    }
    
    if (this.secsSinceSameSize > 0.25) {
        if (this.unScaledTextureWidth != device.width || this.unScaledTextureHeight != device.height) {
            this.createNewRenderTexture();
        } 
    } 
    
    this.lastWidth = device.width;
    this.lastHeight = device.height;
};


RenderCameraTotexture.prototype.createNewRenderTexture = function() {
    var device = this.app.graphicsDevice;

    // Make sure we clean up the old textures first and remove 
    // any references
    if (this.texture && this.renderTarget) {
        var oldRenderTarget = this.renderTarget;
        var oldTexture = this.texture;
        
        this.renderTarget = null;
        this.texture = null;
        
        oldRenderTarget.destroy();
        oldTexture.destroy();
    }
    
    // Create a new texture based on the current width and height of
    // the element that will display it
    var colorBuffer = new pc.Texture(device, {
        width: this.elementEntity.element.width,
        height: this.elementEntity.element.height,
        format: pc.PIXELFORMAT_R8_G8_B8_A8,
        minFilter: pc.FILTER_NEAREST,
        magFilter: pc.FILTER_NEAREST,
        autoMipmap: true
    });

    var renderTarget = new pc.RenderTarget({
        colorBuffer: colorBuffer,
        depth: true,
        flipY: true,
        samples: 4
    });

    this.entity.camera.renderTarget = renderTarget;

    this.elementEntity.element.texture = colorBuffer;

    this.unScaledTextureWidth = device.width;
    this.unScaledTextureHeight = device.height;
    
    this.texture = colorBuffer;
    this.renderTarget = renderTarget;
    this.renderedTexture = colorBuffer;
    this.app.fire('newCameraRender');
};

The cameras that already exist in the main scene initially render to a default element, which is disabled after the first frame renders. After that, each camera continually renders to the texture created in the initialize step, and the texture is recreated if the window resizes.
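
For completeness, here is a minimal sketch of what that "disable after the first frame" step could look like, assuming a small helper script with an entity attribute for the default element (the names are illustrative, not the exact ones in the project):

var HideAfterFirstFrame = pc.createScript('hideAfterFirstFrame');

// Illustrative attribute; the real project may reference the default element differently.
HideAfterFirstFrame.attributes.add('defaultElement', { type: 'entity' });

HideAfterFirstFrame.prototype.initialize = function () {
    // Disable the placeholder element once the first frame has been drawn.
    this.app.once('postrender', function () {
        if (this.defaultElement) {
            this.defaultElement.enabled = false;
        }
    }, this);
};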

This all works if the main scene is launched directly from the editor but breaks when switching scenes, which is the most confusing part.

Log out the render target sizes to make sure they make sense.
And maybe try changing aspectRatio on the camera component? By default it’s AUTO, which matches the render target; is that what you need?
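
In other words, something along these lines on the render camera entity (just a sketch; the manual value would come from whatever the render target actually is):

// Check the render target size the camera is actually using.
var rt = this.entity.camera.renderTarget;
if (rt) {
    console.log('render target:', rt.width, 'x', rt.height);
}

// Switch from the default AUTO mode to a manually set aspect ratio.
this.entity.camera.aspectRatioMode = pc.ASPECT_MANUAL;
this.entity.camera.aspectRatio = rt ? rt.width / rt.height : 1;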


I think you would need to set your screen element width and height to that of the render target, or at least to the same aspect ratio.

Edit:

Sorry, not the render target but the graphics device 🙂

Example: PlayCanvas 3D HTML5 Game Engine
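
In other words, roughly this inside createNewRenderTexture (just a sketch, reusing the scale attribute from the script above):

// Sketch: size the displaying element from the graphics device so it keeps
// the same ratio as what the camera renders into.
var device = this.app.graphicsDevice;
this.elementEntity.element.width = device.width * this.scale;
this.elementEntity.element.height = device.height * this.scale;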


Hi @LeXXik and @mvaligursky,

Thank you both for replying!

I’ve tried some of the things suggested and here are the results.

  • Logging the render target sizes

    Logging the sizes of the render target and texture both remain consistent at 128x256 when opening the scene via the editor or from the main menu.

  • Logging Aspect ratio

    This did actually change between scenes: 1.7777777777777777 from the editor and 1.7774420946626384 from the main menu. But when launched from the editor, the aspect ratio would update to 0.5 after the new render target was assigned.

  • Forcing 1.777+ aspect ratio with aspectRatioMode set to pc.ASPECT_MANUAL

    Both ways of opening the scene render incorrectly.

  • Forcing 0.5 aspect ratio with aspectRatioMode set to pc.ASPECT_MANUAL

    Both ways of opening the scene render correctly!!!

So, it seems like the camera did not want to auto-update its aspect ratio when presented with the new render target if the scene was not opened directly. Not sure why, but knowing the issue at least lets me fix it. Thank you @LeXXik and @mvaligursky for the suggestions!
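
For anyone hitting the same thing, the workaround amounts to something like this at the end of createNewRenderTexture (computing the ratio from the texture instead of hard-coding 0.5 comes out the same for the 128x256 target here):

    this.entity.camera.renderTarget = renderTarget;

    // Work around AUTO not picking up the new target after a scene switch:
    // force the camera's aspect ratio to match the render target ourselves.
    this.entity.camera.aspectRatioMode = pc.ASPECT_MANUAL;
    this.entity.camera.aspectRatio = colorBuffer.width / colorBuffer.height;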


That is strange, as the engine updates the aspect ratio every frame. If you had a simple repro, I could investigate this.

Glad all is good now.


Thanks @mvaligursky.

If I find some time soon, I will see if I can put a repro together.
