Hello,
I was trying to render the view of a secondary camera to a texture for a VR project and found the Tutorial: Render to Texture. After enabling WebVR on the project, the camera seems to stop rendering to the texture as soon as I enter VR, but it works otherwise.
I created a project to show this: https://playcanvas.com/project/615766/overview/render-to-texture-vr
Thank you!
I think Render to Texture is broken in VR.
https://playcanvas.com/project/615766/overview/render-to-texture-vr
Am I doing something wrong? It works before getting into VR mode.
EDIT: It seems to work in other browsers.
I think it's more of 'doesn't work in VR' rather than 'broken'. There's a different render pipeline for WebVR and it might not be able to handle more than one camera in the scene?
Is there any other technique I could use to render a layer in a texture?
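For reference, the basic setup from the Render to Texture tutorial looks roughly like this. This is a minimal sketch: the texture size, filter settings, and the entity name 'TextureCamera' are placeholders, and the exact `pc.RenderTarget` constructor signature varies between engine versions.

```javascript
// Create a texture to render into (sizes and filter settings are placeholders)
var texture = new pc.Texture(this.app.graphicsDevice, {
    width: 512,
    height: 512,
    format: pc.PIXELFORMAT_R8_G8_B8_A8,
    minFilter: pc.FILTER_LINEAR,
    magFilter: pc.FILTER_LINEAR,
    addressU: pc.ADDRESS_CLAMP_TO_EDGE,
    addressV: pc.ADDRESS_CLAMP_TO_EDGE
});

// Wrap it in a render target with a depth buffer
var renderTarget = new pc.RenderTarget(this.app.graphicsDevice, texture, {
    depth: true
});

// Point the secondary camera at the render target instead of the screen.
// 'TextureCamera' is a placeholder name for the second camera entity.
var textureCamera = this.app.root.findByName('TextureCamera');
textureCamera.camera.renderTarget = renderTarget;
```

The texture can then be assigned to a material's diffuse/emissive map as in the tutorial. This is the part that apparently stops working once the WebVR pipeline takes over.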
You can try digging into the engine code to see if there is a way to have more than one camera. I can imagine the restriction might be there for optimisation?
Thank you for your help @yaustar. I was reading the engine code:
/**
 * @function
 * @name pc.CameraComponent#enterVr
 * @description Attempt to start presenting this camera to a {@link pc.VrDisplay}.
 * @param {pc.VrDisplay} [display] The VrDisplay to present. If not supplied this uses {@link pc.VrManager#display} as the default.
 * @param {Function} callback Function called once to indicate success or failure. The callback takes one argument (err).
 * On success it is null; on failure it is the error message.
 * @example
 * // On an entity with a camera component
 * this.entity.camera.enterVr(function (err) {
 *     if (err) {
 *         console.error(err);
 *         return;
 *     } else {
 *         // in VR!
 *     }
 * });
 */
enterVr: function (display, callback) {
    // allow enterVr(callback) with the display argument omitted
    if ((display instanceof Function) && !callback) {
        callback = display;
        display = null;
    }

    if (!this.system.app.vr) {
        callback("VrManager not created. Enable VR in project settings.");
        return;
    }

    if (!display) {
        display = this.system.app.vr.display;
    }

    if (display) {
        var self = this;
        if (display.capabilities.canPresent) {
            // try and present
            display.requestPresent(function (err) {
                if (!err) {
                    self.vrDisplay = display;
                    // camera component uses internal 'before' event
                    // this means display is nulled before any other
                    // code gets to update
                    self.vrDisplay.once('beforepresentchange', function (display) {
                        if (!display.presenting) {
                            self.vrDisplay = null;
                        }
                    });
                }
                callback(err);
            });
        } else {
            // mono rendering
            self.vrDisplay = display;
            callback();
        }
    } else {
        callback("No pc.VrDisplay to present");
    }
},
but I do not see any indication of why the second camera would turn off. I don't know much about engines, so if anyone could point me to where I can find more about this, that would be a great help.
Thank you!
So I tested it on the Rift and it seems to work when using Firefox!
So it seems to be an Oculus Browser problem.
Does anyone know of another way of rendering the camera output to a texture? I have been doing a lot of troubleshooting with the Oculus Browser but I can't find what the problem is.
- The image shows for a second and then the texture goes black.
- The texture shows the camera output correctly when not in VR mode.
- Project: https://playcanvas.com/project/615766/overview/render-to-texture-vr
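One thing that might be worth trying, purely as a guess: the engine code quoted above listens for the display's 'beforepresentchange' event, and `pc.VrDisplay` also fires 'presentchange'. You could try re-binding the render target once presentation starts, in case the VR pipeline resets the camera's target. A sketch (whether re-binding actually helps on the Oculus Browser is an assumption; `textureCamera` and `renderTarget` are placeholders for objects you have already created):

```javascript
// Hypothetical workaround sketch: re-apply the render target when the
// display starts presenting, in case entering VR clears it.
var self = this;
this.app.vr.display.on('presentchange', function (display) {
    if (display.presenting) {
        // re-bind after the VR pipeline has taken over
        self.textureCamera.camera.renderTarget = self.renderTarget;
    }
});
```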