Ijon_h
October 23, 2025, 8:49am
1
Following this older discussion, I’ve been trying to access the scene depth texture (it will be used later).
I tested both approaches:
RenderPassDepthGrab
CameraFrame.rendering.sceneDepthMap = true
However, in both cases, the resulting texture appears pure red when drawn.
Here’s the snippet I used:
// init()
this.entity.camera.requestSceneDepthMap(true);
// update()
const renderPassDepthGrab = this.entity.camera.camera.renderPassDepthGrab;
if (!renderPassDepthGrab) return;
const depthRenderTarget = renderPassDepthGrab.depthRenderTarget;
if (!depthRenderTarget) return;
const depthTexture = this.app.graphicsDevice.isWebGL2
    ? depthRenderTarget.depthBuffer
    : depthRenderTarget.colorBuffer; // Why colorBuffer here?
console.log('depth texture:', depthTexture);
this.app.drawTexture(0.7, -0.7, 0.5, 0.5, depthTexture);
Is this expected, or is there a correct way to sample it?
Any guidance or confirmation from others who’ve tried this recently would be appreciated!
Project Link: https://playcanvas.com/editor/scene/2346401
Wagner
October 24, 2025, 8:49am
2
A depth texture is quite a specific type of data. It’s important to understand that depth textures come in different formats, and not all of them can be displayed directly on the screen. To visualize such a texture in PlayCanvas, you should use the dedicated engine function drawDepthTexture (its source is reproduced below):
/**
 * @param {number} x - The x coordinate on the screen of the center of the texture.
 * Should be in the range [-1, 1].
* @param {number} y - The y coordinate on the screen of the center of the texture.
* Should be in the range [-1, 1].
* @param {number} width - The width of the rectangle of the rendered texture. Should be in the
* range [0, 2].
* @param {number} height - The height of the rectangle of the rendered texture. Should be in
* the range [0, 2].
* @param {Layer} [layer] - The layer to render the texture into. Defaults to {@link LAYERID_IMMEDIATE}.
* @ignore
*/
drawDepthTexture(x, y, width, height, layer = this.scene.defaultDrawLayer) {
    const material = new ShaderMaterial();
    material.cull = CULLFACE_NONE;
    material.shaderDesc = this.scene.immediate.getDepthTextureShaderDesc();
    material.update();
    this.drawTexture(x, y, width, height, null, material, layer);
}
this.app.drawDepthTexture(0.7, -0.7, 0.5, 0.5)
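Since the coordinates and sizes passed to drawTexture/drawDepthTexture are in the [-1, 1] and [0, 2] ranges described above, a small helper can convert a pixel-space rectangle into those parameters. This is a sketch, not part of the PlayCanvas API, and it assumes the x/y arguments are the centre of the rectangle, with a top-left pixel origin and +y pointing up in the [-1, 1] space:

```javascript
// Hypothetical helper (not part of the PlayCanvas API): converts a pixel-space
// rectangle into the [-1, 1] centre coordinates and [0, 2] sizes that
// drawTexture()/drawDepthTexture() expect.
function pixelRectToDrawParams(px, py, w, h, canvasW, canvasH) {
    return {
        x: ((px + w / 2) / canvasW) * 2 - 1,  // centre x in [-1, 1]
        y: 1 - ((py + h / 2) / canvasH) * 2,  // centre y in [-1, 1], assuming +y is up
        width: (w / canvasW) * 2,             // width in [0, 2]
        height: (h / canvasH) * 2             // height in [0, 2]
    };
}
```

For an 800×600 canvas, pixelRectToDrawParams(0, 0, 800, 600, 800, 600) covers the full screen (x = 0, y = 0, width = 2, height = 2).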
Ijon_h
October 27, 2025, 2:30am
3
Hi @Wagner, thanks for the reply. For my use case I’m not actually trying to display the depth (drawing it on screen was only for debugging). I need it in a compute shader. Concretely, I’m looking for a reliable way to obtain a depth resource that I can bind as texture_2d<f32>, ideally linear depth in R32F.
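Worth noting: what the depth buffer stores is non-linear, so getting "linear depth" requires a conversion step. A minimal JavaScript sketch of the standard linearization, assuming a conventional GL-style perspective projection with an NDC depth range of [-1, 1] (WebGPU uses a [0, 1] clip range, so the formula would differ there):

```javascript
// Sketch, not engine code: recovers linear view-space depth from a [0, 1]
// depth-buffer value, assuming a GL-style perspective projection.
function linearizeDepth(d, near, far) {
    const zNdc = d * 2 - 1; // map [0, 1] buffer value to [-1, 1] NDC
    return (2 * near * far) / (far + near - zNdc * (far - near));
}
```

Sanity check: d = 0 maps to the near plane and d = 1 to the far plane.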
Wagner
October 27, 2025, 4:22am
4
Very few mobile devices support R32F textures, so you are better off using an RGBA8 texture into which you encode the float.
GLSL:
precision highp float;
#include "floatAsUintPS"
uniform highp sampler2D uSceneDepthMap;
#ifdef WEBGPU
#ifdef SCENE_DEPTHMAP_FLOAT
#define getDepth(xy, offset, level) texelFetch(uSceneDepthMap, xy + offset, level).r
#else
#define getDepth(xy, offset, level) uint2float(texelFetch(uSceneDepthMap, xy + offset, level))
#endif
#else
#ifdef SCENE_DEPTHMAP_FLOAT
#define getDepth(xy, offset, level) texelFetchOffset(uSceneDepthMap, xy, level, offset).r
#else
#define getDepth(xy, offset, level) uint2float(texelFetchOffset(uSceneDepthMap, xy, level, offset))
#endif
#endif
void main() {
    ivec2 xy = ivec2(gl_FragCoord.xy);
    float depth = getDepth(xy, ivec2(0, 0), 0);
    #ifdef WRITE_DEPTH
        gl_FragDepth = depth;
    #elif defined(WRITE_FLOAT)
        gl_FragColor = vec4(vec3(depth), 1.0);
    #else
        gl_FragColor = float2uint(depth);
    #endif
}
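The float2uint/uint2float pair referenced above (from the engine's floatAsUintPS chunk) packs the raw 32 bits of a float across the four 8-bit channels of an RGBA8 target. The same idea can be illustrated on the CPU in plain JavaScript (this is an illustration only, not engine code):

```javascript
// Pack a float32's raw bits into four bytes, as an RGBA8 texel would hold them.
function floatToBytes(value) {
    const view = new DataView(new ArrayBuffer(4));
    view.setFloat32(0, value);
    return [view.getUint8(0), view.getUint8(1), view.getUint8(2), view.getUint8(3)];
}

// Reassemble the four bytes back into the original float32.
function bytesToFloat(bytes) {
    const view = new DataView(new ArrayBuffer(4));
    bytes.forEach((b, i) => view.setUint8(i, b));
    return view.getFloat32(0);
}
```

The round trip is lossless at float32 precision, which is why an RGBA8 target can stand in for R32F on devices that cannot render to float textures.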