How to access pixels of material.diffuseMap

I have a material that was loaded as a pc.Asset. This material has a diffuseMap (which was loaded with the material) and I want to access the pixels of this texture. I do it as follows:

mat.diffuseMap.lock({mode: pc.TEXTURELOCK_READ});

But instead of an array of pixels I get an ImageBitmap object with the properties { width: 512, height: 512 }.
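
The only workaround I can think of is to draw that ImageBitmap onto a temporary 2D canvas and read it back with getImageData() (a rough sketch, not tested):

// Rough sketch: read pixels by drawing the ImageBitmap onto a 2D canvas
const bitmap = mat.diffuseMap.lock({ mode: pc.TEXTURELOCK_READ });
const canvas = document.createElement('canvas');
canvas.width = bitmap.width;
canvas.height = bitmap.height;
const ctx = canvas.getContext('2d', { willReadFrequently: true });
ctx.drawImage(bitmap, 0, 0);
const pixels = ctx.getImageData(0, 0, bitmap.width, bitmap.height).data; // Uint8ClampedArray of RGBA values
mat.diffuseMap.unlock();

Is there a cleaner way to do this with the engine API?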

Hi @NokFrt,

Check this post:

Support for this was also added to the engine recently. I think it's merged, though I'm not aware of a public example of how to use it. @slimbuck @mvaligursky any idea?

The async texture download is just an internal implementation we're testing; it's not ready for release. The API will most likely change as well, as we'll implement a WebGPU version.


Hi @Leonidas, thanks for the reply. Your method readPixelsFromTexture() works but I still have a problem with it.

I have a material with a diffuseMap texture. I load this material as an Asset and I don’t use this material on any MeshInstance.
I want to create a new texture based on the original texture from the material and then I want to use this new texture for another material.

The problem is that I can't get pixels from the original texture if I don't use the original material on a MeshInstance. If I try to use your readPixelsFromTexture() before using the original material, all pixels are zero.

Is there a way to initialize the texture without applying the material to a MeshInstance?

Most likely the texture hasn't been uploaded to the GPU before being used by a model. Can you try calling upload() on your texture before accessing it?

https://developer.playcanvas.com/en/api/pc.Texture.html#upload

const baseMat = assets.find(MonsterManager.BODY_ASSET_KEY, pc.eAssetType.material).resource as pc.StandardMaterial;
baseMat.diffuseMap.upload();

setTimeout((): void => { this.initBodyMaterial(); }, 1000);

initBodyMaterial() uses the baseMat.diffuseMap texture to fill a dynamic texture. It works if I use baseMat on some visible MeshInstance but not without it :frowning: pc.Texture.upload() didn't help.

One way to extract data from any texture (including compressed formats) is to render it to another texture. I'm just adding code to handle this to the usdz exporter, but this is the core of it:

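// Note: identifiers like Texture, RenderTarget, PIXELFORMAT_RGBA8, createShaderFromCode,
// drawQuadWithShader and BlendState are ES module imports in the engine source;
// from a project script they should be available on the pc.* namespace.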
const textureBlitVertexShader = `
    attribute vec2 vertex_position;
    varying vec2 uv0;
    void main(void) {
        gl_Position = vec4(vertex_position, 0.5, 1.0);
        uv0 = vertex_position.xy * 0.5 + 0.5;
    }`;

const textureBlitFragmentShader = `
    varying vec2 uv0;
    uniform sampler2D blitTexture;
    void main(void) {
        gl_FragColor = texture2D(blitTexture, uv0);
    }`;

const { width, height } = texture;

const dstTexture = new Texture(device, {
    name: 'ExtractedTexture',
    width,
    height,
    format: PIXELFORMAT_RGBA8,
    cubemap: false,
    mipmaps: false,
    minFilter: FILTER_LINEAR,
    magFilter: FILTER_LINEAR,
    addressU: ADDRESS_CLAMP_TO_EDGE,
    addressV: ADDRESS_CLAMP_TO_EDGE
});

const renderTarget = new RenderTarget({
    colorBuffer: dstTexture,
    depth: false
});

// render to a render target using a blit shader
const shader = createShaderFromCode(device, textureBlitVertexShader, textureBlitFragmentShader, 'ShaderCoreExporterBlit');
device.scope.resolve('blitTexture').setValue(texture);
device.setBlendState(BlendState.NOBLEND);
drawQuadWithShader(device, renderTarget, shader);

// read back the pixels
// TODO: use async API when ready
const pixels = new Uint8ClampedArray(width * height * 4);
device.readPixels(0, 0, width, height, pixels);

dstTexture.destroy();
renderTarget.destroy();
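
As a rough follow-up sketch (not part of the exporter code), the pixels read back above can be copied into a new texture and used on another material; otherMat here is just a placeholder name:

// Sketch: copy the extracted pixels into a new texture for another material
const newTexture = new Texture(device, {
    name: 'CopiedDiffuse',
    width,
    height,
    format: PIXELFORMAT_RGBA8,
    mipmaps: false
});
newTexture.lock().set(pixels);   // fill mip level 0 with the RGBA data
newTexture.unlock();             // marks the texture for upload on next use

otherMat.diffuseMap = newTexture;
otherMat.update();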