Localized reflection and lighting

Hi folks :slight_smile:

I think this might be something for you @mvaligursky

How would I go about setting up and rendering separate cubemaps for different areas of a scene, so meshes have reflections and lighting that fit their immediate environment?

Unity has light and reflection probes for this. Is something like this possible in PlayCanvas?

If not, I guess I could make a custom system for this myself with something a la https://playcanvas.github.io/#/graphics/reflection-cubemap

At first I might skip blending between cubemaps for simplicity. Just change the pc.StandardMaterial.cubeMap property based on which volume encapsulates the mesh, or something like that. I believe the Source engine did it like this with their env_cubemap back in the day. To my understanding, wouldn’t that mean that pretty much every mesh needs its own unique material instance, hindering reuse?
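Something like this is what I have in mind - a rough sketch, where roomCubemap and the surrounding volume lookup are placeholders:

// Rough sketch: clone the shared material so the per-room cubemap
// doesn't leak into other rooms (roomCubemap is a placeholder for
// whatever cubemap belongs to the volume containing this mesh)
const material = (meshInstance.material as pc.StandardMaterial).clone();
material.cubeMap = roomCubemap;
material.update();
meshInstance.material = material;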

Does the method in the above example include all the necessary information, though? Or would I need something extra for lighting?


I think you’re spot on here. Typically, this works great for indoor environments based on box-type rooms, as the cubemap can be made to match the reflections well. In this case, you’d render a cubemap similarly to what the example does, once at startup or similar, and use it for all meshes of the room. Which, as you said, means a separate material instance for each material in the room.

And have a separate one (usually downloaded) for the outdoor part of the level - that one does not need to be set per mesh but can be global.
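For the global case, something along these lines should work - a sketch, assuming outdoorEnvAtlas was generated with pc.EnvLighting from that downloaded cubemap:

// Sketch: apply the outdoor environment scene-wide instead of per material
app.scene.envAtlas = outdoorEnvAtlas;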


Well, well… It works flawlessly! :smiley: Thanks @mvaligursky

[screenshot]

The green wire sphere uses the CubemapRenderer component from the PlayCanvas engine repo and assigns the rendered cubemap to a clone() of the StandardMaterial on the big reflective sphere in the center.


Hmm, I’m having an issue when setting up multiple of these cubemap renderers with generateAtlas. They seem to collide or overwrite each others’ textures. Either both of them turn out completely grey, or the last one to render its sides (as per hierarchy order) overwrites the textures of all instances.

Is there some trick to it? Having to give the textures unique names, or making sure only to render one at a time or something? :thinking:

Ooh, the console is spewing GL_INVALID_OPERATION: Feedback loop formed between Framebuffer and active Texture

The console error is probably because you have a render ‘loop’/infinity mirror, where the render target’s texture is visible in the very render that writes to it.

As in you have Camera A rendering to a texture that can be seen by Camera A.
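One way around it - a sketch, where the layer names and cubemapCameraEntity are assumptions - is to make sure the capture camera’s layer list excludes whatever layer holds the reflective mesh:

// Sketch: the capture camera only renders layers that do not contain
// the mesh that will display the captured texture
const world = app.scene.layers.getLayerByName("World");
const skybox = app.scene.layers.getLayerByName("Skybox");
cubemapCameraEntity.camera.layers = [world.id, skybox.id]; // the "Reflective" layer is deliberately left out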


Interesting! Had never considered that could happen. Wonder if it’s the source of the overwrite behaviour, too

I’m making some progress, but I’ve run into an issue that seems to be present in the Reflection Cubemap example as well.

For some reason, when assigning the rendered cubemap to the material’s cubemap property, it appears way darker than the actual cubemap as seen on the rendered quads. Any idea why that is?

I don’t think it’s tone mapping… At least disabling it in the example doesn’t fix it.

see the highlighted code for the reason / fix


It works! Thanks guys :smile:

[screenshot]

The cyan wire spheres mark the locations of ReflectionProbes, which can be seen on the nearby shiny spheres.

If it’s worth anything to anyone, here are the two scripts I ended up with. The cubemaps are rendered one at a time via a static processing queue, to avoid them ending up grey.

One for each Reflection Probe. No Camera component is necessary; the cameras are created and deleted on the fly.

export class ReflectionProbe extends pc.ScriptType {
	layers: string[];
	min: pc.Vec3;
	max: pc.Vec3;

	envAtlas: pc.Texture;

	cubeResolution: number = 64;
	atlasResolution: number = 512;
	boundingBox: pc.BoundingBox;

	static processingQueue: boolean = false;
	static renderQueue: ReflectionProbe[] = [];
	static list: ReflectionProbe[] = [];

	static readonly EVENT_REFLECTIONS_CHANGED: string = "ReflectionProbe_ListChanged";

	// Render queued probes one at a time; rendering several at once made them overwrite each other
	static processQueue() {
		if (ReflectionProbe.processingQueue)
			return;

		const probe = ReflectionProbe.renderQueue.shift();

		if (!probe)
			return;

		ReflectionProbe.processingQueue = true;
		probe.render();
	}

	static getNearest(point: pc.Vec3) {
		let minDist = Number.MAX_VALUE;
		let minProbe = null;

		for (let i = 0; i < ReflectionProbe.list.length; i++) {
			const probe = ReflectionProbe.list[i];
			const dist = probe.entity.getPosition().distance(point);

			if (dist < minDist) {
				minDist = dist;
				minProbe = probe;
			}
		}

		return minProbe;
	}

	static getFirstContaining(point: pc.Vec3) {
		for (let i = 0; i < ReflectionProbe.list.length; i++) {
			const probe = ReflectionProbe.list[i];

			if (probe.boundingBox.containsPoint(point))
				return probe;
		}

		return null;
	}

	static getFirstIntersecting(other: pc.BoundingBox) {
		for (let i = 0; i < ReflectionProbe.list.length; i++) {
			const probe = ReflectionProbe.list[i];

			if (probe.boundingBox.intersects(other))
				return probe;
		}

		return null;
	}

	initialize() {
		this.boundingBox = new pc.BoundingBox();
		this.boundingBox.setMinMax(this.min, this.max);

		ReflectionProbe.renderQueue.push(this);
		ReflectionProbe.processQueue();
	}

	// Renders the six cubemap faces, then bakes them into an envAtlas via pc.EnvLighting
	render() {
		const resolution = Math.min(this.cubeResolution, this.app.graphicsDevice.maxCubeMapSize);

		const cubemap = new pc.Texture(this.app.graphicsDevice, {
			width: resolution,
			height: resolution,
			format: pc.PIXELFORMAT_RGBA8,
			cubemap: true,
			mipmaps: true,
			minFilter: pc.FILTER_LINEAR_MIPMAP_LINEAR,
			magFilter: pc.FILTER_LINEAR
		});

		const cameraRotations = [
			new pc.Quat().setFromEulerAngles(0, 90, 0),
			new pc.Quat().setFromEulerAngles(0, -90, 0),
			new pc.Quat().setFromEulerAngles(-90, 0, 180),
			new pc.Quat().setFromEulerAngles(90, 0, 180),
			new pc.Quat().setFromEulerAngles(0, 180, 0),
			new pc.Quat().setFromEulerAngles(0, 0, 0)
		];

		const layers = this.layers
			.map((x: string) => this.app.scene.layers.getLayerByName(x)?.id)
			.filter((id): id is number => id !== undefined);

		const cameraEntities: pc.Entity[] = [];

		for (let i = 0; i < 6; i++) {
			const renderTarget = new pc.RenderTarget({
				colorBuffer: cubemap,
				depth: true,
				face: i,
				flipY: true
			});

			const e = new pc.Entity("CubemapCamera_" + i);
			e.addComponent("camera", {
				aspectRatio: 1,
				fov: 90,
				layers: layers,
				renderTarget: renderTarget
			});

			cameraEntities.push(e);
			this.entity.addChild(e);

			e.setRotation(cameraRotations[i]);

			// after the last face has rendered, bake the atlas and clean up the temporary cameras
			e.camera.onPostRender = () => {
				if (i === 5) {
					this.envAtlas = new pc.Texture(this.app.graphicsDevice, {
						width: this.atlasResolution,
						height: this.atlasResolution,
						format: pc.PIXELFORMAT_RGB8,
						mipmaps: false,
						minFilter: pc.FILTER_LINEAR,
						magFilter: pc.FILTER_LINEAR,
						addressU: pc.ADDRESS_CLAMP_TO_EDGE,
						addressV: pc.ADDRESS_CLAMP_TO_EDGE,
						projection: pc.TEXTUREPROJECTION_EQUIRECT
					});
			
					pc.EnvLighting.generateAtlas(cubemap, {
						target: this.envAtlas
					});

					cubemap.destroy();

					for (const cameraEntity of cameraEntities) {
						cameraEntity.camera.enabled = false;
						cameraEntity.camera.renderTarget.destroy();
						cameraEntity.camera.renderTarget = null;
						cameraEntity.destroy();
					}

					ReflectionProbe.list.push(this);
					ReflectionProbe.processingQueue = false;
					ReflectionProbe.processQueue();

					this.app.fire(ReflectionProbe.EVENT_REFLECTIONS_CHANGED);
				}
			};
		}
	}
};

pc.registerScript(ReflectionProbe, ReflectionProbe.scriptName);

ReflectionProbe.attributes.add("layers", {
	type: "string",
	array: true,
	default: ["World", "Skybox"]
});

ReflectionProbe.attributes.add("min", { type: "vec3", default: [0, 0, 0] });
ReflectionProbe.attributes.add("max", { type: "vec3", default: [1, 1, 1] });

And one for any Model whose MeshInstances should use the generated envAtlas textures.

import { ReflectionProbe } from "../Rendering/ReflectionProbe";

export class ShaderReflectionProbe extends pc.ScriptType {
	continuous: boolean;
	useNearest: boolean;

	interval: number = 250;
	intervalID: number;

	initialize() {
		this.on("destroy", this.onDestroy, this);
		this.app.on(ReflectionProbe.EVENT_REFLECTIONS_CHANGED, this.onReflectionsChanged, this);
	}

	onDestroy() {
		clearInterval(this.intervalID);

		this.off("destroy", this.onDestroy, this);
		this.app.off(ReflectionProbe.EVENT_REFLECTIONS_CHANGED, this.onReflectionsChanged, this);
	}

	onReflectionsChanged() {
		clearInterval(this.intervalID);

		this.getProbe();

		if (this.continuous) {
			this.intervalID = setInterval(this.getProbe.bind(this), this.interval);
		}
	}

	// Pick a probe per mesh instance and assign its envAtlas to the material
	getProbe() {
		if (!this.entity?.model)
			return;

		for (const meshInstance of this.entity.model.meshInstances) {
			const probe = this.useNearest ? ReflectionProbe.getNearest(meshInstance.aabb.center) : ReflectionProbe.getFirstContaining(meshInstance.aabb.center);
			const envAtlas = probe?.envAtlas ?? null;

			// @ts-ignore
			meshInstance.material.envAtlas = envAtlas;
			meshInstance.material.update();
		}
	}
};

pc.registerScript(ShaderReflectionProbe, ShaderReflectionProbe.scriptName);

ShaderReflectionProbe.attributes.add("continuous", { type: "boolean", default: false });
ShaderReflectionProbe.attributes.add("useNearest", { type: "boolean", default: false });

Many thanks for sharing!

Now how cool would it be to be able to smoothly fade from one to the other for the local player :innocent:


Exactly! I’d have no idea where to start :stuck_out_tongue:

I bet it involves some shader magic :magic_wand:

I think it would involve overriding the shader chunks where the cubemap is sampled, but the big question is… is there space for two cubemaps in the shader? Or would we go over the max samplers limit (usually 16)?

@mvaligursky what do you think?
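For what it’s worth, I believe the limit can be queried at runtime - a sketch, assuming the device exposes maxTextures:

// number of texture samplers available to a fragment shader (often 16)
console.log(app.graphicsDevice.maxTextures);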

Internally, the envAtlas is a 2D texture that looks like this:
[screenshot of the generated envAtlas texture]

Currently, you create multiple of them and need to blend them. One way would be to do this where the StandardMaterial shader samples it - sample from 2-3 atlases and cross-blend them.

But an easier way would be to blend those 2-3 at the start of the frame into your ‘active’ envMap and use that on the objects you need to.

The code would be similar to this: engine/morph-instance.js at 664d24afbcc182ee41d1ed2cd1ef3208bf00ddd0 · playcanvas/engine · GitHub
where multiple 2D textures are blended together. That example code is more complex, as it handles more than 16 textures by splitting the blend into multiple passes; you would need just a single draw call.
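In compact form, the idea looks roughly like this - a sketch, where atlases, weights (summing to 1) and activeAtlasTarget are placeholders:

// Sketch: accumulate weighted atlases into one 'active' render target,
// so that active = sum(weights[i] * atlases[i])
const device = app.graphicsDevice as pc.WebglGraphicsDevice;
device.setBlending(true);
for (let i = 0; i < atlases.length; i++) {
	// the first draw replaces the target contents, later draws add on top
	device.setBlendFunction(pc.BLENDMODE_CONSTANT_ALPHA, i > 0 ? pc.BLENDMODE_ONE : pc.BLENDMODE_ZERO);
	device.setBlendColor(0, 0, 0, weights[i]);
	pc.drawTexture(device, atlases[i], activeAtlasTarget, undefined, undefined, undefined, true);
}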


Thanks for the tip @mvaligursky :smiley:

I believe I got it working. You meant something like this, right?

import { ReflectionProbe } from "../Rendering/ReflectionProbe";

export class ShaderReflectionProbe extends pc.ScriptType {
	continuous: boolean;
	blend: boolean;

	interval: number = 100;
	intervalID: number;

	boundingBox: pc.BoundingBox = new pc.BoundingBox();
	intersectingProbes: ReflectionProbe[];
	intersectionVolumes: number[];

	blendColorBuffer: pc.Texture;
	blendRenderTarget: pc.RenderTarget;

	initialize() {
		this.on("destroy", this.onDestroy, this);
		this.app.on(ReflectionProbe.EVENT_REFLECTIONS_CHANGED, this.onReflectionsChanged, this);

		this.intersectingProbes = new Array<ReflectionProbe>(3);

		if (this.blend) {
			this.intersectionVolumes = new Array<number>(this.intersectingProbes.length);

			// note: ATLAS_RESOLUTION is assumed to be a static atlas-size constant (e.g. 512)
			// added to the ReflectionProbe script since the version posted above
			this.blendColorBuffer = new pc.Texture(this.app.graphicsDevice, {
				width: ReflectionProbe.ATLAS_RESOLUTION,
				height: ReflectionProbe.ATLAS_RESOLUTION,
				format: pc.PIXELFORMAT_RGB8,
				addressU: pc.ADDRESS_CLAMP_TO_EDGE,
				addressV: pc.ADDRESS_CLAMP_TO_EDGE
			});
	
			this.blendRenderTarget = new pc.RenderTarget({
				colorBuffer: this.blendColorBuffer,
				depth: false
			});
		}
	}

	onDestroy() {
		clearInterval(this.intervalID);

		this.off("destroy", this.onDestroy, this);
		this.app.off(ReflectionProbe.EVENT_REFLECTIONS_CHANGED, this.onReflectionsChanged, this);

		this.blendRenderTarget?.destroy();
		this.blendColorBuffer?.destroy();
	}

	onReflectionsChanged() {
		clearInterval(this.intervalID);

		this.updateReflection();

		if (this.continuous) {
			this.intervalID = setInterval(this.updateReflection.bind(this), this.interval);
		}
	}

	updateReflection() {
		if (!this.entity?.model)
			return;

		this.boundingBox.copy(this.entity.model.meshInstances[0].aabb);

		for (let i = 1; i < this.entity.model.meshInstances.length; i++) {
			this.boundingBox.add(this.entity.model.meshInstances[i].aabb);
		}

		// Get intersecting ReflectionProbes non-alloc; getIntersecting fills the array and
		// returns the count (a variant added to the probe script since the version above)
		const intersectionCount = ReflectionProbe.getIntersecting(this.boundingBox, this.intersectingProbes);

		if (this.blend) {
			let totalVolume = 0;

			for (let i = 0; i < intersectionCount; i++) {
				const volume = this.boundingBox.intersectionVolume(this.intersectingProbes[i].boundingBox);
				this.intersectionVolumes[i] = volume;
				totalVolume += volume;
			}

			const device = this.app.graphicsDevice as pc.WebglGraphicsDevice;

			for (let i = 0; i < intersectionCount; i++) {
				// the first probe replaces the target contents, subsequent probes accumulate on top
				device.setBlending(true);
				device.setBlendFunction(pc.BLENDMODE_CONSTANT_ALPHA, i > 0 ? pc.BLENDMODE_ONE : pc.BLENDMODE_ZERO);
				device.setBlendColor(0, 0, 0, this.intersectionVolumes[i] / totalVolume);

				pc.drawTexture(device, this.intersectingProbes[i].envAtlas, this.blendRenderTarget, undefined, undefined, undefined, true);
			}

			for (const meshInstance of this.entity.model.meshInstances) {
				(meshInstance.material as any).envAtlas = this.blendColorBuffer;
				meshInstance.material.update();
			}
		}
		else {
			let envAtlas: pc.Texture = null;

			if (intersectionCount > 0) {
				let maxProbe: ReflectionProbe = null;
				let maxVolume: number = -1; // Number.MIN_VALUE is a tiny positive number, so use -1 to guarantee a pick

				for (let i = 0; i < intersectionCount; i++) {
					const intersectVolume = this.boundingBox.intersectionVolume(this.intersectingProbes[i].boundingBox);

					if (intersectVolume > maxVolume) {
						maxProbe = this.intersectingProbes[i];
						maxVolume = intersectVolume;
					}
				}

				envAtlas = maxProbe.envAtlas;
			}

			for (const meshInstance of this.entity.model.meshInstances) {
				(meshInstance.material as any).envAtlas = envAtlas;
				meshInstance.material.update();
			}
		}
	}
};

pc.registerScript(ShaderReflectionProbe, ShaderReflectionProbe.scriptName);

ShaderReflectionProbe.attributes.add("continuous", { type: "boolean", default: false });
ShaderReflectionProbe.attributes.add("blend", { type: "boolean", default: false });

I had a look at how Unity blends their reflection probes. They calculate a blending weight per probe from the intersection volume between the probe’s bounding box and that of the mesh instance.
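For instance, if the mesh AABB overlaps probe A by 3 units³ and probe B by 1 unit³, A gets a blend weight of 0.75 and B gets 0.25.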

[screenshot]

Seems to work pretty well, but I’m getting this odd pixelated artifact on the blended texture. Any idea why? :thinking:


Oi, just solved it! I just had to set mipmaps to false in the blendColorBuffer (presumably the unwritten lower mip levels were being sampled otherwise).

this.blendColorBuffer = new pc.Texture(this.app.graphicsDevice, {
    width: ReflectionProbe.ATLAS_RESOLUTION,
    height: ReflectionProbe.ATLAS_RESOLUTION,
    format: pc.PIXELFORMAT_RGB8,
    addressU: pc.ADDRESS_CLAMP_TO_EDGE,
    addressV: pc.ADDRESS_CLAMP_TO_EDGE,
    mipmaps: false
});

[screenshot]
