Batcher executing automatically

I would like to be able to stop the batcher from running automatically on the first frame. We are populating the scene dynamically with objects that are to be batched at the end.

I can run it manually when the time comes, but it seems a waste of resources to run the batcher when only part of the scene is ready.

Is there a way to do this?

Hi @Aastar and welcome!

Good question! From what I know, there isn’t any way to disable that behavior if you are setting up the batch groups/models in the editor (which makes sense to do).

While booting, the pc.Application instance will automatically prepare and run the batcher. You could try to override that behavior by patching the pc.Application class before the app starts, but that is tricky to do from inside a script.
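As a rough illustration of that patching idea, a script could swap out the batcher’s generate() method for a no-op and restore it later. This is only a sketch: `batcher` below is a stand-in for app.batcher (a pc.BatchManager), and whether the engine always reaches batching through a patchable generate() call is an assumption worth verifying against the engine source.

```javascript
// Sketch of the monkey-patch idea: replace the batcher's generate()
// with a no-op so the automatic first-frame run does nothing, and
// return a function that restores it and batches once, on demand.
// `batcher` stands in for app.batcher (pc.BatchManager) - an assumption.
function suppressAutoBatching(batcher) {
    const original = batcher.generate.bind(batcher);
    batcher.generate = function () {
        // Swallow automatic calls while the scene is still loading.
    };
    return function runBatcherNow() {
        batcher.generate = original; // restore normal behavior
        batcher.generate();          // batch everything once, now
    };
}
```

In a PlayCanvas script you would call this as early as possible (e.g. in a script’s initialize) with this.app.batcher, then invoke the returned function once all dynamic objects are placed.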

This is a good feature request for the PlayCanvas engine:

Here is the relevant code where that happens:

Thanks for the quick reply! I will take a look and see if I can bypass this myself.

There’s a PlayCanvas Engine repo for bugs and feature requests (I’m on mobile so can’t get to it). Something like this could be a feature in the settings.


How about setting up your batch groups in a script as required rather than specifying them in the Editor?
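For example, a script could create the group and assign entities to it at runtime. A sketch, assuming the pc.BatchManager API (app.batcher.addGroup(name, dynamic, maxAabbSize) and app.batcher.generate(groupIds)) and the model component’s batchGroupId property; check the exact signatures against the engine docs. The function and group names here are illustrative.

```javascript
// Sketch: create a batch group in code instead of the Editor, assign
// entities' models to it, then run the batcher once for that group.
// Assumes the pc.BatchManager API: addGroup(name, dynamic, maxAabbSize)
// and generate(groupIds) - verify against the engine docs.
function batchEntities(app, entities) {
    // Static group (dynamic = false) with a 100-unit max AABB size.
    const group = app.batcher.addGroup('RuntimeProps', false, 100);

    // Point every entity's model component at the new group.
    for (const e of entities) {
        e.model.batchGroupId = group.id;
    }

    // Batch only this group, now that everything is placed and enabled.
    app.batcher.generate([group.id]);
}
```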

That would be a good alternative, though I’ve found myself in cases where the level designer very much wants to play with groups/models in the editor.

The pipeline usually is:

  • A list of disabled “template” entities with batch group IDs defined on their model component
  • On runtime those entities are cloned and procedurally placed in the level.
  • As soon as any of these cloned instances is enabled the batcher automatically kicks in.
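The steps above could be sketched like this, using the PlayCanvas entity API (clone(), addChild(), setPosition(), enabled). The names spawnProps and template are illustrative, and the single generate() call at the end is the desired manual mode, not current engine behavior:

```javascript
// Illustrative sketch of the pipeline above. Today the batcher runs as
// soon as each clone is enabled; the single generate() at the end is
// the manual mode being asked for, not what the engine does now.
function spawnProps(app, template, positions) {
    const clones = [];
    for (const pos of positions) {
        const e = template.clone();      // template carries the batchGroupId
        app.root.addChild(e);
        e.setPosition(pos.x, pos.y, pos.z);
        e.enabled = true;                // this currently triggers batching
        clones.push(e);
    }
    app.batcher.generate();              // batch once, after all are placed
    return clones;
}
```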

Here it would be very helpful to have a manual mode for the batcher: run it once, manually, after everything is positioned and enabled, rather than as soon as a single entity with a batch group is enabled.

Agreed, it can work if everything is done in code (batch group creation and assignment), but that means script components and attributes, which adds overhead.